Journal article, ERCIM News, 2017

Immersive Point Cloud Manipulation for Cultural Heritage Documentation

Abstract

Virtual reality combined with 3D digitisation makes it possible to immerse archaeologists in 1:1 copies of monuments and sites. However, the scientific communication of archaeologists is based on 2D representations of the monuments they study. We propose a virtual reality environment with an innovative cutting-plane tool to dynamically produce 2D cuts of digitised monuments.

A point cloud is the basic raw data obtained when digitising cultural heritage sites or monuments with laser scanning or photogrammetry. These data represent a rich and faithful record, provided that adequate tools exist to exploit them. Current analyses and visualisations on a PC require software skills and can create ambiguities regarding architectural dimensions. We propose a toolbox to explore and manipulate such data in an immersive environment and to dynamically generate 2D cutting planes usable for cultural heritage documentation and reporting.

The two subjects studied in this work are Breton architectural sites whose state of preservation, complexity and archaeological interest make the development of new analytical techniques particularly appropriate. The first is the chapel of Languidou, built in the middle of the 13th century in the municipality of Plovan, which seems to be the "founding element" of a religious architectural style. The second is the "jeu de paume" (real tennis) court of Rennes, built at the beginning of the 17th century and registered as a Historical Monument.

For several decades, archaeologists have studied these kinds of architectural styles and reorganisations by producing different types of 2D documentation: plans, drawings, sections, profiles and orthophotos. 3D documentation comes from classical topographic survey, laser scanning or photogrammetry. In the French Grand-Ouest, some of this documentation has been carried out within the scope of the West Digital Conservatory of Archaeological Heritage. Until now, an engineer has performed the segmentations of the 3D documentation on a PC, trying to meet the expectations of the archaeologist, who is sometimes reluctant to use new technologies and frustrated by this lack of autonomy. The objective of this work is to involve archaeologists more deeply in the process by immersing them in the 3D survey and allowing them to segment it in real time and at 1:1 scale.

The first step consists in loading the point clouds of the architectural sites into the Immersia virtual reality platform. The scans were performed in June 2013 and June 2014 with a Leica ScanStation C10 and a Focus3D X330, and were integrated a few months later. Each point displayed in our virtual reality device is rendered as a billboard carrying three coordinates, a colour and a scalar value, and the files are stored in a binary format that we designed for Unity. For a usable exploration within Immersia, the clouds were systematically subsampled and two loading modes were implemented. In the first mode, the cloud is distributed over several hundred octree cells, which are loaded dynamically; only those visible from the user's point of view are displayed, thanks to a culling technique. The second mode uses a level-of-detail technique: fewer points are loaded at start-up from a single file, and their size is fixed.

The cloud segmentation tools are selected through a MiddleVR menu that lets the user act on three parameters (cf. Figure 3). When the first mode is active, the point size can be modified. On a more global level, the user can switch between the two loading modes and change the display distance of the cloud. Concerning the cutting plane, it is possible to display or hide it, to change its thickness and to assign a unique colour to the points it contains; the opacity of points outside the cutting plane can also be changed. The plane is manipulated within Immersia with a Flystick, which controls its translation and rotation. The display in the Immersia platform (MiddleVR) reaches a frame rate of up to 30 fps, which is two to three times lower than on a PC (Unity / MiddleVR). The resulting 2D section is rendered with a camera orthogonal to the 3D cutting plane and can be displayed on a tablet; the user can also adjust the distance, field of view and roll of this camera.
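The exact layout of the binary format mentioned above is not documented in this summary. The following C# sketch only illustrates the kind of per-point record described (three coordinates, a colour and a scalar) and how such a file could be read in a Unity script; the field order and the leading point count are assumptions.

    using System.Collections.Generic;
    using System.IO;
    using UnityEngine;

    // Hypothetical per-point record matching the description in the article:
    // three coordinates, a colour and a scalar, later rendered as a billboard.
    public struct CloudPoint
    {
        public Vector3 position;
        public Color32 colour;
        public float scalar;
    }

    public static class PointCloudFile
    {
        // Assumed file layout: an int32 point count, then for each point
        // three float32 coordinates, three bytes of RGB colour and one float32 scalar.
        public static List<CloudPoint> Load(string path)
        {
            var points = new List<CloudPoint>();
            using (var reader = new BinaryReader(File.OpenRead(path)))
            {
                int count = reader.ReadInt32();
                for (int i = 0; i < count; i++)
                {
                    CloudPoint p;
                    p.position = new Vector3(reader.ReadSingle(),
                                             reader.ReadSingle(),
                                             reader.ReadSingle());
                    p.colour = new Color32(reader.ReadByte(),
                                           reader.ReadByte(),
                                           reader.ReadByte(), 255);
                    p.scalar = reader.ReadSingle();
                    points.Add(p);
                }
            }
            return points;
        }
    }

In the setup described here, such records would then be handed to the billboard renderer, either per octree cell (first loading mode) or from a single subsampled file (level-of-detail mode).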
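The cutting-plane behaviour described above amounts to a distance test against a plane: points within half the slab thickness receive the unique highlight colour, while the others keep their own colour with reduced opacity. A minimal sketch using Unity's Plane type follows; the function name and parameters are illustrative, not the authors' actual implementation.

    using UnityEngine;

    public static class CuttingPlaneFilter
    {
        // Returns the display colour of a single point for a cutting plane of a
        // given thickness: points inside the slab get the highlight colour,
        // points outside keep their own colour with a reduced opacity.
        public static Color Classify(Vector3 point, Plane plane, float thickness,
                                     Color highlight, Color pointColour, float outsideOpacity)
        {
            float distance = Mathf.Abs(plane.GetDistanceToPoint(point));
            if (distance <= thickness * 0.5f)
                return highlight;

            pointColour.a = outsideOpacity;
            return pointColour;
        }
    }

Within Immersia, the plane's pose would be updated each frame from the tracked Flystick, for example with new Plane(flystick.up, flystick.position).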
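The 2D section shown on the tablet is produced by a camera facing the cutting plane, with user-adjustable distance, field of view and roll. The component below is a hypothetical Unity sketch of that camera placement; the MiddleVR integration and the streaming to the tablet are omitted.

    using UnityEngine;

    // Hypothetical component that keeps a dedicated camera facing the cutting
    // plane along its normal, with the adjustable distance, field of view and
    // roll mentioned in the text.
    public class SectionCamera : MonoBehaviour
    {
        public Camera sectionCamera;    // camera producing the 2D section
        public Transform cuttingPlane;  // transform whose forward axis is the plane normal
        public float distance = 5f;     // distance from the plane along its normal
        public float fieldOfView = 60f; // vertical field of view in degrees
        public float roll;              // rotation around the viewing axis in degrees

        void LateUpdate()
        {
            Vector3 normal = cuttingPlane.forward;
            sectionCamera.transform.position = cuttingPlane.position + normal * distance;
            sectionCamera.transform.rotation =
                Quaternion.LookRotation(-normal) * Quaternion.Euler(0f, 0f, roll);
            sectionCamera.fieldOfView = fieldOfView;
        }
    }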

Dates and versions

hal-01659814 , version 1 (08-12-2017)

Cite

Jean-Baptiste Barreau, Ronan Gaugne, Valérie Gouranton. Immersive Point Cloud Manipulation for Cultural Heritage Documentation. ERCIM News, 2017, pp.1-3. ⟨hal-01659814⟩