The Deals3D project, funded under grant 13N15313, ran from 1 June 2020 to 31 May 2023 and was carried out by Julius-Maximilians-Universität Würzburg (JMU) in partnership with the Zentrum für Telematik (ZfT) and other stakeholders. The subproject “Multimodale 3D‑Modellierung für den Denkmalschutz” (multimodal 3D modelling for heritage conservation) aimed to create a secondary system that can be deployed after damage events or for preventive monitoring of cultural heritage sites. Its core objective was to automatically merge data from a sensor network—comprising UAV‑borne multispectral cameras, ground‑based LiDAR scanners, and previously acquired survey data—into precise, annotated three‑dimensional models that provide essential information for assessment, documentation, and conservation.
Technically, the project began with a sensor‑selection phase in collaboration with ZfT. A multispectral camera was mounted on a UAV equipped with an RTK‑GPS system to deliver centimetre‑level positioning. The camera’s raw data, originally in a proprietary format, were converted to multi‑channel TIFF images and fed into the Structure‑from‑Motion (SfM) software COLMAP. From these images, key channels were extracted and linked to GPS coordinates, enabling the generation of dense point clouds. To complement the aerial data, the same multispectral camera was also mounted on a terrestrial LiDAR scanner. A calibration routine aligned the camera’s RGB channel with the LiDAR point cloud, allowing images to be projected onto the laser scans and yielding multispectral point clouds that combine geometric and radiometric information.
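The projection of camera images onto laser scans described above amounts to mapping each LiDAR point through the calibrated camera model and sampling the image at the resulting pixel. The following sketch illustrates the idea with a standard pinhole model; the function name, argument layout, and nearest-pixel sampling are illustrative assumptions, not the project's actual implementation.

```python
import numpy as np

def colourise_point_cloud(points, image, K, R, t):
    """Project LiDAR points into a calibrated camera and sample
    multispectral values for each visible point (hypothetical sketch).

    points : (N, 3) point coordinates in the world frame
    image  : (H, W, C) multispectral image with C channels
    K      : (3, 3) camera intrinsic matrix
    R, t   : rotation (3, 3) and translation (3,), world -> camera
    Returns an (N, C) array of sampled values; NaN marks points that
    fall outside the image or lie behind the camera.
    """
    H, W, C = image.shape
    cam = points @ R.T + t                      # world -> camera frame
    out = np.full((len(points), C), np.nan)
    in_front = cam[:, 2] > 0                    # keep points before the camera
    uvw = cam[in_front] @ K.T                   # perspective projection
    uv = uvw[:, :2] / uvw[:, 2:3]               # homogeneous -> pixel coords
    u = np.round(uv[:, 0]).astype(int)          # nearest-pixel sampling
    v = np.round(uv[:, 1]).astype(int)
    ok = (u >= 0) & (u < W) & (v >= 0) & (v < H)
    idx = np.flatnonzero(in_front)[ok]
    out[idx] = image[v[ok], u[ok]]
    return out
```

A production pipeline would additionally account for lens distortion and occlusion, but the core geometry is the same.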
Geometric and radiometric calibration of all sensors was performed to ensure consistency across modalities. The reconstruction pipeline was extended to produce multispectral point clouds, and a radiometric correction algorithm was developed to mitigate color inhomogeneities caused by varying illumination. The system also incorporated simultaneous localisation and mapping (SLAM) techniques to register successive scans and to detect changes over time. A change‑detection module analysed temporal differences between datasets, highlighting structural alterations or damage.
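A simple way to frame the change-detection step is as a cloud-to-cloud comparison: after two epochs have been co-registered, a point in the newer scan with no nearby counterpart in the reference scan indicates a potential alteration. The sketch below shows this idea in minimal form; the function name, brute-force search, and threshold value are illustrative assumptions rather than the project's actual module.

```python
import numpy as np

def detect_changes(reference, current, threshold=0.05):
    """Flag points in `current` whose nearest neighbour in `reference`
    lies farther away than `threshold` (metres) — a minimal
    cloud-to-cloud change indicator for co-registered epochs.

    reference : (N, 3) point array from the earlier epoch
    current   : (M, 3) point array from the later epoch
    Returns a boolean mask of length M (True = changed).
    """
    # Brute-force pairwise distances; a k-d tree would replace this
    # for realistically sized scans.
    d = np.linalg.norm(current[:, None, :] - reference[None, :, :], axis=2)
    return d.min(axis=1) > threshold
```

The threshold would in practice be chosen relative to the registration accuracy of the SLAM step, so that residual alignment error is not misreported as structural change.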
The workflow was validated on the UNESCO World Heritage site Erzbergwerk Rammelsberg, covering an area of roughly 500 m × 200 m. The integrated UAV‑and‑ground‑based data collection produced high‑resolution, colour‑accurate 3D models that were subsequently evaluated against ground truth measurements. The evaluation phase confirmed that the multimodal fusion approach yielded models with sub‑centimetre geometric accuracy and improved radiometric fidelity compared to single‑modal reconstructions.
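Evaluation against ground truth of the kind mentioned above typically reduces to comparing reconstructed control points with their surveyed positions. A root-mean-square error below one centimetre would correspond to the sub-centimetre accuracy reported; the helper below is an assumed illustration of that metric, not the project's evaluation code.

```python
import numpy as np

def geometric_rmse(reconstructed, ground_truth):
    """Root-mean-square 3D error between reconstructed control points
    and their surveyed ground-truth positions (same order, in metres)."""
    diff = reconstructed - ground_truth
    return float(np.sqrt(np.mean(np.sum(diff ** 2, axis=1))))
```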
Beyond the technical achievements, the project established a data‑management framework under the “denkmal3D” initiative, standardising data formats and facilitating future reuse. The five work packages—sensor integration, data acquisition, photogrammetry and point‑cloud fusion, multimodal 3D modelling, and change detection—were executed sequentially, with milestone reviews at project stages 2100, 4100, 5100, and 6100. The collaboration between JMU and ZfT combined expertise in heritage conservation, photogrammetry, and telematics, while funding from the German federal programme enabled the procurement of specialised hardware and the development of custom software modules. The project’s outcomes provide a scalable, automated pipeline that can be applied to diverse cultural heritage sites, supporting both post‑damage assessment and preventive monitoring.
