The LocSens project, funded by the Fraunhofer Society for Applied Research (grant 16ES0920) and carried out from 1 January 2019 to 30 November 2022, aimed to create a multi-sensor system for environment perception and localisation in industrial settings where fog, dust or other disturbances are common. The Fraunhofer Institute of Optronics, System Technologies and Image Exploitation, Advanced System Technology Branch (Fraunhofer IOSB-AST) led the effort, analysing sensor requirements for service-robotics applications, designing an ultra-wideband (UWB) localisation subsystem, developing radar-based mapping and localisation algorithms, fusing the results of the different modalities, and supporting the adaptation of an existing robot platform to various application scenarios.
The technical core of the work was the integration of a radar sensor as the primary source of environmental data with a UWB system that supplied additional position information. This combination was chosen to keep localisation robust even when optical sensors are degraded. The radar data were processed into 3-D point clouds, which were then used in SLAM-style mapping and localisation routines. The UWB subsystem provided range measurements to fixed anchors, enabling a complementary localisation estimate. A quality-driven fusion algorithm was developed to compute an optimal pose estimate from the individual results, weighting each sensor according to its confidence.
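The quality-driven fusion step can be pictured as a confidence-weighted combination of the two subsystem estimates. Below is a minimal Python sketch of such inverse-covariance (information) weighting for a planar pose; the function name, the 2-D pose representation and the example covariances are illustrative assumptions, not the project's actual fusion algorithm.

```python
import numpy as np

def fuse_pose_estimates(pose_radar, cov_radar, pose_uwb, cov_uwb):
    """Fuse two independent pose estimates by inverse-covariance weighting.

    Each pose is (x, y, yaw); each covariance is a 3x3 matrix expressing the
    confidence the corresponding subsystem assigns to its own estimate.
    Yaw wrap-around is ignored here for brevity.
    """
    info_radar = np.linalg.inv(cov_radar)   # high confidence -> large information
    info_uwb = np.linalg.inv(cov_uwb)
    fused_cov = np.linalg.inv(info_radar + info_uwb)
    fused_pose = fused_cov @ (info_radar @ pose_radar + info_uwb @ pose_uwb)
    return fused_pose, fused_cov

# Hypothetical example: radar SLAM is confident in position and heading,
# the UWB estimate adds a coarser, heading-free constraint.
pose_radar = np.array([12.3, 4.1, 0.05])
cov_radar = np.diag([0.04, 0.04, 0.01])
pose_uwb = np.array([12.6, 4.0, 0.0])
cov_uwb = np.diag([0.25, 0.25, 1.0e3])    # UWB contributes no useful heading
fused_pose, fused_cov = fuse_pose_estimates(pose_radar, cov_radar, pose_uwb, cov_uwb)
```

With this weighting, a subsystem that reports a large covariance (low confidence) contributes little to the fused pose, which matches the idea of a quality-driven fusion without committing to the project's exact formulation.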
Hardware was organised in a modular architecture. Three computing units were deployed: one on the robot, one at the control centre (cockpit) and one for the human-machine interface. The software stack was built on ROS, allowing individual nodes to be swapped or updated independently. The design also included a new anchor-module concept for the UWB system and a mechanical adaptation of the sensor layout on the vehicle platform; for testing, the platform was mounted on a lift. A simulation environment with a URDF model was used to validate the algorithms before real-world deployment.
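To illustrate how a ROS-based stack lets nodes be exchanged independently, the following rospy skeleton shows a fusion node that interacts with the rest of the system only through topics; the node name, topic names and message types are assumptions chosen for this sketch, not the project's real interfaces.

```python
#!/usr/bin/env python
# Minimal ROS 1 (rospy) node skeleton; all topic and node names are illustrative.
import rospy
from sensor_msgs.msg import PointCloud2
from geometry_msgs.msg import PoseWithCovarianceStamped

class FusionNode(object):
    """Placeholder fusion node: receives radar point clouds and a UWB pose
    estimate and republishes a fused pose. Replacing this node with another
    implementation only requires matching the same topic interfaces."""

    def __init__(self):
        self.pub = rospy.Publisher("fused_pose", PoseWithCovarianceStamped, queue_size=10)
        rospy.Subscriber("radar/points", PointCloud2, self.on_radar)
        rospy.Subscriber("uwb/pose", PoseWithCovarianceStamped, self.on_uwb)
        self.last_uwb = None

    def on_radar(self, cloud):
        # A real node would update the radar SLAM estimate here and fuse it
        # with the latest UWB pose; this skeleton simply forwards the UWB pose.
        if self.last_uwb is not None:
            self.pub.publish(self.last_uwb)

    def on_uwb(self, pose):
        self.last_uwb = pose

if __name__ == "__main__":
    rospy.init_node("locsens_fusion")
    FusionNode()
    rospy.spin()
```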
Validation experiments were performed in a hardware-in-the-loop laboratory and in real industrial settings. The system was evaluated under line-of-sight (LOS) and non-line-of-sight (NLOS) conditions, including scenarios with heavy fog and dust. Comparative plots show that the LocSens localisation outperforms a state-of-the-art reference method in both pure LOS and mixed LOS/NLOS environments. Radar and LiDAR scans were compared, revealing that radar maintains reliable detection when LiDAR data are partially blocked by fog. A reference 3-D point cloud was acquired with a FARO S150 scanner to benchmark the accuracy of the generated maps. The performance of the real-time locating system (RTLS) was measured against a tachymeter (total station) reference, demonstrating consistent localisation accuracy across the tested scenarios.
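As an illustration of how localisation accuracy against an external reference such as a tachymeter track might be quantified, the sketch below computes a 2-D position RMSE after interpolating the reference trajectory onto the estimate's timestamps; the function and the time-alignment strategy are assumptions, not the project's evaluation code.

```python
import numpy as np

def localisation_rmse(t_est, xy_est, t_ref, xy_ref):
    """Root-mean-square 2-D position error of an estimated trajectory
    against a reference trajectory (e.g. a tachymeter track).

    t_est, t_ref : 1-D arrays of monotonically increasing timestamps [s]
    xy_est, xy_ref : N x 2 arrays of positions [m]
    The reference is linearly interpolated onto the estimate's timestamps.
    """
    ref_x = np.interp(t_est, t_ref, xy_ref[:, 0])
    ref_y = np.interp(t_est, t_ref, xy_ref[:, 1])
    err = np.hypot(xy_est[:, 0] - ref_x, xy_est[:, 1] - ref_y)
    return np.sqrt(np.mean(err ** 2)), err

# The resulting per-sample errors can then be compared scenario by scenario,
# e.g. pure LOS versus mixed LOS/NLOS, or clear air versus fog and dust.
```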
The project also produced demonstrators and simulation tools that illustrate the full sensor fusion pipeline. The results indicate that the combined radar‑UWB approach delivers robust localisation in challenging industrial environments, while the modular hardware and ROS‑based software enable rapid adaptation to new platforms and sensor configurations. The collaboration between Fraunhofer IOSB‑AST and partner Fraunhofer IVV, along with the support of the Fraunhofer Society, ensured that the system meets the stringent hygiene and durability requirements of the food‑processing industry, where it is intended to be deployed.
