The NextPerception project, funded under the ECSEL Joint Undertaking (grant number 16MEE0068), ran from 1 May 2020 to 31 July 2023. The German partner s.m.s, smart microwave sensors GmbH, led the technical development of a next‑generation sensor‑fusion system for traffic monitoring, control and safety. The project was coordinated by VTT Technical Research Centre of Finland Ltd, with Johan Plomp as overall project manager and Stephan Schlupkothen as coordinator of the s.m.s sub‑project. Consider IT contributed the vehicle‑to‑everything roadside unit (V2X‑RSU) that delivers the fused object lists to the traffic network. The consortium also included several other partners, notably UC3, which provided expertise on safety and comfort at intersections, as well as other European partners who supplied radar and camera hardware and contributed to the development of the fusion algorithms.
Technically, the project delivered a complete sensor‑fusion stack that combines a 24 GHz FMCW radar with a full‑HD camera in a single compact hybrid sensor, the TRUGRD‑STREAM. The radar provides high‑resolution range‑Doppler data, while the camera supplies rich visual context. The fusion unit, named COMHUB Fusion, performs multi‑radar tracking, fuses the resulting radar tracks with camera detections, and applies a deep‑learning‑based object classification model to the radar objects. The video‑processing unit is built around an Nvidia Jetson platform that runs a convolutional neural network for real‑time detection, tracking and classification of visual objects. All components communicate over Ethernet, and the final fused object list is transmitted via the V2X‑RSU to downstream traffic management systems.
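The core fusion step described above — matching camera detections to radar tracks and combining radar positions with camera class labels — can be sketched in simplified form. The function name, data layout, and gating threshold below are illustrative assumptions, not the actual COMHUB Fusion implementation:

```python
import math

# Assumed association gate in metres (illustrative, not a project parameter).
GATE_M = 2.0

def associate_and_fuse(radar_tracks, camera_detections):
    """Greedy nearest-neighbour association in the ground plane.

    radar_tracks:       list of dicts with 'id', 'x', 'y' (metres)
    camera_detections:  list of dicts with 'x', 'y', 'label'
    Returns a fused object list: radar position plus camera class label;
    radar tracks with no camera match within the gate keep label 'unknown'.
    """
    fused = []
    used = set()  # camera detections already assigned to a track
    for trk in radar_tracks:
        best, best_d = None, GATE_M
        for i, det in enumerate(camera_detections):
            if i in used:
                continue
            d = math.hypot(trk['x'] - det['x'], trk['y'] - det['y'])
            if d < best_d:
                best, best_d = i, d
        label = 'unknown'
        if best is not None:
            label = camera_detections[best]['label']
            used.add(best)
        fused.append({'id': trk['id'], 'x': trk['x'], 'y': trk['y'],
                      'label': label})
    return fused
```

A production system would use gated assignment with track covariances (e.g. Mahalanobis distance) rather than a fixed Euclidean gate, but the structure — associate, then merge the complementary attributes of each sensor — is the same.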
The system was evaluated in a series of field trials that demonstrated significant improvements in localisation accuracy and object‑class prediction compared with legacy single‑sensor solutions. In particular, the multi‑radar fusion reduced position uncertainty for vulnerable road users (VRUs) near intersections, while the camera‑based classification increased the reliability of pedestrian and cyclist detection. The pilot configuration, consisting of two hybrid sensors, a fusion unit, a video‑processing node and a V2X‑RSU, was successfully deployed at an intersection test site. Although supply‑chain constraints and staffing shortages meant that the final system had to be validated as separate subsystems rather than as a single integrated deployment, each component met its targets: the single‑sensor detection chain produced accurate object lists, and the multi‑radar fusion at the intersection achieved the expected gains in spatial resolution and classification confidence.
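The reduction in position uncertainty from combining several radars can be illustrated with the standard inverse‑variance weighting used in multi‑sensor estimation. This is a generic textbook sketch, not the project's tracker; the numbers and function name are assumptions:

```python
def fuse_positions(estimates):
    """Fuse independent position estimates by inverse-variance weighting.

    estimates: list of (position_m, variance_m2) tuples, one per radar.
    Returns (fused_position, fused_variance). The fused variance is
    always at most the smallest single-sensor variance, which is why
    adding a second radar tightens the VRU position estimate.
    """
    inv_sum = sum(1.0 / var for _, var in estimates)
    pos = sum(p / var for p, var in estimates) / inv_sum
    return pos, 1.0 / inv_sum

# Two radars, each with 0.5 m standard deviation (variance 0.25 m^2):
# the fused estimate lands between them with half the variance.
pos, var = fuse_positions([(10.0, 0.25), (10.4, 0.25)])
```

For equally accurate sensors the fused variance is the single‑sensor variance divided by the number of sensors, i.e. the standard deviation shrinks with the square root of the sensor count.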
Beyond the technical achievements, the project established a robust data‑sharing and dissemination framework. Regular consortium meetings, technical workshops, and a dedicated exploitation plan ensured that the research outcomes could be translated into commercial products. The exploitation strategy, still under development, will focus on integrating the sensor‑fusion technology into s.m.s’s existing radar‑based infrastructure portfolio, targeting smart‑city and Car2X applications. The project’s outcomes also feed into the broader ECSEL roadmap for high‑resolution sensing at intersections, providing a benchmark for future research and development in intelligent traffic systems.
