The SySiKo project set out to create a vehicle‑based system that detects and warns of collision risks with vulnerable road users in the blind‑spot area of large commercial vehicles such as buses, trucks, and construction machinery. The goal was to improve safety for cyclists, pedestrians, and children, especially in densely populated urban environments where severe accidents are most common. To achieve this, the consortium pursued a multi‑modal sensor fusion strategy, combining image‑based sensors (CMOS, Time‑of‑Flight, infrared), a radar unit, and a real‑time location system. The fusion of these complementary modalities was expected to deliver a higher detection accuracy than any single sensor could provide.
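The report does not spell out the fusion algorithm, but the idea of combining complementary modalities can be illustrated with a minimal sketch. The `Detection` class, the sensor names, and the greedy association in `fuse` are assumptions for illustration, not the project's actual method: detections from different sensors that agree in bearing and range are merged, and agreement across modalities raises the fused confidence.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """A candidate object reported by one sensor modality."""
    sensor: str       # e.g. "cmos", "radar", "infrared" (illustrative names)
    angle_deg: float  # bearing relative to the monitored blind-spot sector
    range_m: float    # estimated distance to the object
    confidence: float # per-sensor detection confidence in [0, 1]

def fuse(detections, angle_tol_deg=5.0, range_tol_m=1.0):
    """Greedy association sketch: detections that agree in bearing and
    range become one fused track; each corroborating sensor raises the
    track's confidence (independence assumption)."""
    fused = []
    for det in sorted(detections, key=lambda d: -d.confidence):
        for track in fused:
            if (abs(track["angle_deg"] - det.angle_deg) <= angle_tol_deg
                    and abs(track["range_m"] - det.range_m) <= range_tol_m):
                # A second modality seeing the same object boosts confidence.
                track["confidence"] = 1 - (1 - track["confidence"]) * (1 - det.confidence)
                track["sensors"].add(det.sensor)
                break
        else:
            fused.append({"angle_deg": det.angle_deg, "range_m": det.range_m,
                          "confidence": det.confidence, "sensors": {det.sensor}})
    return fused
```

A cyclist seen by both the CMOS camera and the radar would thus yield a single track with higher confidence than either sensor alone, which is the rationale the report gives for the multi-modal approach.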
A key technical milestone was the design and fabrication of a multi‑sensor platform (MSP) that integrated the selected sensors. The MSP was built around an ARM‑based integration platform, while the drivers for the individual sensors were implemented on a Linux system running on a standard laptop to allow early data‑recording tests. The sensor suite included a FLIR Radiometric Lepton 2.5 infrared camera, a CMOS image sensor (IDS UI‑1240LE), a Time‑of‑Flight unit, and a radar module. During the evaluation phase, the project team discovered that the initial ToF sensor did not meet the required performance under varying weather conditions, leading to a switch to a radar sensor. This change required additional integration effort and impacted the overall system architecture, but ultimately resulted in a more robust sensor set.
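An early data-recording setup of the kind described, with sensor drivers on a Linux laptop, might look like the following sketch. The `SensorDriver` stand-in and the JSON-lines format are assumptions; real drivers would read from V4L2, SPI, or a vendor SDK. The point is that each record carries a monotonic timestamp so the modalities can be re-aligned for offline fusion later.

```python
import json
import time

class SensorDriver:
    """Stand-in for a Linux sensor driver (illustrative only; a real
    driver would read from V4L2, SPI, or a vendor SDK)."""
    def __init__(self, name):
        self.name = name

    def read_frame(self):
        # Placeholder payload; a real driver returns image or radar data.
        return {"sensor": self.name, "payload": None}

def record(drivers, out, n_frames):
    """Poll every driver once per cycle and write timestamped records
    as JSON lines, one record per sensor per cycle."""
    for _ in range(n_frames):
        for drv in drivers:
            frame = drv.read_frame()
            frame["t_monotonic"] = time.monotonic()
            out.write(json.dumps(frame) + "\n")
```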
The processing core of the system is an NXP i.MX 8 system‑on‑chip (SoC) mounted on a carrier board developed by Solectrix, together with an FPGA that runs a deep‑learning inference engine trained to detect cyclists and other vulnerable road users from the fused sensor data. The deep‑learning model was trained with a dedicated application and integrated into a reference software framework that also handles sensor data acquisition, fusion, and signal generation. The system covers a 50° field of view and operates at 9 frames per second, providing timely detection of cyclists in the blind‑spot region. When a potential collision is identified, the system generates alerts for both the driver and the vulnerable road user, thereby enhancing situational awareness.
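The stated figures imply a per-frame processing budget of roughly 111 ms (1/9 s) within a 50° sector. A minimal sketch of the resulting gating and alert logic follows; the threshold values (`min_confidence`, `max_range_m`) and the detection dictionary layout are assumptions, not values from the report.

```python
FOV_DEG = 50.0        # horizontal field of view covered by the system
FRAME_RATE_HZ = 9.0   # processing rate reported for the demonstrator
FRAME_BUDGET_S = 1.0 / FRAME_RATE_HZ  # ~0.111 s available per frame

def in_fov(angle_deg, fov_deg=FOV_DEG):
    """True if a bearing falls inside the monitored blind-spot sector,
    measured symmetrically around the sensor axis."""
    return abs(angle_deg) <= fov_deg / 2

def should_alert(detections, min_confidence=0.5, max_range_m=5.0):
    """Raise an alert when any sufficiently confident detection lies
    inside the field of view and within the critical range
    (thresholds are illustrative assumptions)."""
    return any(in_fov(d["angle_deg"])
               and d["confidence"] >= min_confidence
               and d["range_m"] <= max_range_m
               for d in detections)
```

In the demonstrator, a positive result would trigger the driver- and VRU-facing warnings described above; here it is reduced to a boolean for clarity.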
Testing was carried out on real vehicles equipped with the full sensor array. The consortium installed the sensor arrangement on test buses and trucks, conducted driving scenarios that mimic real‑world blind‑spot situations, and recorded data for further analysis. The test results confirmed that the combined sensor approach and FPGA‑based deep learning could reliably detect cyclists at the required range and speed. The data also guided iterative improvements to both the sensor configuration and the machine‑learning model.
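Detecting cyclists "at the required range and speed" ultimately means leaving enough reaction time. A common way to reason about this, sketched below under the assumption of a simple constant-velocity model (the report does not specify the criterion used), is time to collision (TTC): distance divided by closing speed, mapped to staged warnings. The thresholds `warn_s` and `critical_s` are illustrative.

```python
def time_to_collision(range_m, closing_speed_mps):
    """Naive constant-velocity TTC estimate: distance / closing speed.
    Returns None when the object is not closing in."""
    if closing_speed_mps <= 0:
        return None
    return range_m / closing_speed_mps

def warning_level(range_m, closing_speed_mps, warn_s=3.0, critical_s=1.5):
    """Map TTC to a two-stage warning (e.g. visual first, then acoustic).
    Threshold values are assumptions for illustration."""
    ttc = time_to_collision(range_m, closing_speed_mps)
    if ttc is None:
        return "none"
    if ttc <= critical_s:
        return "critical"
    if ttc <= warn_s:
        return "warn"
    return "none"
```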
SySiKo was executed as a consortium project with dresearch‑fe.de as the lead partner. Solectrix contributed the SoC carrier board and provided technical support for integrating the i.MX 8 SoC. Other partners, whose identities are not disclosed in the report, contributed to sensor evaluation, system architecture, and field testing. The project was funded by German research agencies, with additional financial support from Deutsche Bank AG and Landesbank Berlin, and ran over a multi‑year period. The outcome is a demonstrator that proves the feasibility of the approach and lays the groundwork for future commercialization, either as a retrofit solution for existing commercial vehicles or as a component of new vehicle platforms.
