The AutoModal project set out to advance automation in inland multimodal terminals by developing a prototype for automated crane operation. The core technical achievement was the creation of an end‑to‑end software stack that integrates sensor data acquisition, machine‑learning‑based person and vehicle detection, and crane control. Four open‑source components were released: the Sensor and Neural Data Platform (SAND) provides a modular framework for ingesting, processing, and redistributing camera or video streams; the Thing Action Management System (TAMS) offers a vendor‑agnostic interface for generating and monitoring crane tasks such as pick and drop operations; the Crane Control System (CCS) adapter connects TAMS to the crane’s PLC; and the Thing Action Management Interface (TAMI) defines the protocol for transmitting job descriptions to the crane. Together these modules enable a complete automation loop from perception to actuation.
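To make the perception‑to‑actuation loop concrete, the sketch below shows what a job description handed from TAMS to the crane via a TAMI‑style protocol might look like. This is purely illustrative: the actual TAMI schema is not reproduced here, so the field names (`job_id`, `pick`, `drop`, the row/bay/tier coordinates) and the choice of JSON encoding are assumptions.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class Position:
    # Hypothetical container-yard coordinates; real terminals may
    # use different addressing schemes.
    row: int
    bay: int
    tier: int

@dataclass
class CraneJob:
    # Illustrative job description for a pick-and-drop operation.
    job_id: str
    container_id: str
    pick: Position
    drop: Position

# A task-management layer such as TAMS would generate jobs like this
# and transmit them to the crane-side adapter.
job = CraneJob(
    job_id="job-001",
    container_id="MSKU1234565",
    pick=Position(row=2, bay=4, tier=1),
    drop=Position(row=5, bay=1, tier=2),
)
message = json.dumps(asdict(job))
print(message)
```

The value of a vendor‑agnostic message format like this is that the same job description can be translated by different adapters (such as the CCS adapter) into the commands of a specific crane PLC.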
During the project, a reference portal crane at a Contargo terminal was instrumented with a range of sensors, including high‑resolution cameras and depth sensors. Laboratory tests evaluated sensor performance under varying lighting and environmental conditions, and a preliminary machine‑learning model was trained on annotated video data to detect humans and vehicles in the crane’s operating area. The model was integrated into SAND and evaluated on the live crane, demonstrating that the system can reliably identify personnel and vehicles in real time, thereby providing the safety and situational awareness required for autonomous container handling. The tests also revealed practical limitations, such as reduced detection accuracy in low‑light scenarios and the need for additional redundancy in sensor placement, which informed the design of a robust deployment strategy.
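The perception step described above, where detections from the trained model are used to establish safety and situational awareness, can be sketched as a simple post‑processing stage. The code below is a minimal illustration, not the project's actual pipeline: the class labels, confidence threshold, and data structures are assumptions, and the model inference itself is stubbed out with example detections.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Detection:
    label: str              # e.g. "person" or "vehicle"
    confidence: float       # model confidence in [0, 1]
    bbox: Tuple[int, int, int, int]  # (x, y, width, height) in pixels

def filter_safety_relevant(
    detections: List[Detection], threshold: float = 0.5
) -> List[Detection]:
    """Keep only confident detections of safety-relevant classes.

    A pipeline like SAND could apply a filter of this kind to each
    processed frame before deciding whether crane motion is safe.
    The threshold value here is illustrative.
    """
    relevant = {"person", "vehicle"}
    return [
        d for d in detections
        if d.label in relevant and d.confidence >= threshold
    ]

# Example detections as a model might emit them for one frame.
frame_detections = [
    Detection("person", 0.92, (100, 200, 40, 90)),
    Detection("vehicle", 0.30, (400, 150, 120, 80)),   # low confidence: dropped
    Detection("container", 0.88, (250, 300, 200, 100)),  # not safety-relevant
]
alerts = filter_safety_relevant(frame_detections)
```

A filter like this also illustrates the low‑light limitation noted above: when lighting degrades, confidences drop below the threshold, which is one motivation for the redundant sensor placement the project recommends.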
The project produced a detailed requirement catalogue and a hardware evaluation matrix that outline the necessary sensor suite, computing resources, and communication infrastructure for scaling the solution to other terminals. A roadmap was derived from the pilot results, outlining incremental implementation steps—from sensor installation and data pipeline setup to full autonomous operation and safety validation. The open‑source release of the software stack allows industry partners to adapt the solution to their own crane models and terminal layouts, accelerating the path to market readiness.
Collaboration was structured around three main partners. The Fraunhofer Institute for Material Flow and Logistics (Fraunhofer IML) led the scientific work, conducting literature reviews, process analyses, and the development of the machine‑learning models and open‑source components. Synyx contributed expertise in applying neural networks to customer projects, integrating new crane systems, and maintaining the open‑source repositories. Contargo supplied the terminal environment, the reference crane, and operational data, and identified future automation opportunities at additional sites such as Neuss and Mannheim. The project ran over a two‑year period, with regular bi‑monthly status meetings and annual consortium gatherings to ensure alignment and progress. Funding was provided through German federal and European Union programmes aimed at advancing maritime and inland logistics automation. The combined effort produced a demonstrable prototype, a set of reusable software modules, and a practical roadmap that collectively advance the feasibility of automated crane operations in inland multimodal terminals.
