The OpenFarmingAI project aimed to create an open‑source plant‑monitoring system that uses image‑based AI for visual plant inspection and combines the results with AR and VR visualisations to explore light‑plant interactions. The sub‑project “Visualization of PMS data and provision of results and learning environment for the public” was carried out by Studio B12 GmbH from 15 August 2020 to 14 August 2023 under funding reference 13N15387. The main partners were Protohaus gGmbH, which supplied the physical greenhouses and hydroponic setups, and the National Jürgens Brauerei, which took part in a co‑creation phase for the final output. The project belonged to the larger OpenFarmingAI consortium, which focused on environmentally friendly urban cultivation methods.
Technically, the team reconstructed up to ten different hydroponic setups as 3D models, using Blender for the geometry and Substance Painter for realistic material textures. The models were imported into the Unity engine, where both the AR and the VR application were developed. The AR application streams live data from the greenhouse cameras and displays real‑time parameters such as light intensity, nutrient levels, and plant‑health indicators on the user’s device. The VR application provides an immersive learning environment that simulates the greenhouse, letting users interact with the plants and adjust environmental variables. A gameplay element is included: users balance these parameters to achieve optimal yields and earn high scores, with a typical session lasting about five minutes after a 1½‑minute introductory tutorial. The VR experience is compatible with Oculus Quest and Quest 2 headsets.
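The scoring mechanic behind this gameplay loop can be sketched as follows. Note that this is a minimal illustrative sketch only: the parameter names, target values, tolerance bands, and the linear scoring formula are assumptions for explanation, not the project’s actual implementation (which was built in Unity).

```python
# Hypothetical sketch of the VR gameplay scoring: the closer the user keeps
# each environmental parameter to its (assumed) target, the higher the score.
# All names, targets, and tolerances below are illustrative assumptions.

TARGETS = {
    "light_intensity": 400.0,  # µmol/m²/s (assumed target)
    "nutrient_level": 1.8,     # mS/cm electrical conductivity (assumed)
    "water_temp": 21.0,        # °C (assumed)
}
TOLERANCES = {
    "light_intensity": 150.0,
    "nutrient_level": 0.6,
    "water_temp": 4.0,
}

def session_score(settings: dict) -> int:
    """Return a 0-100 score: 100 when every parameter hits its target,
    falling off linearly to 0 at the edge of each tolerance band."""
    per_param = []
    for name, target in TARGETS.items():
        error = abs(settings[name] - target) / TOLERANCES[name]
        per_param.append(max(0.0, 1.0 - error))
    return round(100 * sum(per_param) / len(per_param))
```

For example, settings that hit every target score 100, while a parameter driven to the edge of its tolerance band contributes nothing to the average.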
Data collected from the monitoring system include quantitative measures of plant‑light interactions, such as photosynthetic efficiency and growth rates under varying light spectra. These data are visualised through simplified charts and dashboards within the applications, making invisible processes like energy flow and nutrient uptake tangible for non‑experts. The design manual produced during the project documents UI guidelines, colour schemes, iconography, and interaction patterns, ensuring consistency across the AR/VR interfaces, the project website, and marketing materials.
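The step from raw monitoring data to simplified dashboard values might look like the following sketch. The reading format, the growth-per-light efficiency proxy, and the status threshold are all illustrative assumptions; the project’s actual aggregation pipeline is not specified in this summary.

```python
# Hypothetical sketch of condensing raw sensor readings into the simplified
# dashboard values shown to non-expert users. The reading schema and the
# status threshold are assumptions for illustration only.

from statistics import mean

def dashboard_summary(readings: list) -> dict:
    """Aggregate a list of {'ppfd': ..., 'growth_mm_day': ...} readings
    into averaged values plus a simple status label."""
    avg_ppfd = mean(r["ppfd"] for r in readings)            # light level
    avg_growth = mean(r["growth_mm_day"] for r in readings)  # growth rate
    # Crude efficiency proxy: growth per unit of light (illustrative only).
    efficiency = avg_growth / avg_ppfd if avg_ppfd else 0.0
    status = "ok" if avg_growth >= 2.0 else "check plants"
    return {
        "avg_ppfd": round(avg_ppfd, 1),
        "avg_growth_mm_day": round(avg_growth, 2),
        "growth_per_ppfd": round(efficiency, 4),
        "status": status,
    }
```

A summary like this could then feed the simplified charts and traffic-light indicators that make energy flow and nutrient uptake tangible for lay users.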
The project demonstrates the feasibility of integrating AI‑driven image analysis with mixed‑reality visualisations to support both scientific research and public engagement. By making complex biological and technical information accessible through interactive 3D environments, the work aligns with broader Industry 4.0 goals and addresses pressing issues such as urban food production, resource efficiency, and climate resilience. The collaboration model, which combines expertise from a software studio, a greenhouse operator, and a brewery, shows how cross‑sector partnerships can accelerate the development of innovative agri‑tech solutions. The final deliverables comprise fully functional AR and VR applications, a design manual, a public website, and a set of marketing assets, which together provide a comprehensive platform for learning about light‑plant dynamics and the potential of biophotonics in sustainable agriculture.
