The project, funded under the IDeA programme (FKZ 16SV8106), ran for 36 months from early 2018 to late 2021 and brought together a consortium of research institutes, industry partners and technology developers. Key collaborators were the Max Planck Institute (MPI), Ludwig Maximilian University (LMU), the Institute for Applied Physics – University of Tübingen (IFA‑UT), Zeiss, Talkingeyes & More, Blickshift, LivingLabs, PupilLabs and Valve. The consortium coordinated through regular project meetings, workshops and joint development sessions, with a dedicated project management team overseeing the integration of all components.
Technically, the core achievement was the stable integration of eye‑tracking into virtual‑reality (VR) environments. Initial tests with the PupilLabs eye‑tracking module in early 2019 failed to deliver reliable, stable data, prompting a workshop at Blickshift in Stuttgart where the partners exchanged experiences and devised a temporary workaround. A Unity module was then written to simplify calibration, store tracking data reliably and allow eye‑tracking to be plugged into diverse VR scenes. In June 2019 Valve released a new VR headset equipped with an integrated eye‑tracking sensor. Using project funds, the team acquired this headset; its performance surpassed that of the earlier module, and it was used exclusively in the final demonstrator.
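The module itself was developed in Unity; the Python sketch below is only a hedged illustration of the kind of abstraction such a layer might provide, assuming its main job was to hide the concrete eye‑tracking hardware behind a single interface and to filter out unreliable samples. All names (GazeSample, StableGaze, MockSource) and parameters (min_confidence, alpha) are hypothetical, not taken from the project code.

```python
from dataclasses import dataclass
from typing import Optional, Protocol
import time

@dataclass
class GazeSample:
    timestamp: float   # seconds
    x: float           # normalised gaze position, 0..1
    y: float
    confidence: float  # tracker-reported quality, 0..1

class GazeSource(Protocol):
    """Anything that delivers raw gaze samples (add-on module or built-in sensor)."""
    def read(self) -> Optional[GazeSample]: ...

class StableGaze:
    """Wraps a GazeSource, drops low-confidence samples and smooths the rest,
    so VR scenes see one consistent gaze signal regardless of the hardware."""

    def __init__(self, source: GazeSource, min_confidence: float = 0.6, alpha: float = 0.3):
        self.source = source
        self.min_confidence = min_confidence
        self.alpha = alpha                    # exponential-smoothing factor
        self.last: Optional[GazeSample] = None

    def read(self) -> Optional[GazeSample]:
        raw = self.source.read()
        if raw is None or raw.confidence < self.min_confidence:
            return self.last                  # hold the last good sample on dropouts
        if self.last is None:
            self.last = raw
        else:
            a = self.alpha
            self.last = GazeSample(raw.timestamp,
                                   a * raw.x + (1 - a) * self.last.x,
                                   a * raw.y + (1 - a) * self.last.y,
                                   raw.confidence)
        return self.last

class MockSource:
    """Stand-in for real hardware, used here only to make the sketch runnable."""
    def read(self) -> Optional[GazeSample]:
        return GazeSample(time.time(), 0.52, 0.48, 0.9)

gaze = StableGaze(MockSource())
print(gaze.read())
```

Swapping the PupilLabs add‑on for the headset‑integrated sensor would then only require a new GazeSource implementation, leaving the VR scenes untouched.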
Building on this foundation, the team created three realistic 3D environments (a supermarket, a living room and an urban street segment with basic traffic simulation) and developed a parametric shader that renders scotomas in real time based on gaze data. These environments served as testbeds for diagnostic and assistive scenarios. For diagnostics, the project implemented a standardised vision‑function test (AP2) and an implicit perimetry method using gaze tracking (AP3). Diagnostic data generated by Blickshift were visualised within the demonstrator on a virtual monitor, a 3D eye model and a height map, providing clinicians with intuitive feedback.
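The scotoma shader was realised in Unity; the following Python/NumPy sketch only illustrates the underlying idea of a gaze‑contingent mask under the assumption that each pixel is attenuated according to its distance from the current gaze point, with the scotoma's size and edge softness exposed as parameters. The function name and the parameter values (radius, softness) are illustrative, not the project's actual shader code.

```python
import numpy as np

def scotoma_mask(uv_x, uv_y, gaze_uv, radius=0.08, softness=0.03):
    """Per-pixel attenuation: 0 inside the simulated scotoma, 1 outside,
    with a smoothstep falloff of width `softness` at the border."""
    d = np.sqrt((uv_x - gaze_uv[0]) ** 2 + (uv_y - gaze_uv[1]) ** 2)
    t = np.clip((d - radius) / softness, 0.0, 1.0)
    return t * t * (3.0 - 2.0 * t)

# Apply to a rendered frame (H x W x 3, values in [0, 1]); gaze point given in UV space.
h, w = 720, 1280
ys, xs = np.mgrid[0:h, 0:w]
mask = scotoma_mask(xs / w, ys / h, gaze_uv=(0.5, 0.5))
frame = np.random.rand(h, w, 3)        # placeholder image standing in for the rendered scene
occluded = frame * mask[..., None]     # darken pixels inside the simulated scotoma
```

In the real‑time case the gaze point is updated every frame from the eye tracker, so the masked region follows the user's gaze across the scene.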
Assistive features were added in later phases. AP4 integrated a gamified training module from IFA‑UT, controllable via a Vive controller, which required extensive changes to the scene‑loading and input systems. AP5 introduced information‑substitution tools that help users read wall clocks and recognise facial emotions; these were realised through ray‑casting onto scene objects such as a wall clock and a face shown on a television, as sketched below. User‑interface development (AP6) produced adaptive interfaces for patients with age‑related macular degeneration, including gamified everyday applications, all designed and implemented in Unity. In parallel, AP7 focused on multi‑user interfaces, defining synchronous and asynchronous communication protocols between the demonstrator, the diagnostic scenes and the front end and back end, with close involvement from Talkingeyes and Blickshift.
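As an illustration of the information‑substitution mechanism in AP5, the minimal Python sketch below casts a gaze ray against bounding spheres of tagged scene objects and, on a hit, looks up a substitution handler (for example announcing the time or the displayed emotion). The object names, geometry and handler texts are hypothetical; the actual implementation used ray‑casting inside the Unity demonstrator scenes.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class SceneObject:
    name: str            # e.g. "wall_clock", "tv_face"
    center: np.ndarray   # world-space position
    radius: float        # bounding-sphere radius used for the intersection test

def first_hit(origin, direction, objects):
    """Return the closest object whose bounding sphere is hit by the gaze ray."""
    direction = direction / np.linalg.norm(direction)
    best, best_t = None, np.inf
    for obj in objects:
        oc = origin - obj.center
        b = np.dot(oc, direction)
        c = np.dot(oc, oc) - obj.radius ** 2
        disc = b * b - c
        if disc < 0:
            continue                      # gaze ray misses this object
        t = -b - np.sqrt(disc)
        if 0 < t < best_t:
            best, best_t = obj, t
    return best

# Hypothetical substitution handlers keyed by object name.
HANDLERS = {
    "wall_clock": lambda: "It is 14:35.",
    "tv_face":    lambda: "The person looks happy.",
}

scene = [SceneObject("wall_clock", np.array([2.0, 1.8, 0.0]), 0.3),
         SceneObject("tv_face",    np.array([0.0, 1.2, 3.0]), 0.4)]

hit = first_hit(origin=np.array([0.0, 1.6, 0.0]),
                direction=np.array([0.78, 0.08, 0.0]),  # current gaze direction
                objects=scene)
if hit and hit.name in HANDLERS:
    print(HANDLERS[hit.name]())           # substituted information, e.g. read aloud to the user
```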
The culmination of the project was the functional demonstrator (AP9), built in collaboration with MPI, LMU and Blickshift. Key scenarios were implemented and evaluated with end users, and a commercialisation concept was drafted in partnership with the University of Tübingen, Zeiss and Talkingeyes. Throughout the project, the consortium maintained a tight schedule, meeting on the last Wednesday of every month and holding additional workshops in Düsseldorf, Munich, Stuttgart, Erlangen, Tübingen and other locations to ensure alignment and rapid iteration.
In summary, the project delivered a robust, gaze‑contingent AR/VR framework that integrates eye‑tracking hardware, realistic patient environments, diagnostic and assistive modules, and both single‑ and multi‑user interfaces. The technical achievements—particularly the successful adoption of Valve’s integrated eye‑tracking headset and the real‑time scotoma shader—provide a solid foundation for future clinical applications and commercial deployment.
