The DaSKITA project “Data Sovereignty through AI‑Based Transparency and Disclosure” aimed to give consumers greater informedness and self‑determination in data‑driven services. Subproject 2, led by iRights.Lab, focused on creating a machine‑readable representation of transparency information, evaluating consumer‑enabling technologies, and disseminating results. The project ran from 1 January 2020 to 31 March 2023 and was funded under the German research grant FKZ 28V2307B19.
A central technical outcome was the development of a formal transparency‑information language and an accompanying toolkit, together known as the Transparency Information Language and Toolkit (TILT). The language was derived from a detailed analysis of the transparency obligations in the General Data Protection Regulation (GDPR). Unlike traditional legal privacy notices, it provides a structured, machine‑readable format that can be parsed and adapted automatically. The toolkit was fully implemented and enables the conversion of real privacy statements into the new representation. Several use cases demonstrated that the language can express GDPR‑compliant transparency information and supports automated processing of large volumes of data. The result is a privacy‑engineering solution that developers can use to embed transparency and disclosure into modern web systems.
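To make the idea concrete, the Python sketch below parses a minimal machine‑readable transparency document. The JSON layout and field names (meta, controller, dataDisclosed, and so on) are a simplified illustration loosely modeled on the published TILT schema, not a verbatim excerpt from it.

    import json

    # A minimal, illustrative transparency document. Field names are
    # simplified assumptions; the actual TILT schema is more extensive.
    tilt_document = """
    {
      "meta": {"name": "Example Shop", "language": "en"},
      "controller": {"name": "Example Shop GmbH", "country": "DE"},
      "dataDisclosed": [
        {
          "category": "E-mail address",
          "purposes": ["Newsletter delivery"],
          "legalBases": ["GDPR-6-1-a"],
          "storagePeriod": "Until consent is withdrawn"
        }
      ]
    }
    """

    doc = json.loads(tilt_document)

    # Because the notice is structured data rather than legal prose,
    # a program can answer questions about it directly.
    for item in doc["dataDisclosed"]:
        print(f"{item['category']}: purposes={item['purposes']}, "
              f"legal bases={item['legalBases']}")

Because every conforming service publishes the same structure, the same parsing code works across documents, which is what makes automated processing at scale possible.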
Evaluation of the consumer‑enabling technologies (CETs) was carried out with a user group that included consumers, developers, and data‑protection experts. The study confirmed that the technologies effectively increased transparency, but it also revealed that transparency alone is insufficient. Consumers need contextual information about the implications of the disclosed data to interpret it meaningfully. This finding points to the need for further research on how to present implications alongside raw transparency data.
The project also produced a corpus of machine‑readable transparency information, which serves as a reference for future research and tool development. The TILT toolkit and the corpus together provide a foundation for automated compliance checking and for building user interfaces that present privacy information in a clear, actionable way.
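As a rough sketch of what automated compliance checking over such a corpus might look like, the Python function below scans the dataDisclosed entries of a transparency document for per‑category fields that GDPR transparency obligations (Articles 13 and 14) typically demand. The field names match the illustrative document above and are assumptions, not the official schema.

    # Required per-category fields, loosely derived from GDPR Art. 13/14
    # transparency obligations; names follow the illustrative document above.
    REQUIRED_FIELDS = ("purposes", "legalBases", "storagePeriod")

    def check_document(doc: dict) -> list[str]:
        """Return human-readable findings for one transparency document."""
        findings = []
        for i, item in enumerate(doc.get("dataDisclosed", [])):
            category = item.get("category", f"entry {i}")
            for field in REQUIRED_FIELDS:
                if not item.get(field):
                    findings.append(f"{category}: missing '{field}'")
        return findings

    # Example: a document whose only entry lacks a storage period.
    incomplete = {"dataDisclosed": [{"category": "IP address",
                                     "purposes": ["Fraud prevention"],
                                     "legalBases": ["GDPR-6-1-f"]}]}
    print(check_document(incomplete))  # ["IP address: missing 'storagePeriod'"]

Applied across the whole corpus, a check of this kind flags incomplete disclosures at scale; the same structured data can also drive user interfaces that surface privacy information clearly.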
Outreach activities were an integral part of the project. Five events were held, all conducted via Zoom due to pandemic restrictions. These workshops attracted 150 participants from diverse stakeholder groups, including consumers, website operators, companies, and developers. Eleven experts from academia, industry, and civil society spoke at the events, underscoring the relevance of transparency and disclosure. Attendance was affected by the online format, with a 20–30 % drop compared to in‑person events, but the sessions still facilitated meaningful dialogue.
Collaboration was led by iRights.Lab, which was responsible for work packages on detailed requirements analysis, impact assessment, and regulatory implications. The project partnered with the Technical University of Berlin, which contributed research expertise and helped implement the TILT toolkit. The Berlin‑based start‑up Dilecy joined the consortium to analyze corporate disclosure behavior on a larger data set; however, its second seed round failed, limiting the scope of that contribution. Additional partners included the University of the Arts Berlin and the German Data Protection Association, which provided multidisciplinary perspectives and helped shape the design of user‑centric solutions.
Overall, the project delivered a technically robust, GDPR‑aligned language and toolkit for transparency information, a set of validated consumer‑facing tools, and a corpus for future research. The collaborative effort combined academic research, industry practice, and civil society input, and its results can be adopted by developers and companies to enhance user sovereignty in data‑driven services.
