Digital platforms rely heavily on harvesting end-user data to provide personalised content. As people learn how their data is used to predict their behaviour, keep them hooked on content, and influence their worldviews, they grow increasingly concerned about not knowing what such data reveal, for what purpose, and to whom. Yet digital platform organisations need data to sustain their business models. This data is of great value to their partners, who acquire it for purposes that too often escape the control of both end-users and organisations. This is problematic: end-users are currently unable to exercise their digital rights to privacy and consent, and organisations lack the incentives or tools to safeguard both their own interests and those of their end-users.
Through empirical design research, my thesis defines a future vision desired by both end-users and organisations, identifies the frictions that hinder its achievement, and maps the value similarities and tensions that must be considered to obtain effective and meaningful consent practices and disclosure interactions. The resulting design directions are applied to the 2019 IBM-Flickr case, in which permissive Creative Commons licenses allowed the surveillance industry worldwide to collect photos of human faces. A new consent journey was created that balances the privacy considerations of end-users with the interests of the platform and its partners in creating image datasets for AI training purposes.
This graduation project was supervised by Elisa Giaccardi (DCODE), Lianne Simonse (TU Delft), and Heather Wiltse (DCODE) in collaboration with DCODE partner Open Future as part of the DCODE Labs / Digital Sovereignty Lab.