We discussed issues and ideas for internet platforms that protect and serve marginalised groups in a panel at PrivacyCamp22.
Our own take-away points:
- We discussed how privacy and data protection interact with concepts such as personal safety and freedom from exploitation and manipulation, using the examples of groups and communities with specific risk profiles: sex workers, people on the move, and children.
- We note that there is a large gap between being aware of personal risks and being able to act on that awareness; power imbalances can prevent action even where awareness exists.
- We discussed community-managed platforms and regulation from the perspective of freedom of thought as an absolute right that protects communities. Due to differences in legislation and in the empowerment of each community, none of these approaches caters equally well to all marginalised groups.
There is also a more comprehensive event summary available from the EDRi people at https://edri.org/our-work/privacycamp22-event-summary/.
Many internet services are designed to collect personal data and to exploit established and novel marketing techniques that nudge users into surrendering ever more of their undivided attention. While users may expect that, in return for their data and attention, they will receive content tailored to their interests, what they get is content selected and moderated according to the services' business interests, irrespective of user enjoyment or societal welfare. Indeed, many internet services enforce moral and societal frameworks that their target audience may neither be subject to nor agree with. Instead of serving the needs of users and treating them as ends, ad-driven services objectify users as means for profit, reducing their purpose to that of consumers who are to be manipulated into consuming more of the specific content chosen by international corporations.

Thus, to build respectful technologies free from structural exploitation, we must go beyond considerations of data privacy and examine the ways in which technology fails to meet users' expectations of what they will receive in return for their personal data and engagement. Specific issues affecting certain groups in society show that detrimental and discriminatory effects are pervasive, and they indicate that the underlying problems require novel approaches to regulation and community-driven platform governance.
Examples of these effects include:
- Children: Manipulative online services aimed at children do not cater to their best interests and may pose a threat to their development and freedom of thought.
- Sex workers: Digital services frequently deplatform and censor discussions of sex and sex work, preventing sex workers who do legal work from being visible to broader society, and from accessing supportive communities, harm-reduction information, and digital financial services.
In this panel we will look at digital infrastructures reflecting the needs of two of these groups: children and sex workers. Our analysis is driven by the understanding that a sole focus on privacy and data protection may not be the appropriate way to regulate digital platforms and to guarantee a safe environment for users. We will discuss personal, legal, and technological aspects of personal safety, internet governance, and regulatory ideas beyond the General Data Protection Regulation and the Digital Services Act, working towards new community-driven infrastructures that cater for intersectional justice. Specifically, we want to explore the boundary where the limits of regulation and community-driven privacy tools clash with platform governance.
- Elissa Redmiles, Research Faculty, Max Planck Institute for Software Systems, DE
- Tommaso Crepax, IT Researcher, Scuola Superiore Sant'Anna, IT
- Patricia Garcia, Assistant Professor, School of Information, University of Michigan, USA (Patricia couldn’t make it in the end)
- Lola, Utsopi, BE
- Jan Tobias Muehlberg, Research Manager at imec-DistriNet, KU Leuven, BE
Last modified: 2022-05-09 22:20:35 +0200