User Information
Christopher Chagnon
Username: christopher.chagnon@helsinki.fi
Email: christopher.chagnon@helsinki.fi
Biography: Christopher W. Chagnon is a doctoral researcher in Global Development Studies at the University of Helsinki, and is affiliated with the University of Zambia. His research focuses on extractivist approaches to personal data, the social damages they cause, and the resistances/alternatives they stimulate, specifically in the context of Zambia.
Thank you for your comments and excellent questions! In response:

1) This is a very important question, though I will be a bit pedantic to start my response. I think that in the case of citizen science or researchers collecting data in local communities, the risk would be more of what Grosfoguel terms "epistemic extractivism" (I've also seen people use the terms "data extractivism", "academic extractivism", and "research extractivism") rather than (personal) data extractivism, as it generally doesn't depend on the mass automated harvesting of personal data (though it certainly can, depending on the sources). This is actually something that a colleague and I have been wanting to write about, to delineate/define it more. However, I do think that social pollution can definitely come from this. Overall, I think that for researchers and citizen scientists, the big thing is approaching communities in an open and responsible way; including communities in the research design, rollout, and collection; handling the data in ways agreed upon with the community (anonymity, who gets to access the data, who gets copies of the data, etc.); and ensuring that all outputs from the data are shared with the community (and, if this is for something profit-generating, being clear about that from the beginning, with money primarily going to the community that provided the data). I think that the Te Mana Raraunga (Māori Data Sovereignty Network) guidelines, and those of the Global Indigenous Data Alliance, provide some great approaches to the collection and use of data.

2) This is one of the biggest and toughest questions.
In an ideal world, an open-source, not-for-profit social network that focuses on privacy and accessibility would take off, and technologies like the PinePhone (which is modular, allowing for less e-waste and also greater control over what data is being harvested/collected) and Qubes OS (which allows for compartmentalization of activities on a computer to enhance security and reduce data harvesting) would be more widespread. Sadly, however, these remain niche, and many alternative technologies can be daunting for the general public to approach, as they can be more technical. Also, if people and groups want to share their message, they need to be on platforms where there are people to receive that message. That user base and ease of use give platforms like Facebook such a leg up.

In my own research, as well as some research I've helped out with, I've seen how in a lot of contexts in Sub-Saharan Africa it is the norm for activists, organizations, and small businesses to have a Facebook page (often instead of a website). It makes sense: one doesn't need to pay for it, it's easy to do, and most people getting online in those contexts are on Facebook. If you're going to get a message out, you need to go where there are people to hear it.

So, unfortunately, at this point I think that balancing the risks comes down to the individuals/organizations. Sensitization about the risks, how to mitigate them, how platforms operate, and how to act responsibly is about the best that can be done right now. Though, to be clear, while I think there is a degree of individual responsibility and ethics when engaging on any digital platform, I do think there should be much more of an onus on the platforms/corporations to be better, and on governments to hold those corporations accountable. I mention this because of how often I still see people say, "It's all right there in the terms of service! It's people's own fault for not reading them. If they don't like it, they can just not use the service!" as a way to shield corporations from criticism and responsibility.