AI Ethics: Data centralization and privacy

  • Posted on: 24 February 2019
  • By: Juho Vaiste

Established approaches to privacy have become less and less effective because they are focused on previous metaphors of computing, ones where adversaries were primarily human (AI Now, 2017).

As the title suggests, the data concern can be divided into two parts. Firstly, everyone has the right to privacy and to trust that his or her private and personal data is managed justly and securely, a right that goes back as far as the United Nations’ Universal Declaration of Human Rights (IEEE, 2018). Secondly, the increased use of internet services and now AI-powered systems raises the worry that our private data is becoming too centralized in the hands of a few technology companies.

Similar worries can be raised about all kinds of data, but personal and private data is the primary area of concern. Businesses are becoming more data-centered and investing heavily in data gathering and analytics (Economist, 2017). The pace of this development is so fast that ethical concerns are surely justified. The Internet Society (2017) encourages the developers and owners of AI systems to follow the principle of “data minimization”: AI systems should collect only the data they need and delete it when it is no longer needed.
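
As a rough illustration of what data minimization can mean in practice, the Python sketch below keeps only the fields a hypothetical recommendation feature actually needs and deletes stored records once an assumed 90-day retention period has passed. The field names, retention period, and in-memory store are illustrative assumptions, not taken from the Internet Society guidelines.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical set of fields a recommendation feature actually needs;
# everything else in an incoming record is discarded before storage.
REQUIRED_FIELDS = {"user_id", "language", "last_purchase_category"}

# Assumed retention period after which stored records are deleted.
RETENTION_PERIOD = timedelta(days=90)


def minimize(record):
    """Keep only the fields the feature needs (collection minimization)."""
    return {key: value for key, value in record.items() if key in REQUIRED_FIELDS}


def purge_expired(store, now=None):
    """Delete stored records whose retention period has passed (deletion)."""
    now = now or datetime.now(timezone.utc)
    expired = [user_id for user_id, record in store.items()
               if now - record["stored_at"] > RETENTION_PERIOD]
    for user_id in expired:
        del store[user_id]


# Example: strip unneeded fields before storing, then purge on a schedule.
incoming = {
    "user_id": "u123",
    "language": "fi",
    "last_purchase_category": "books",
    "home_address": "should never be stored for this feature",
    "browsing_history": ["..."],
}
store = {"u123": {**minimize(incoming), "stored_at": datetime.now(timezone.utc)}}
purge_expired(store)  # nothing is removed yet; the record is younger than 90 days
```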

Data seems to be centralizing in the hands of a few major companies, and the big technology companies possess an awful lot of information about us. A report by the UK Government states that the ways data is gathered and accessed must change, both so that citizens can protect their data better and to provide fair and reasonable access to data for smaller companies and academia (UK Government, 2018).
Technology corporations possess large datasets on their users, but the public is not very aware of this. One way to develop data awareness is better education and communication, as noted in the reports by the UK Government and the IEEE. The UK report also calls for “legal and technical mechanisms for strengthening personal control over data” (UK Government, 2018), to which IEEE’s proposal for “data privacy warnings” (IEEE, 2018) is one possible answer.

Still, data centralization remains only an unpleasant risk, at least in Western countries. The developments taking place in China force us to consider more dangerous scenarios (Future of Humanity Institute, 2018; Wired, 2017). It seems likely that citizens of Western countries do not want a world where a person is seen only as a large dataset and where one’s actions and behavior are predictable and modifiable.

Added sources and references:

Smith, H. J., Dinev, T., & Xu, H. (2011). Information privacy research: an interdisciplinary review. MIS Quarterly, 35(4), 989-1016.

AI Now (2017). AI Now 2017 Symposium Report. https://ainowinstitute.org/AI_Now_2017_Report.pdf

Zuboff, S. (2019). The age of surveillance capitalism: The fight for a human future at the new frontier of power. Profile Books.
- Chapter summaries: https://openphilosophy.be/reading-group/

Zuboff, S. (2015). Big other: surveillance capitalism and the prospects of an information civilization. Journal of Information Technology, 30(1), 75-89. https://link.springer.com/article/10.1057/jit.2015.5

Trask, A. (2017). Safe Crime Detection: Homomorphic Encryption and Deep Learning for More Effective, Less Intrusive Digital Surveillance. https://iamtrask.github.io/2017/06/05/homomorphic-surveillance/