This project was inspired by something that happened to me and my close friends: when you say something that the platform or the government disagrees with, your content is deleted, or your account is even blocked, and your data is not under your own control. In this project I reflect on the tyranny of big data and on the power and privacy of the individual in the age of intelligence, and I emphasise a ‘self-blurring’ strategy of resistance, developed below as ‘self-opacity’, that shifts the balance of power between the tyrant and the public.
Introduction
With the EU’s General Data Protection Regulation (GDPR.EU, 2018) becoming law, the field of data security and privacy has changed dramatically. Companies in the U.S. and abroad that do business in the European Union, whether directly or virtually, are obligated to implement a series of strict privacy protections (Faitelson, 2018). EU consumers can now request a snapshot of all their personal data. More importantly, they have the right to be forgotten: their data must be erased on demand.
However, the fact is that ‘data’ is never completely safe. In 2019, MySpace profiles famously vanished, and that is just one example: many Google services have likewise been shut down over the years. Moreover, some online data-storage companies assert that they keep people’s data secure, yet ironically specify in their terms that certain data may be deleted. In practice, users hold no real initiative.
Perhaps it is time to reconsider the power and privacy of individuals under the tyranny of big data, and to explore a new strategy to shift the balance of power between data autocrats and the public.
Main body
Data that is opaque to users
“The Internet is only permanent when you don’t want it to be.” ——Pitt, 2020
Today, the notion of algorithmic objectivity is also challenged by counter-discourses related to general concerns such as automation, corporate accountability, and media monopoly (Tufekci, 2015). As we use computational tools as the primary medium of expression and digitize not just mathematics but all information, we are subjecting human discourse and knowledge to procedural logic. In the end, there may be something impenetrable about algorithms (Gillespie, 2013).
Opacity appears to be at the heart of new concerns about ‘algorithms’ among legal scholars and social scientists. It takes three distinct forms (Burrell, 2016): (1) Opacity as intentional corporate or institutional self-protection and concealment, and, along with it, the possibility of knowing deception; (2) Opacity stemming from the current state of affairs, in which writing (and reading) code is a specialist skill; (3) Opacity arising from the characteristics of machine-learning algorithms themselves and the scale required to apply them usefully.
Since algorithms are driven by their inherent commercial interests (O’Neil, 2020, cited in Orlowski, 2020), we need a socio-technical approach that looks at how they are applied to specific groups of real-world users (and the data those users generate), the so-called ‘algorithms in the wild’. For a long time, such information has not been fully available to those subject to it. There are recurring concerns about the composition of the data, as well as about privacy and the possibility (or, disturbingly, the impossibility) of opting out.
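To make the third of these forms concrete, consider how little a trained model’s internals reveal even under full access. The following minimal sketch is my own illustration, not drawn from the cited works; the synthetic data, the toy logistic-regression model, and every name in it are assumptions made purely for demonstration.

```python
import numpy as np

# A deliberately transparent toy setting: synthetic 'users' with 5 features.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))                   # 200 users, 5 features each
true_w = np.array([1.5, -2.0, 0.0, 0.7, -0.3])  # hidden rule generating labels
y = (X @ true_w + rng.normal(scale=0.5, size=200) > 0).astype(float)

# Fit logistic regression by plain gradient descent on the log-loss.
w = np.zeros(5)
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(X @ w)))          # predicted probabilities
    w -= 0.1 * (X.T @ (p - y)) / len(y)         # gradient step

# 'Full access' to this model is just the vector below. Nothing in it states,
# in human terms, why any individual was classified as they were -- and
# production systems hold millions of such weights, not five.
print("learned weights:", np.round(w, 2))
```

Even in this fully transparent toy, the learned numbers carry no human-readable rationale; at production scale, this becomes Burrell’s third form of opacity, opaque even to the engineers who built the system.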
The right of ‘self-opacity’
“Self-opacity is a concept with ‘fuzzy edges’.” ——Jaeggi, 2014
‘Self-opacity’ means that one is unable to give an account of one’s everyday activity (Saji, 2009). Perhaps exploring opacity as both a strategy and a material condition might address two intertwined issues of our time: technological control and embodied materiality (Blas, 2018). Therefore, the subject historically constructed as the other should have ‘the right of opacity’ (Glissant, 1990): the right of complete autonomy.
Conclusion
The disappearance of content from the Internet means that, whatever the purpose, the authorities behind the Internet remain invisible. This ‘invisible empire’ resembles a technological collective. While tech companies are under intense scrutiny from government regulators and the media, such a quasi-sovereign entity is difficult to detect and supervise under current laws. Since the average person does not understand how technology products and services work, technology companies can exploit this advantage. That is to say, the average user is at a distinct disadvantage when faced with the opacity strategies of digital authorities.
Self-opacity is like an ‘eye for an eye’ strategy of resistance: a low-cost, highly effective form of experiment. Its coexistence of visibility and invisibility has strong political implications. It is therefore necessary to guide the audience to reflect on the tyranny of big data and on the power and privacy of individuals in the age of intelligence by exploring the resistant strategy of ‘self-opacity’, and so to try to shift the balance of power between data autocrats and the public.
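To show concretely what a low-cost ‘self-opacity’ experiment can look like, the sketch below imitates the decoy-query idea behind TrackMeNot (Howe and Nissenbaum, 2012): genuine searches are buried in a stream of plausible fakes, so that an observer logging the stream cannot tell which queries express real intent. The decoy pool, the rates, and the function names here are my own simplifications, not the actual extension’s behaviour.

```python
import random
import time

# A static pool of innocuous decoy topics. TrackMeNot itself refreshes its
# decoys dynamically (e.g. from news feeds); a fixed list is a simplification.
DECOYS = [
    "weather tomorrow", "pasta recipes", "local bus timetable",
    "history of printing", "how do tides work", "cheap flights",
]

def obfuscated_stream(real_queries, decoys_per_real=4):
    """Yield each real query hidden among shuffled decoys."""
    for real in real_queries:
        batch = random.sample(DECOYS, k=decoys_per_real) + [real]
        random.shuffle(batch)                  # no ordering cue for observers
        for query in batch:
            yield query

if __name__ == "__main__":
    for q in obfuscated_stream(["symptoms of flu", "protest rights uk"]):
        # A real tool would send q to the search engine; here we only log it.
        print("sent:", q)
        time.sleep(random.uniform(0.1, 0.3))   # jitter masks timing patterns
```

The point is noise rather than anonymity: the user’s record still exists, but its signal value to a profiler collapses, which is precisely the shift in the balance of power argued for above.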
References
1) Baraniuk, C., 2021. The online data that’s being deleted. BBC. [online] Available at: <https://www.bbc.com/future/article/20210715-the-online-data-thats-being-deleted> [Accessed 17th July 2021].
2) Blas, Z., 2018. Informatic Opacity. In: Rosi Braidotti and Maria Hlavajova, eds. Posthuman Glossary. London: Bloomsbury Academic.
3) Burrell, J., 2016. How the machine ‘thinks’: Understanding opacity in machine learning algorithms. Big Data & Society, [e-journal] 3(1), pp.1-11. DOI: 10.1177/2053951715622512.
4) Dewey-Hagborg, H., 2015. DNA Spoofing. Available at: <https://vimeo.com/116299297> [Accessed 23 June 2021].
5) Faitelson, Y., 2018. Why ‘Right To Delete’ Should Be On Your IT Agenda Now. Forbes. [online] Available at: <https://www.forbes.com/sites/forbestechcouncil/2018/10/22/why-right-to-delete-should-be-on-your-it-agenda-now/?sh=6c517b491b7f> [Accessed 17th July 2021].
6) GDPR.EU, 2018, Complete guide to GDPR compliance. GDPR.EU. [online] Available at:<https://gdpr.eu/> [Accessed 6 February 2021].
7) Gillespie, T., 2013. The Relevance of Algorithms. In: Media Technologies: Essays on Communication, Materiality, and Society. Cambridge: The MIT Press, p.167. DOI: 10.7551/mitpress/9780262525374.003.0009.
8) Glissant, E., 1990. Poetics of Relation. Ann Arbor: University of Michigan Press.
9) Haraway, D. J., 1985. A Manifesto for Cyborgs. Socialist Review, 80, pp.65-108.
10) Howe, D. C., Nissenbaum, H. and Zer-Aviv, M., 2015. AdNauseam. [online] Available at: <https://rednoise.org/daniel/adnauseam> [Accessed 27th July 2021].
11) Howe, D. C. and Nissenbaum, H., 2012. TrackMeNot. [online] Available at: <https://rednoise.org/daniel/trackmenot> [Accessed 27th July 2021].
12) Jaeggi, R., 2014. Alienation. New York: Columbia University Press.
13) Milberry, K., 2014. In: Megan Boler and Matt Ratto, eds. DIY Citizenship: Critical Making and Social Media. Cambridge: The MIT Press.
14) Orlowski, J., 2020. The Social Dilemma. [film] Netflix.
15) Pasquale, F., 2015. The Black Box Society: The Secret Algorithms That Control Money and Information. Cambridge: Harvard University Press.
16) Pitt, S., 2020. The Internet Is Only Permanent When You Don’t Want It to Be. OneZero. [online] [Accessed 15th July 2021].
17) Sahbaz, U., 2019. Artificial Intelligence and the Risk of New Colonialism. Journal of International Relations and Sustainable Development, 2019(14), pp.58-71.
18) Saji, S., 2009. Three aspects of the self-opacity of the empirical subject in Kant. Philosophy & Social Criticism, [e-journal] 35(3), pp.315-337. DOI: 10.1177/0191453708100233.
19) Tufekci, Z., 2015. Algorithmic harms beyond Facebook and Google: Emergent challenges of computational agency. Colorado Technology Law Journal, 13, pp.203-218.