The Entangled Nature of Human and Artificial Intelligence
produced by: Sarah Cook
This essay was written in response to the Cognitive Sensations 2021 programme 'The Downloadable Brain', which examines the biological connection between humans and technology and asks what we can expect as brain-reading technology becomes increasingly embedded within society.
We often approach the notion of a ‘Downloadable Brain’ from a highly futuristic standpoint. Stories in the media are filled with speculation around mind augmentation and immortality. We imagine super-intelligence embedded in our brains and digital humans living forever in the cloud. In our pursuit of a speculative future filled with high-tech tools, wires, and implants, we overlook the reality of our present. We fail to recognise that our malleable brains are already being augmented by the technology we interact with, and to question the motives and implications behind it.
This paper aims to consider the theme of 'The Downloadable Brain' in the current climate of the Digital Age: surveillance capitalism, the attention economy and human capital. It explores mind augmentation through the notion of 'humans as natural-born cyborgs' and the extended mind thesis, questioning the neurological impact of our digital tools and their subsequent impact on our autonomy.
To explore the idea of a downloadable brain, we first need to understand that our brains are fluid and constantly evolving organs. They are influenced by and adapt to our environment, with the ability to physically remodel and restructure themselves. This interface between the brain and the physical environment, crucial to our evolution and survival, is made possible by neuroplasticity - the ability of neural networks in the brain to change through growth and reorganization (Park and Huang, 2010).
Next, we need to consider what makes up our thinking system. Though we often consider cognition to take place exclusively inside the brain, there are many theories that dispute this. The extended mind thesis proposed by Andy Clark and David Chalmers (1998) identifies a mental apparatus that extends far beyond our biological bodies into the physical world around us. Objects and tools form part of our thinking system – spoken words and counting become written text and numerals; calculators aid our ability to solve complex maths problems. This cognitive hybridisation is not a modern development but a longstanding aspect of humanity. We are 'natural-born cyborgs', with thinking and reasoning systems spread across biological brains and non-biological circuitry (Clark, 2003).
The combination of the natural plasticity of our minds and the extended mind thesis suggests that our tools do more than allow us to store our thoughts or transmit ideas. They constitute ‘mind upgrades’ in which the architecture of the human mind is transformed through neural reconfigurations (Wheeler, 2011). This process can alter our cognitive processes, changing the way we store information and perceive the world.
For example, in today’s networked environment our brains tend not to store information but to remember how to access that information using technology - demonstrating an adaptive use of memory in which computers and search engines form an external memory system (Wheeler, 2011). Though it’s clear how this might be beneficial in a world where we are flooded with information daily, it also illustrates the power our tools can have over our minds - highlighting the need to be critical of the technology we use and the intentions of its creator.
In the digital age the optimisation of our brains to the environment is a double-edged sword, particularly when that environment is undesirable or the plasticity of our brains is exploited by external parties without our consent (or in many cases, knowledge). As our tools become 'smarter' they become increasingly opaque, not only through complexity but also through proprietary software licences that restrict access to their inner workings. We no longer have ownership or control over our tools, yet they exert unprecedented levels of influence over our daily lives. There is an underlying power dynamic between humans and their tools that did not previously exist. They are no longer passive actors but objects with intent – intent that is not always clear to the user or aligned with their motives and desires.
Technology is routinely designed to be habit-forming, exploiting the plasticity of our brains to create addictive products (Eyal, 2014). Our attention is treated as a scarce commodity that companies such as Facebook, YouTube and Netflix invest significant resources in capturing – and retaining. As Shoshana Zuboff highlights in 'The Age of Surveillance Capitalism', the current mode of digital capitalism is reliant on behavioural surplus. Human experience is claimed by big data companies as free raw material to be processed and sold as prediction products to companies hoping to profit from our future behaviour (Zuboff, 2019). Algorithms created by big data companies attempt to profile us and predict our actions, deciding which adverts or news stories we are most likely to click on, or which political campaign tactics we are most likely to respond favourably to. Our 'environment' becomes much more fluid as reality is tailored to us. Algorithms that respond to our online profiles alter our virtual environment and our experience – and in turn, augment our malleable minds. Our profiles are not only predictions of who we are, but indicators of who we may become.
Yet as our virtual environment shapes us, we too shape our virtual environment. Algorithms used by online platforms are reliant on the vast amounts of data we provide to make accurate predictions of our behaviour. Much as behavioural surplus is exploited in the surveillance capitalism business model, human cognitive capital is exploited in the design process of artificial intelligence. Many of our online interactions – product reviews, personality questionnaires, social media hashtags – contribute to the development of machine learning algorithms. A prominent example is Google's use of CAPTCHAs (Completely Automated Public Turing test to tell Computers and Humans Apart). CAPTCHAs were initially designed to limit online bot traffic by presenting website visitors with a task that is easy for a human but difficult for a bot to complete, such as identifying an image. However, they are increasingly used as a tool to train AI systems. As we respond to demands to prove our humanity, the by-product we provide is a dataset of labelled images that can be used to improve the accuracy of computer vision systems. The activities of users form part of the extended cognition of the artificial intelligence as we become ever more entwined in a recursive feedback loop between human and machine – less cogs in a machine than neurons in an artificial mind.
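The labelling mechanism described above can be sketched in a few lines of Python. This is a minimal illustration of the general idea, not Google's actual implementation; the function names and the majority-vote threshold are assumptions made for the example.

```python
from collections import Counter

def record_selection(votes: Counter, selected_images: list) -> None:
    """Tally one visitor's CAPTCHA selections as votes for the label 'brain'."""
    votes.update(selected_images)

def labelled_dataset(votes: Counter, total_visitors: int, threshold: float = 0.5) -> set:
    """Treat an image as reliably labelled once a majority of visitors selected it."""
    return {img for img, n in votes.items() if n / total_visitors >= threshold}

# Three hypothetical visitors complete the CAPTCHA.
votes = Counter()
for selection in (["img_01.jpg", "img_04.jpg"],
                  ["img_01.jpg", "img_07.jpg"],
                  ["img_01.jpg", "img_04.jpg"]):
    record_selection(votes, selection)

# img_01 (3/3 votes) and img_04 (2/3 votes) pass the majority threshold;
# img_07 (1/3 votes) does not.
print(labelled_dataset(votes, total_visitors=3))
```

Each visitor simply proves they are human, but in aggregate their selections become a consensus-labelled training set – the 'behavioural surplus' of the interaction.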
Artificial intelligence is not a separate and autonomous entity but a conglomeration of human intelligence and the data we provide. Neural networks, modelled on the structure of our brains, share our biological vulnerabilities as they shift and respond to human input. The computational artefact I have produced for this assignment aims to highlight the symbiotic, co-dependent relationship between organic and artificial minds by borrowing from the techniques and processes used in the production of artificial ‘intelligence’.
The computational artefact I have created is 'Artificial Evolution', a continually evolving artificial organism in the form of an artificial mind feeding on visitor interaction. Artificial Evolution is a neural network that assimilates human knowledge into its make-up. Visitors form a part of its mental apparatus, helping it to grow and shift in real time.
A provocation on the relationship between human and machine, Artificial Evolution explores the symbiotic relationship between humans and technology in the era of artificial intelligence. Inspired by Andy Clark's 'Natural-Born Cyborgs' and the extended mind thesis, this computational artefact aims to highlight the entangled nature of our minds and our tools, questioning our position in collective cognition as online activities are exploited to fuel the evolution of artificial minds.
To view the work, visitors must first complete an image-based CAPTCHA that asks them to select images of brains. The images they select are fed into a neural network that is attempting to generate an image of a brain. As more visitors interact and its dataset grows larger, its output becomes increasingly realistic.
- Images of brains and other tools that form part of our extended cognition were scraped from Google images using a Python script and added to a folder for sorting.
- A neural network is trained on the dataset using Keras, an open-source software library that provides a Python interface for artificial neural networks.
- The neural network tries to produce images of brains; initially these are not very accurate due to the noisy, unsorted dataset.
- The images produced by the neural network are displayed on a webpage that requires visitors to complete a CAPTCHA in order to gain access.
- Images shown in the CAPTCHA are taken from the dataset scraped from Google in step one.
- Visitors select all the images of brains.
- The visitors' selections are recorded, and the images are added to a clean dataset folder.
- The clean dataset is fed back into the neural network.
- The neural network produces increasingly accurate images of brains based on the data it is fed by visitors.
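The steps above form a small feedback loop: visitor selections clean the dataset, and the cleaned dataset retrains the network. The sketch below is a simplified, standard-library-only illustration of steps 6–8 (the Keras training itself is stubbed out as a placeholder); the folder names are assumptions, not the artefact's actual paths.

```python
import shutil
from pathlib import Path

# Hypothetical folder layout mirroring steps 1 and 7.
RAW_DIR = Path("dataset/raw")      # images scraped from Google (step 1)
CLEAN_DIR = Path("dataset/clean")  # visitor-verified images (step 7)

def promote_selections(selected: list) -> list:
    """Copy images a visitor confirmed as brains into the clean dataset (steps 6-7)."""
    CLEAN_DIR.mkdir(parents=True, exist_ok=True)
    promoted = []
    for name in selected:
        src = RAW_DIR / name
        if src.exists():
            dst = CLEAN_DIR / name
            shutil.copy(src, dst)
            promoted.append(dst)
    return promoted

def retrain(clean_dir: Path) -> int:
    """Placeholder for step 8: in the artefact this would refit the Keras
    network on the verified images; here we just report the dataset size."""
    return len(list(clean_dir.glob("*.jpg")))
```

Each CAPTCHA completion calls `promote_selections` with that visitor's choices and then triggers `retrain`, so the network's training data – and with it the realism of its output – grows with every human interaction.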
Clark, Andy. Natural-Born Cyborgs: Minds, Technologies, and the Future of Human Intelligence. New York, Oxford University Press, 2003.
Eyal, Nir. Hooked: How to Build Habit-Forming Products. Penguin Canada, 2014.
Lindblom, Jon. "Late Capitalism and the Scientific Image of Man: Technology, Cognition and Culture." Alleys of Your Mind: Augmented Intelligence and Its Traumas, Meson Press, Hybrid Publishing Lab, Centre for Digital Cultures, Leuphana University of Lüneburg, 2015, pp. 107-122.
Park, Denise C., and Chih-Mao Huang. "Culture Wires the Brain: A Cognitive Neuroscience Perspective." Perspectives on Psychological Science, vol. 5, no. 4, 2010, pp. 391-400. doi:10.1177/1745691610374591.
Rosen, Larry. iDisorder: Understanding Our Obsession with Technology and Overcoming Its Hold. Palgrave, 2012.
Wheeler, Michael. "Thinking Beyond the Brain: Educating and Building from the Standpoint of Extended Cognition." Computational Culture, November 2011, http://computationalculture.net/beyond-the-brain/. Accessed 20 April 2021.
Zuboff, Shoshana. The Age of Surveillance Capitalism. Profile Books, 2019.