
 

Speculative Termination

 


Three Standard Stoppages, Marcel Duchamp 1913

Introduction

If the notion "all models are wrong, but some are useful" (Box, 1987) stands valid, the same notion can be applied to the nature of terms and to the nature of concepts. For example, the term and the concept 'algorithm' has been useful in the age of computational execution via pencil, paper, and the human brain as computer. However, as the nature and practice of computational execution has changed in its space-time-material entanglements, so have our understandings of computational processes.

The term and concept algorithm, while perhaps useful in theory and in history, is arguably no longer useful in contemporary speculative practices of computational execution. To use the term and the concept algorithm is to presumably ignore the impact and embeddedness of computational execution in our interlinked and intersectionally scoped reality. And to ignore the impact and embeddedness of computational execution is, perhaps, to ignore the current status of reality itself - whether the act of ignoring is rooted in ignorance or malice. Perhaps the terms and the concepts 'computational method' or 'reactive process' should enter the lexicon of artists working with/in contemporary computational practices.

What is Algorithm?

 

[Image: "algore_rhythm"]

 

"... and I'm not talking about a nightclub run by a former vice president." (Norman, 2008)

Contemporary conventional wisdom might vaguely suggest that anything or, rather, any process expressed in a programming language is an algorithm. However, the conception of the term algorithm, by the Persian mathematician Abu ‘Abd Allah Muhammad ibn Musa al-Khwarizmi in his 825 work Rules of Restoring and Equating, occurred before the creation and use of programming languages and well before the creation and standardization of mathematical notation as used today. Before the presumed formation of contemporary conventional wisdom, and before the actual origin was known, the word "algorithm" had various origin stories and pseudo-etymological perversions. My favorite of these perverse origins comes from the Renaissance, when algorithm was supposedly a combination of the terms algiros [painful] and arithmos [number] - seemingly far from the original headline of restoration and equalization [1].

Of the many definitions of algorithm, I find two instances in particular to be most revealing of algorithm’s contemporary usefulness. The first is situated within a more culture-centric context: the Jargon File. The Jargon File, the definitive guide to the North American hacker lexicon, maintains, amongst other things, a glossary indexing the most important and common terms used by arguably the most elite practitioners of computational execution. The Jargon File glossary records, updates, and maintains a range of lexical terms - from the obscure (Whorfian mind-lock, zorkmid, voodoo programming) to the foundational and everyday (bit, byte, program). However, a seemingly notable exclusion from this lexicon exists. That term is, in fact, algorithm [2].

If hackers are the master practitioners of computational execution, the absence of a definition could indicate a long-term sneaking suspicion of the, at least cultural, usefulness of the term and concept since the 1980s/90s [3.1]. On the other hand, the absence could indicate that the term is seen as a banal platitude within the practice of execution, or that the term is primarily used by theory-centric practitioners. In either case, the observation alone might warrant analysis.

A lexicon is a vocabulary of terms used in a particular subject. Rather than an encyclopedia, which is too universal, or a dictionary or glossary, which offer too short descriptions or readings of terms, a lexicon can be provisional and is scalable enough a form to adapt to any number of terms and lengths of text. In producing a lexicon for an area that is as wide, deep, and fast moving as software, one can easily make a virtue out of the necessary incompleteness of the work. (Fuller, 2008)

This first instance is an anti-definition, if such a thing exists, and while presumably a thought-provoking observation, an absence of evidence for usefulness is not the same thing as evidence of the absence of usefulness. The second instance of the term is a more grounded form for practice and theory, and it addresses the latter matter of the absence of usefulness.

The second definition comes from Donald Knuth, author of the definitive tome of computer science, The Art of Computer Programming. Knuth delivers a clear and authoritative definition of the concept of algorithm in Volume 1 - Fundamental Algorithms of The Art of Computer Programming that is widely referenced, accepted, and used in theory and in practice.

The modern meaning for algorithm is quite similar to that of recipe, process, method, technique, procedure, routine, rigmarole, except that the word "algorithm" connotes something just a little different. Besides merely being a finite set of rules that gives a sequence of operations for solving a specific type of problem, an algorithm has five important features [finiteness, definiteness, effectiveness, 0 or more inputs, and 1 or more outputs]. (Knuth, 1968) [4]

According to Knuth, an algorithm must always terminate after a finite number of steps within a ‘reasonable’ finite time. The finite number of steps can be arbitrarily large, but each of these steps must be precisely, rigorously, unambiguously defined. Knuth notes that he chooses to define each step both formally in a programming language (for computer use) and informally in natural language and symbolic imagery (for human understanding). To be deemed effective, an algorithm's steps and operations must be sufficiently basic in principle, in the sense that they can be executed in an exact manner within a reasonable finite time by a person with paper and pencil. The best algorithm - the most relatively effective in terms of quantitative behavior for a specific exercise/problem - can be determined via the time taken to execute the algorithm in terms of the average number of times a step is executed (independent of the machine's speed limitations) and via the algorithm's adaptability to other machines in terms of simplicity and elegance. An algorithm must have zero or more quantities - inputs - from a specified set of objects that are given to it either initially 'before' execution or dynamically 'during' execution. And finally, an algorithm must have one or more quantities - outputs - that have a specified relation to the input(s). (Knuth, 1968)
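To make the five features concrete, consider Knuth's own opening example, Euclid's algorithm for the greatest common divisor. The following is a minimal sketch in Python - my translation, not Knuth's step notation - with each feature marked in the comments:

    # Euclid's algorithm sketched in Python, annotated with Knuth's five features.
    # Inputs: two positive integers m and n, drawn from a specified set of objects.
    def euclid_gcd(m: int, n: int) -> int:
        while n != 0:        # finiteness: n strictly decreases, so the loop must end
            m, n = n, m % n  # definiteness: each step is precise and unambiguous
        return m             # output: one quantity with a specified relation to the inputs

    # effectiveness: every operation above could be carried out, exactly,
    # by a person with paper and pencil in a reasonable finite time
    print(euclid_gcd(119, 544))  # -> 17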

Knuth, in the process of defining the concept of algorithm in preparation for a rigorous analysis of algorithmic forms, relationally defines, in passing, the concepts of 'computational method' and of 'reactive process' as other - different - forms of computational structures. These are forms that cannot be quantitatively evaluated in the same way as algorithms.

A procedure that has the characteristics of an algorithm except that it possibly lacks finiteness may be called a computational method. Euclid originally presented not only an algorithm for the greatest common divisor of numbers, but also a very similar geometrical construction for the ‘greatest common measure’ of the lengths of two line segments; this is a computational method that does not terminate if the given lengths are incommensurable. Another example of a nonterminating computational method is a reactive process, which continually interacts with its environment. (Knuth, 1968)
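A minimal sketch of the distinction Knuth draws, with 'environment' and 'handle' as hypothetical placeholders of my own: the loop below keeps the definiteness and effectiveness of an algorithm but deliberately drops finiteness, making it a reactive process rather than an algorithm.

    # A reactive process in Knuth's sense: definite, effective steps,
    # but no finiteness - it only stops when the environment stops it.
    def reactive_process(environment, handle):
        while True:                             # no terminating condition of its own
            event = environment.next_event()    # inputs arrive *during* execution
            environment.respond(handle(event))  # outputs flow back continuously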

We can analyse the current usefulness of this 1960s-based definition of algorithm through the contemporary practice of execution. At the same time, this process can reveal the nature of current space-time-material entanglements and reveal whether another term is more useful and accurate for contemporary speculative practices of computational execution.

Algorithm Executed

Knuth’s conceptual framework for algorithm reveals a particular centricity of thought circling the singular machine, the singular human, pencil and paper, and relationally defined outputs to restricted/specific inputs. Algorithm, in terms of time, must start and end within a reasonable, human-lifespan-centered time scale and step set. Algorithm, in terms of space, is (singular) machine-centric, but human-paper-pencil validated. Algorithm, in terms of material, centers around being usable by machine and understandable by human. Algorithm, in terms of interaction (and material), centers around one or more outputs that are specifically related to zero or more inputs from a specific object set.

The problems of execution are historically situated and entangled with the contingent forces of machines, bodies, institutions, military labour practices and geopolitics, rather than simply a set of instructions that are outside of life. (Pritchard, Snodgrass, Tyżlik-Carver, 2018)

Algorithms must terminate when executed. Termination was seemingly clear when execution was performed by and within the scope of a single person with pencil and paper working in a reasonable timeframe. Reasonableness might only seem like a tolerable descriptor in this context given that some see the birth of reason as coinciding with the birth of writing (Victor, 2013). Termination continued to be somewhat clear when execution was performed on, and in the context of, an air-gapped electronic machine, usually assembled and operated in less reasonable time frames [5] by small groups of humans - typically unnamed women [6] - in closed quarters. Algorithms seemingly terminated with the termination of your shift or job or war, or with the termination and breakdown of parts of the machine.

Termination is no longer clear in the contemporary context of always-on, always-connected, always-interacting networks of networks of more than just electronic-based computation. Termination, in relation to algorithms, is now more likely to happen as a result of the complete material consumption of a power source or a terminated connection to wifi than as a completed execution of an instruction set - as the entire instruction set often cannot be reached without power or without being enmeshed in the larger network. The physical embeddedness of computing is becoming clearer in the general human consciousness. However, neither of these examples is ever considered a proper input to algorithms, as they are traditionally classified in theory-based contexts as implementation details irrelevant to termination.

An algorithm is an abstraction, having an autonomous existence independent of what computer scientists like to refer to as “implementation details,” that is, its embodiment in a particular programming language for a particular machine architecture (which particularities are thus considered irrelevant). (Fuller, 2008)

The adoption of networking protocols and technologies mimics this trend. Synchronous HTTP requests moved to (asynchronous) AJAX requests, which now move to constant event-driven flows of data via WebSockets [3.2]. This recent constantly flowing datastream might subconsciously add to the particular flavor of angst that floods the person with a dead or disconnected device that can no longer keep algorithms forever executing - as is their natural state.
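The shift is visible in code. Here is a minimal Python/asyncio sketch of the two shapes, with the feed as a stand-in of my own rather than a real WebSocket connection: the first function takes one input, returns one relationally mapped output, and terminates; the second consumes a stream that promises no final message.

    import asyncio

    # Request/response, the algorithmic shape: one input, one related
    # output, then termination.
    def fetch(url: str) -> str:
        return "response for " + url  # stand-in for a real blocking HTTP call

    # Event-driven stream, the WebSocket shape: messages keep arriving
    # for as long as the connection (and the power) holds.
    async def feed():
        n = 0
        while True:                   # no final message is ever promised
            yield f"event {n}"
            n += 1
            await asyncio.sleep(1)

    async def consume():
        async for message in feed():
            print(message)            # ends only when cut off from outside

    # asyncio.run(consume())  # runs until interrupted from outside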

As Knuth’s definition suggests, termination is not just a matter of time, but at least a matter of terminating time and instruction steps. Contemporary computational time can feel both infinite and frozen at the same time. Actions and steps are seemingly linked and unlinked to predictable and unpredictable event-driven streams.

One of the key qualities of execution as the direct experimentation with various materially directed affordances and relationalities [is that] this becomes that, and along the way, becomes something entirely else, with each execution posing further correlations, problems and interpretations to be addressed (Snodgrass). (Pritchard, Snodgrass, Tyżlik-Carver, 2018) [7]

Perhaps, in hindsight, termination was never clear, but the time lag, space lag, and material lag of past contexts made it difficult to see past the singular dataset on paper to the algorithmic stream across material. While not situated within the origin of algorithm nor the origin of executing them, an account of Bill Gosper, situated at the origin of hacking as we know it, has him as the literal embodiment of the subtle non-terminating nature of algorithm in the age of a single individual using pencil and paper. This account almost whispers of Snow Crash Stephensonian futures, as the human and computer divide is bridged via language mutation.

The -P Convention:

Turning a word into a question by appending the syllable ‘P’; from the LISP convention of appending the letter ‘P’ to denote a predicate (a boolean-valued function). The question should expect a yes/no answer, though it needn't.

[Once, when we were at a Japanese restaurant, Bill Gosper wanted to know whether someone would like to share with him a two-person-sized bowl of soup. His inquiry was: “Split-p soup?” — GLS] (Raymond, 1991)

While the paper and pencil definition of algorithms is framed towards the singular human (or small groups of white American and British men), computational material is seemingly everything and nothing, affecting - and being - some of the most fundamental systems of Earth. Even now with air-gapped systems that mimic the sterile rooms of early Bletchley Park or early Gosper, we are reminded of the interconnected nature of executions within the real world. Not just Japanese soup, but Japanese Earth, Japanese bodies, and Japanese hands.

And yet, this is not just about Japan. Inputs are not from lone specific sets. Outputs are less relationally mapped. The requirements (a power source or connection) for computing agents are not valid inputs for an algorithm; its material waste is not a relevant output for an algorithm. To use the concept of algorithm might be quantitatively useful, but using it ignores the impact and embeddedness of algorithm in the world of things. In the purely abstract world of the concept of algorithm, the natural executing system of weather was not an interlinked input to the air-gapped algorithmic systems of the Fukushima Daiichi Nuclear Power Plant. Fukushima itself was not an interlinked input to the factory assembly lines creating the electronic computers that are usually the boundaries for algorithms. To complete the circle, the electronic computers themselves, once deemed no longer algorithmically useful in western markets, are, along with other forms of waste, then definitely not the input for creating mutations (like global warming) in the natural executing system of weather.

What does the concept of algorithm allow, in the larger sense, compared to what a concept like reactive process allows? On a high level, concepts like algorithm enable the evasion of responsibility for anything that’s not on paper. They support corporate and individual systems of deferring accountability by foundationally stating that the time-space-material of algorithm is strictly on paper. There is no responsibility for the outside world. The concept also upholds a human-centric view of command and control of the larger system - supporting the view that only humans (and their abstractions) have agency while other systems decidedly do not.

A term like 'reactive process' has a sense of ecology, putting the environment - the system - at the center, not a singular agent within the system. Perhaps this is also the hardest part about not using algorithm. The human and its abstractions would no longer be the exception with regard to control. To no longer use the concept of algorithm is to say that either everything has agency or nothing does. By extension, either everything has responsibility or nothing does. Maybe reactive process wouldn’t reestablish accountability at all, but destroy it - and at the same time destroy the then-obsolete practices of deferral, black-boxing, and reputation management so common within current algorithmic executions. Regardless of how we would evaluate and react to such a change, it would perhaps at the least give us a more realistic framework for understanding and changing the current societal boundaries.

Organizations which design systems... are constrained to produce designs which are copies of the communication structures of these organizations. (Conway, 1968). 

We need to change how we see computational practices - a change that is not just technological but also societal. The use of algorithm has no personal social impact within the current culture if you do not know who, when, where, why, and what it is. A lexical change could be the first step in joining the individual aims of computational literacy with the community means of intersectional analysis - as a disconnect exists.

In the US, Black women’s participation with the digital is frequently evinced in neoliberal preoccupations with learning to code… these projects are largely an individualized, privatized approach to thinking about Black women’s empowerment, in neoliberal fashion… many African American digital technology projects are disconnected in their context, content, and intent from the materiality of ICT processes in the Black/African diaspora. (Noble, 2016)

Terminating the Term

If the term algorithm is no longer useful in the contemporary speculative practices of computational execution, what term or concept should artists use in its place? Reactive process? Should a new term be speculatively docked for creation and standardization with/in this context? Does using a new term matter?

Terms matter. A change in terms can, in an Orwellian sense, obscure the understanding of a phenomenon, but it can also bring clarity to the understanding of a phenomenon - both the general and the specific. For example, the eventual change from the term 'shell shock' to the term Post-Traumatic Stress Disorder (PTSD) brought clarity to the nature of trauma on the human body. While PTSD, as a string of words, is arguably more vague than 'shell shock', it is more precise when addressing the effect of trauma in terms of time (not terminating with the initial steps of explosion), in terms of space (not physically dependent on being within the blast radius of a shell), and in terms of material (not only occurring relationally between shells and soldiers). The abstract - precise, not vague - concept of PTSD allows us to see the underlying structure of trauma and the intersectional parallel expressions of trauma across all human forms - in settings of war, abuse, neglect, corruption, and tragedy. [8]

As with PTSD, a new term like 'reactive process' for what we know as algorithm could create the risk of vagueness or misunderstanding as a string of words, but alternatively it could offer a more precise description of current computational execution practices. It could also more easily reveal the effects of computational time-space-material trauma with/in our larger system. This is not to say that a new term should actually be used, or that if it were, artists would create ‘algorithms’ differently, but it is to suggest that with the omission of algorithm artists could more accurately see execution. After all, if working within the concept of algorithm is playing within the current societal boundaries, then reworking the concept of algorithm is playing with the current societal boundaries.

It is "a joke about the meter," Duchamp glibly noted about this piece, but his premise for it reads like a theorem: "If a straight horizontal thread one meter long falls from a height of one meter onto a horizontal plane twisting as it pleases[it] creates a new image of the unit of length." Duchamp dropped three threads one meter long from the height of one meter onto three stretched canvases. The threads were then adhered to the canvases to preserve the random curves they assumed upon landing. The canvases were cut along the threads' profiles, creating a template of their curves creating new units of measure that retain the length of the meter but undermine its rational basis. (Dreier Bequest, 2006)

This notion is, of course, not new. It seemingly resonates with particular artistic practices of the past that speculatively questioned the boundaries of the medium. Perhaps sculptor Marcel Duchamp’s 1913 piece Three Standard Stoppages (pictured at the top) is something of a reference for contemporary computational artistic practices. It disregarded the quantitative measuring systems of the physical practice he worked with/in - perhaps we should do the same.

In Conclusion... What are the Inputs and Outputs?

 


An Informational Road Map, Luciano Floridi (2015)

In the realm of non-termination perhaps there is no traditional conclusion to be had, but an infinite flow of inputs and outputs within ‘reactive processes’. A question outside the scope of this essay, but too important not to note, regards that infinite flow: what are those inputs and outputs? Perhaps, if we begin to see our computational structures differently, we could by extension see information and data differently. Playing with or acknowledging a nonterminating computational method like a reactive process, which continually interacts with its environment (Knuth, 1968), might point to the notion of environmental data, which may have no semantics at all and instead “may consist of (networks or patterns of) correlated data understood as mere differences or constraining affordances” (Floridi, 2015). This is an important question for computational artists to think about as our personal practices are increasingly data-centric and public claims of a 'post-truth' society are increasingly raised.

Environmental information: Two systems a and b are coupled in such a way that a's being (of type, or in state) F is correlated to b being (of type, or in state) G, thus carrying for the information agent the information that b is G. (Floridi, 2015)
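A toy rendering of that definition in Python, with all names my own hypothetical choices (tree rings standing in for system a, the tree's age for system b): observing a's state carries the information that b is in the correlated state, through mere correlation rather than semantics.

    # Environmental information as pure correlation, sketched in Python.
    # All names are hypothetical. System a: the number of rings in a trunk.
    # System b: the age of the tree. The coupling: one ring grows per year.
    def state_of_a(age_of_b: int) -> int:
        return age_of_b                   # a's being F is correlated to b's being G

    def agent_reads(observed_rings: int) -> int:
        # The agent observes a and thereby carries the information about b,
        # without the rings 'meaning' anything - mere correlated differences.
        return observed_rings

    assert agent_reads(state_of_a(42)) == 42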

If the notion "all models are wrong, but some are useful" (Box, 1987) stands valid, then perhaps we can conclude that the same notion can be applied to non-terminating concepts of being. This model, useful at least temporarily, can help us articulate, shape, and understand our perceptions of and engagement with reality (and with our artistic practices) until the next model takes form.

 

Notes

[1]. “Since the early 1960's, computers have been used even more often for problems in which numbers only occur by coincidence; the computer's decision making capabilities are being used, rather than its ability to do arithmetic” (Knuth, 1968).

[2]. An example definition from the Jargon File shows that 'program' has three definitions in contrast to the zero definitions for algorithm.

program: n.

    1. A magic spell cast over a computer allowing it to turn one's input into error messages.

    2. An exercise in experimental epistemology.

    3. A form of art, ostensibly intended for the instruction of computers, which is nevertheless almost inevitably a failure if other programmers can't understand it.

One of the few, if not only, uses of the word algorithm in the entirety of the Jargon File is in relation to the concept of "voodoo programming".

voodoo programming: n. [from George Bush Sr.'s “voodoo economics”]

1. The use by guess or cookbook of an obscure or hairy system, feature, or algorithm that one does not truly understand. The implication is that the technique may not work, and if it doesn't, one will never know why. Almost synonymous with black magic, except that black magic typically isn't documented and nobody understands it.

2. Things programmers do that they know shouldn't work but they try anyway, and which sometimes actually work, such as recompiling everything.

[3.1, 3.2] The 1980s/90s marked a turn in local and global connectivity through the birth of the World Wide Web. The further advent of additional networking protocols [3.2] associated with the web, which change our experiences of time, has mimicked other structures, as pointed out by Yuk Hui in the preface of Executing Practices. Hui notes that structures of time seem to mimic the material and economic conditions they exist in and shape: the 19th century was largely linear structures within mechanical industrialization, the 20th century evolved to recursive non-linear structures with digital computing, and the 21st century has a different structure altogether.

[4] Knuth himself could not help but add in his own humorous assertion that "at first glance it may look as though someone intended to write 'logarithm' but jumbled up the first four letters."

[5] A visit to Bletchley Park gives a glimpse into the working conditions, hours, and way of life of the numerous unnamed women who operated bombe machines around the clock for years on end.

[6] An in-depth account can be found in Nakamura, L. (2014) Indigenous Circuits: Navajo Women and the Racialization of Early Electronic Manufacture. American Quarterly Volume 66, Number 4, December 2014

[7] I couldn't help but reference a reference in a mutationous form

[8] An in-depth account can be found in van der Kolk, B. (2014) The Body Keeps the Score: Brain, Mind, and Body in the Healing of Trauma. Viking Press

References

Box, G.E.P. & Draper, N.R. (1987). Empirical Model-Building and Response Surfaces. New York: Wiley.

Conway, M.E. (1968). How Do Committees Invent? Datamation 14(5): 28–31.

Dreier Bequest, K.S. (2006). MoMA online - Marcel Duchamp - Three Standard Stoppages. Available from: https://www.moma.org/collection/works/78990 [Accessed 15 March 2018]

Floridi, L. (2015). Semantic Conceptions of Information. Stanford Encyclopedia of Philosophy. Available from: https://plato.stanford.edu/entries/information-semantic/#4.2 [Accessed 27 March 2018]

Fuller, M. (2008). Software Studies: A Lexicon. Cambridge, Massachusetts: The MIT Press.

Knuth, D. (1968). The Art of Computer Programming, Volume 1: Fundamental Algorithms. Reading, MA: Addison-Wesley.

Noble, S.U. (2016). A Future for Intersectional Black Feminist Technology Studies. The Scholar & Feminist Online. 14.1 (13.3). Available from: http://sfonline.barnard.edu/traversing-technologies/safiya-umoja-noble-a-future-for-intersectional-black-feminist-technology-studies/0/ [Accessed 27 March 2018]

Pritchard, H., Snodgrass, E. & Tyżlik-Carver, M. (eds.) (2018). Executing Practices. Open Humanities Press.

Raymond, E.S. (1991). The on-line hacker Jargon File, version 4.4.8. Available from: http://www.catb.org/jargon/ [Accessed 27 March 2018]

Victor, B. (2013). Media for Thinking the Unthinkable. Available from: https://vimeo.com/67076984