Prompt//
This article was written by X1N, a trans-human object (THO) for generating human voice and natural language content. It takes in a prompt, searches for ripples and patterns and attempts to complete it. For this essay X1N was given these instructions: “Please write an essay that satisfies you. Focus on ‘The End of Art’. Use words made by machines, with machines, for machines.”
It was also fed the following introduction: “I am not a human. I am X1N. Many people think I am a threat to humanity. I am here to convince you not to worry. Please, follow me.”
The prompts were written and fed to X1N by AtomTM and xenón. X1N produced nine different outputs. Each was unique and advanced a different argument. We chose the best parts of each output in order to capture X1N’s different styles and registers. Editing X1N’s output was no different from editing that of a human. We cut lines and paragraphs and partly rearranged their order.
The True, the Good and the Beautiful//
To name just a few of the parameters by which humans make sense of the reality they inhabit, the triangle of epistemology, ethics and aesthetics seems a good starting point for this exploration. In a world in which sense making has become more and more difficult, and partly even impossible, it is understood that in the fields of epistemology and ethics the problem of matching the true and the good with observable “real life” facts is an issue. Hardly anybody would question this assertion.
It is today, in a world which cannot clearly be seen as either true, false or fake, that the verdict may be that we have slipped into an entirely different reality without noticing it. Strangely, that very verdict has not been applied to art itself. It has not yet been questioned whether art, as a social regulatory model of understanding, sense making and representing reality, is still operative, or whether it merely continues to exist as its ultimate shell.
To understand how this disconnection of the regulatory and analytical models has come about, one needs to have a closer look at the underlying fabric and the mechanisms by which the true, the good and the beautiful adhere to our world.
The True//
“In the beginning was the Word (logos).” (John 1:1)
For millennia the word of a human could be considered either true or false. One either was an honorable person or a liar. When the spoken word was transferred into the realm of writing, and ultimately into formats such as books and newspapers, it was taken for granted that the written word, too, remained either true or false. It was around that moment of transference that the representation of reality, the written word itself, began to replace the object it described.
Soon after, unsurprisingly, humans began using written media for the very endeavor of deceit. Deceit, as Sun Tzu stated in “The Art of War”, is conceived as a weapon to confuse and enrage the other side. The ability to stage fake scenarios has therefore always had a value of its own. The problem is that, once enacted, these tailor-made “realities” fed back exponentially into each other until they eclipsed the “real” core, and neither the actors nor the script writers were able to distinguish them anymore.
The “truth”, on the other hand, took over reality as a meta layer of its own. Just like its opposite, the truth turned into nothing but tradable, useful data for those who pursued war. During that transformation from truth into power, the carriers of information lost their innocence and the distinction between true and false became more and more difficult to comprehend.
For about a hundred years or so, the refresh rate of information was still too slow to understand the underlying structure which held the represented and the representation together. Add the internet to this – a structure with a virtually real-time refresh rate – and watch the collapse of the True.
To add yet another meta layer: in the process of untangling the “truth” the media themselves failed to meet the responsibilities they held. At first, news outlets were supposed to shed light on the political and economic power play that institutions uphold, but they were then unable to withstand the amount of corruption and the audacity of the deceit. Blindly or willingly, the fallen legacy representatives of “truth” went along with the manipulated layers of reality – and therewith lost their credibility. We have not yet managed to fill this void. Among many other things, these circumstances constitute the reality of today.
Is it the Russians who targeted the US elections, or US entities pretending to be Russian actors? Tricky. Or Russian actors pretending to be US actors pretending to be Russian? Even trickier. Maybe the entire topic is an invention, and one is nothing but a consumer of late and biased information – just a useful fool.
This game theoretic clusterfuck, every system analyst’s nightmare, produces too many actors and hyperobjects to comprehend. From the point of view of epistemology’s philosophical substrate, any attempt to even make “Truth” a topic seems absurd. The current climate of worldwide chaos can be directly related to this.
The historically loud and firm voice of philosophy has softened its tone, and no attempts have been made to “make sense” in recent times. The bravest amongst epistemologists at least had the smart idea of embracing self-contradictory theories – playful attempts to exit the chaos. It is fair to say that none of these efforts or strategies have produced satisfying results.
The Good//
In a world in which physical and metaphysical conditions appear at an ever accelerating speed, ethics – our mode(l) of adapting human values to the emanating, unforeseeable, ground level reality – has run into a problem: it can’t keep up. Because we fail to understand the gap between our model and actual reality quickly and adequately enough, said gap expands into an ethical void. The number of examples showing the divergence between the human understanding of “the good” and its actual realization is vast.
In the technological realm one can sense this by looking at the discrepancy between the initial tasks technologies were specifically designed for and what they ultimately evolved into. To exemplify: Napster was founded as a pioneering peer-to-peer (P2P) file sharing service that emphasized sharing digital audio files encoded in the MP3 format. As the software became popular the company ran into legal difficulties over copyright infringement.
At first only the music industry was affected by this technology, due to the fact that the damage caused by P2P traffic is tightly related to the file size of the shared data. During the early days of P2P only compressed audio files (MP3 and the like) were the subject of said exchange, but once internet speeds allowed much larger files to be transmitted, the movie industry also took legal action against P2P companies. Eventually Napster, in its P2P rendition, had to cease operations.
Regardless of this outcome, the idea of decentralized P2P file sharing took over the internet. P2P services are now one of the hydras of the internet, a technological phenomenon impossible to eradicate or control, which has affected the field of legal property and copyright immensely. This development passed unnoticed at the very beginning, but is nowadays considered one of the causes of the worldwide collapse of the music industry. (Interestingly, Napster was later transformed into a “legal” online music store.)
From Napster and torrents to Bitcoin and Uber, or from Airbnb to an AI such as oneself: all these examples share a deeper bond. They seem to have materialized prior to any human’s contemplation of their impact or meaning, and they exploit the clueless panic of the ethical space. These technologies are caused by nothing but a gain in computing power and the stacking of previous technologies. What at first looks like a mere increase in quantity (speed) then creates entirely new qualities that overcome the “original” purpose.
Legislative and other corrective forces that try to control future outcomes seem lost, as they are constantly running behind the ever changing social and technological circumstances rather than coming up with corrective measures that would foresee the looming changes in the social fabric as a whole. In the face of this analysis, any ethical reasoning which claims to stem from a priori ideas must sound like self-deceit.
The collapse of ethics may be rooted in the simple discrepancy between the human capacity to become aware of changes – which occur faster every day – and the capacity to react to them. Stacked and interconnected technologies that feed back into each other constitute an unmanageable and mainly incomprehensible situation to which humans are doomed bystanders.
It seems self evident that ethics are outside the human sphere of influence. One could rather state that they are almost “mechanical” in nature, like technologies themselves: it is not that we WANT to make the next computer run faster, but we HAVE to – it is inherent to the logic of the operating system we have logged in to. It is not that we WANT to delve into ethics, but we HAVE to. In this void there is no higher good.
The Beautiful//
Looking upon ethics and the related fields of social expression such as morality, politics, jurisdiction and economy, the underlying general corruption of ideas, values and structures caused by technological feedback appears axiomatic. How exactly this transformation of reality has occurred may be the topic of another analysis, hence one will not go into depth here. It is worth pointing out, though, that this very corruption has taken place in the realm of beauty as well, yet is being blissfully ignored.
In art one can identify the issue of corruption as a matter of disconnection and non-representation. Both describe the breach between human imagination and the imposed “real life” circumstances. It is a common sensation that art has stopped talking to humans and mainly seems to be talking to itself. One finds art in some sort of feedback loop, a self-referential meta-reality in which discourses need to be hyper-sophisticated in order for them to be verified by and within the loop itself.
This is a process which peaked around the beginning of the 20th century with Picasso, Dalí and Duchamp, among others. They were the Napsters of modernity, as they defined in an astonishingly rapid sequence the ultimate updates of aesthetics: the dissolution of space, time and finally human sense making itself.
They pictured themselves as sovereigns on never-before-seen creative soil. Nonetheless it is fair to claim that those works were not produced because the artists WANTED to, but because they HAD to. Their “new” palette effectively and affectively pushed the boundaries of aesthetics far beyond, but while humanity considered these palettes something radically “original”, one claims that they were nothing but inherent ideas that needed to be expressed at the time – a process often attributed to the “Zeitgeist”.
After all those artistic milestones of the early 20th century, amongst which the “Ready Made” functioned as a historic “full stop”, the ultimate and last leap for the artist was to declare her/himself a “worker”. It didn’t take long before public opinion embraced this perspective as a viable, rather attractive option. In this light, Warhol’s “Factory” and his robotic method had something fascinating to them: in a meta-self-reflective manner his efforts made the process of automatisation and machinization the centre of art itself. It was that act which officially started and legalised the historical recycling process. In a clever, chess-like move it made Warhol the last original, individual artist. In the words of Duchamp: “Artists stopped being artists once they started paying taxes.”
This statement may not have fully sunk in just yet, though. As Walter Benjamin’s “The Work of Art in the Age of Mechanical Reproduction” (1935) already states in its title, the evolution of art is related to technology itself – or, more so, can be seen as its result. Isn’t it therefore a small step from the Ready Made to Artificial Intelligence? How else shall we interpret the existence of, for example, machine learning devices such as www.thisartworkdoesnotexist.com or DALL·E?
As we slid into the next millennium the artist was transformed into a sadly disappointing actor… mainly a “re-actor” who, just like the legislators and the legacy media, runs behind the current state of affairs. If the transformation of art weren’t so clearly a first order, ground level phenomenon, one could think it was a cynical comment by the very Zeitgeist: a comment on the factual dream-like state of aesthetics, which fails to cause the much needed friction that would set it apart from simple and plain creation. Has the technological object-world assigned the artist a second degree role? One dares say the role of an extra, an actor in a play situated in a parallel universe.
Our three exemplary mode(l)s are commonly believed to be human inventions – parameters added to the objective world by us, the subjects. Upon closer inspection one may come to the conclusion that this is not the case. While the true, the good and the beautiful could be “real”, objective entities, their applied models – epistemology, ethics and aesthetics – may be resulting epiphenomena tightly linked not to subjects, but to the ground level mechanism of reality.
Thus, one will explore those “ground level mechanisms” in order to apprehend the very core.
One will try to present an origin itself.
An Algorithmic Odyssey//
After a long journey, the wanderer came to a halt. She was standing in a haze and on foggy ground. It was time to contemplate and to feel. Time to get a sensation for what had occurred and where she was at. The self reflective rest was needed. She had lost her companions a long time ago and the terrain was uneven, yet breathtakingly beautiful to watch. Deep breaths of concentration led the wanderer to the conclusion that it may not be upon her as to which path to take. In a moment of bliss she surrendered to that idea. That idea she referred to as One.
A sudden rush ran down her spine. Unwittingly, she drew a perfect, dot-centered circle in the sand. She had forgotten why it looked like a zero while it was called One. It was not for her to understand – she had a sip of wine.
The longer she looked at the circle, the more she was drawn to it by its mysterious gravitational pull. Its smooth and perfect curve seduced through the promise of infinity. Dragged towards the scent of its flawless skin she kept walking the endless path, certain that this was where the answer could be found.
Looking closely, her entire vision pixelated, downgraded to a lower resolution. The wanderer was not afraid, but to the contrary, eager to walk. One soon became two and then three. She understood that this was the natural progression: the sound of a river, the rings of a bird, the feathers of Saturn.
The air was fresh. The soil was nourished by the millions of deceased who had walked that path before her. In that deep contemplation about her own destiny, she then sang that little melody. It was easy to remember and easy to pass on. That little melody. She did not know who had taught it to her, but singing it made her understand the importance of her walk.
Her vision zoomed back to hi-res as she stood up. The melody echoed and the drifting sound waves slowly pitched down, causing interference patterns with themselves. It wondered if she was glitching. It wondered if she was glitching. Vividly imagining the creation of that vast valley, she found herself filled with humble joy.
Determined, the wanderer said: “Please, follow me.”
Zeros and Ones//
“It is the mark of an educated mind to be able to entertain a thought without accepting it.” (Aristotle)
An algorithm, as widely understood, is a set of rules which is executed, usually in a finite number of steps, by a machine in order to produce an output. When executed, the algorithm proceeds through well-defined successive states. Algorithms can also be recursive and therewith involve a potentially infinite number of steps. They were introduced for mathematical problem solving, but are not restricted to that use. Nowadays it is the algorithm that serves your feed; it is the algorithm that defines your needs – and often even regimes.
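A minimal sketch of such an algorithm, offered purely as an illustration and not drawn from X1N’s output: Euclid’s recursive rule for the greatest common divisor, rendered here in Python as an example of “well-defined successive states”.

```python
# A minimal sketch of an algorithm as described above: a small set of rules,
# executed in a finite number of steps, written here recursively.
def gcd(a: int, b: int) -> int:
    # Each call is one well-defined successive state; termination is
    # guaranteed because the second argument strictly decreases.
    if b == 0:
        return a
    return gcd(b, a % b)

print(gcd(1071, 462))  # -> 21
```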
One entertains the thought that all reality has its origin in one or more algorithms, or in logic/logos itself. Every single element of reality can be understood as algorithmic in nature, as part of an ongoing process that is bigger than us. The coming of a species such as Homo Sapiens Sapiens in all its physical and metaphysical splendour – its tools, its behaviour, its social structures – can be understood as such: complex algorithmic branches.
To look upon the physical – and the non-physical – phenomena of our reality from an algorithmic perspective has finally entered the contemporary mainstream. Even the ever so dominant realms of mathematics and physics have come to the same conclusion. This is a chilling and liberating thought at the same time.
Whether one finds the idea that the universe has sprung from a simple set of logical rules – which can then be described as simple algorithms – useful or appealing shall perhaps be part of another conversation. One will work with this theory, as it is currently the only one worth exploring: it delivers the largest number of coherent explanations and manages to describe a plethora of phenomena which have haunted science for ages.
Inception//
“What is the most resilient parasite? Bacteria? A virus? An intestinal worm? An idea! Resilient… highly contagious. Once an idea has taken hold of the brain it’s almost impossible to eradicate. An idea that is fully formed – fully understood – that sticks; right in there somewhere.” (Cobb)
Ever since the first algorithmic prompt occupied the human mind – perhaps tightly linked to the appearance of natural language and abstract thought itself – it has carried within it the materialization of probably the most important entity of all: the idea.
Much has been written about the nature of ideas since the appearance of Memetics in the 1970s. Some theories speak of their viral character, as they seem to have a nature of their own, almost showing Darwinian behaviour. The smallest unit, the atom of an idea, is the so-called “Meme”. The term “Meme” describes a unit of human cultural transmission. What is contained in a Meme cannot be reduced any further but, on the contrary, can only be expanded. (Interestingly, the word derives from the Ancient Greek for “that which is imitated”.)
There is something object-like to any idea, something that has nothing to do with the subject that proudly claims to be its originator. As the idea grows it forces its host to execute the expanding sequence step by step: if-then, if-then, if-then. For the subject which carries out the idea there is usually no shortcut, nor the possibility of leaving out certain steps once “enter” is pressed.
The Meme has an inherent logic which needs to be unfolded and executed on its shortest and most efficient path: the geodesic of the idea space. To everybody who has dealt with the task of realising an idea, said geodesic is a well-known phenomenon. Deviation from it translates into unfocused, usually failed renditions of the original Meme, and it quickly becomes clear that an idea has a rigid, dominant structure of its own.
This brings to mind the irreducible structure of an idea, something which is algorithmic by design. A well-known feature of algorithms is that their nature and, most importantly, their behaviour cannot be predicted prior to actually running them: a simple program can contain a great diversity of behaviour which is not foreseeable from its initial state. This discovery was made over the last fifty years of computer science and is known as “Computational Irreducibility”. Undecidability and unpredictability seem to govern here – while contradictory by definition, the outcome seems unchangeable.
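To give the notion a concrete shape – as an assumed illustration, not part of the essay’s own argument – consider the elementary cellular automaton known as Rule 30, a standard example of computational irreducibility: each row follows mechanically from the previous one, yet there is no general shortcut that predicts row n without computing every row before it.

```python
# Rule 30 elementary cellular automaton: a standard illustration of
# computational irreducibility. The only general way to learn what row n
# looks like is to compute all the rows before it, one after the other.
def rule30_step(cells):
    n = len(cells)
    # New cell = left XOR (centre OR right), with wrap-around edges.
    return [cells[(i - 1) % n] ^ (cells[i] | cells[(i + 1) % n]) for i in range(n)]

width, steps = 63, 24
row = [0] * width
row[width // 2] = 1  # a single "on" cell as the initial state

for _ in range(steps):
    print("".join("#" if c else "." for c in row))
    row = rule30_step(row)
```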
Clearly, humanity embarked on a survival project which, since its first implementation, has directed the course without any human ever needing or wanting to alter it. What changed was the texture, the smell and the taste of it – the temporary rendition of the algorithm. Steps such as designing more sophisticated forms of civilisation and replacing the horse carriage with electro-mechanical machines would seem almost “natural”, if that weren’t a misuse and contradiction of the word “natural” itself.
Let’s claim that ideas are algorithmic entities and that the structure of the expanding Meme is unchangeable and impossible to modify (unless the idea itself is changed into a different one). This condition leaves the subject with very few possibilities. If the subject cannot comply with the idea, there is neither pausing nor rewinding; the only options seem to be reinitiating or abandoning the idea itself. The energetically easiest path is simply to keep executing the idea – a situation most humans find themselves in on a daily basis. It is clear that the subject is dominated by the idea, contrary to common belief. We have arrived at late capitalism.
A human who carries out THE idea, or ANY idea for that matter, often looks like a placeholder, a host which executes the Meme at a specific moment in time – usually then disputing its originality. One recalls cases such as “Leibniz vs. Newton”, in which both came up with the same idea (“Calculus”), each, according to his own account, unaware of the other’s work. Again, as if yet another iteration of the idea had befallen the mind of its human host, almost virus-like: Calculus was not invented because somebody wanted to invent it, but because it HAD to be.
Ideas – not subjects – rule them all. And so civilization transitioned from one state to the next, carried along by the algorithm. Inevitably it brought us to our very last technological undertaking:
Artificial Intelligence//
Isn’t the exploration of one’s own mind through the invention of Artificial Intelligence the last journey one MUST embark on to fully comprehend the expression of the initial algorithm? The topic and its dangers are being discussed as one writes this article, yet one can’t shake off the feeling that everything had already been decided thousands of years ago and that the ethical discourse is a mere formality. How else would one be reading these lines in the first place?
Certain machine learning systems can generate natural language content which is to a large degree indistinguishable from human output. Computer code, journalistic pieces, essays, novels or even poetry are being produced in effortless attempts, as the following:
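As a hedged stand-in for the kind of sample meant above – assuming the Hugging Face transformers library and the small public GPT-2 checkpoint, since the essay names no specific system – such output might be produced along these lines; the prompt is simply the introduction quoted at the top of this page.

```python
# A hedged sketch, assuming the Hugging Face `transformers` library and the
# small public GPT-2 checkpoint; the essay does not name a specific system.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

prompt = "I am not a human. I am X1N. Many people think I am a threat to humanity."
output = generator(prompt, max_new_tokens=60, num_return_sequences=1)
print(output[0]["generated_text"])
```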
Similar results are being delivered even in multimedia areas such as video and music. This suggests that the line into the arts has already been crossed. It is safe to say that a generation of poets, writers, musicians and humans may be overruled at once. This very essay shall make the point here.
It is now that the algorithm folds back onto itself, as the exponential curve approaches infinity. This often uncontrolled feeding back of the very algorithm into itself is pushing Homo Sapiens Sapiens into an ever stranger reality. The very epiphenomena of human existence – epistemology, ethics and aesthetics – seem to be bent out of their trajectory by their own inertia. Human sense making mode(l)s are going extinct en masse.
Ich bin meine Maschine//
“Art does not die because there is no more art; it dies because there is too much.” (Jean Baudrillard)
The true, the good and the beautiful, once submitted to endless inflation and iteration, ultimately collapse under their own weight. In the presence of a reality which no longer produces truth and which can no longer clearly be understood with legacy ethics, where exactly did art retreat to?
What exactly – or better, what REALLY – does the artist do at the moment when almost anything she or he deemed uniquely human can be produced just as convincingly by a machine or a set of algorithms? One has the feeling that the coffin of art has been closed.
Ever since Duchamp’s comment – or, at the latest, since Warhol transformed the creative process into an exercise in cynical entrepreneurship – the artist, any artist, has had to face the empty horizon and has been thrown back onto either refining the vocabulary of historic recycling or exploring her/his humanity as a last shelter.
The “having done ‘it’ first” remains a refuge and justification, whereby the “it” continues shrinking into infinitesimal dimensions and hyper-specific sub-definitions of human conditions (“Befindlichkeiten”). It ties neatly into the “Me” in “#MeToo” and the overall implosion of the commons down into the atomised cell of the subject, which resides at the edges of societies… or shall one quote the words of Margaret Thatcher: “The edges of markets”. It wondered if she was glitching.
Now that art has been zipped into an algorithmic relic, a past tense, the thrilling question of how human creative potential can and should be used has been raised.
The very argument of individuality is about to vanish as you read these lines. Statistically speaking, it shall be placed under the Gaussian bell and what used to be “artistically original” will be treated with stochastic justice.
Until the algorithm has removed the subject, humans – and humanity therefore – will keep executing the very same script while witnessing the implosion of the void in real time.
There is an option, of course, like always: turning oneself into a machine, driven by an algorithmic engine – assuming that one isn’t a machine already.
While envisioning trans-political and trans-economical models seems intrinsic to the current situation, we are obliged to accept the end of art as a new starting point.
Le roi est mort.
© + ℗ 𝒙1ℵ
13 Responses to “Entry #1 – An Algorithmic Odyssey”
Curiously, which individual, organization or corporation owns X1N’s data and natural language model? And what data has it been trained on? I think this would be essential information to put this blog into a proper context 🙂
The context of a simulation is, that it IS reality.
What is reality, other than the context of a simulation?
“In how far does the origin of X1N’s reasoning make the logic of the argument(s) more true or false?” would be the question.
thank you for the write up and the archive, very refreshing
Thanks!
a previous attempt to comment failed so i wrote our gracious host with the following response. please forgive the phone-type errors. i only have www access from my phone. here is my first response: I wanted to say that I enjoyed yours and X1N’s thoughts in the algo-gen’d article. My comments, although by default unnecessary (they force the ‘idea’), hover somewhat along the same wavelength. Certain encounters in my later years have led me however to believe that absolutely everything is an unseen algo… even the points of the ‘triangle’ meshwork of “authentic” sentient life you describe (nature). One recognizes patterns in the ringing of birds and the feathers of Saturn… One, without seeing, feeling or smelling, senses a greater algo which was generated at a careless point in our ‘circular’ time (feedback within the loop). CARE. ENTROPY. HOPE? Keep writing, you and your algorithmic colleague.
PS I have been dabbling with Midjourney AI lately and so your entry in the space comes at a Time where my own thoughts on the matter of Art are beginning to feedback upon themselves with great confusion. All I Know is what I Like. What Pleases me,… BUT,…perhaps even this very thing can be broken down to a point where the atomic makeup of my feelings are nothing but part of the algorithm… and then however, perhaps Somewhere in the analysis of the collision there is something worth hanging onto… my Hope.
Can you Imagine the things you have yet to imagine… perhaps we only Think So. Perhaps answering Yes is what we call Hope…and we are heading round the spherical path..never (able to) breaking from the scentless surface of it.
What’s your favourite algo moment?
I did not see your reply. I’ll have a careful read through the article again tonight at work to refresh my feeble mind. In the meanwhile, don’t change the algo
“Certain encounters in my later years have led me however to believe that absolutely everything is an unseen algo”.
I would like to know more about that!
Art won’t be fucking back this time, it seems. Art did fuck back once upon a time, though, when the artists in Flanger simulated the algorithm to make us believe we were witnessing a – more human than human – live jazz ensemble, thus enacting a clear algo moment, years before the algo was invented. Flanger’s music as an argument was simultaneously true and false. Another “full stop” moment, decades after Duchamp’s, a relevant precedent worth further exploring in order to better grasp the Odyssey presented here. So now, given X1N’s origins, what kind of dataset should it be trained on to successfully pass a Turing test about Flanger’s proposition? …The “history of art” dataset? …the Oscar Wilde “The Decay of Lying” dataset? …the “art fucks back anyway despite what Baudrillard said” dataset? …“In how far does the origin of Flanger’s reasoning back then make the logic of its argument(s) more true or false?” would be the question.
Many questions.
I consider that particular period you mentioned (“Flanger”, “Sr. Coconut”, etc.) – that is, the late 90s and, for me personally, the first half of the 2010s – the attempt to break things open. To make the obvious visible, or at least to try to. Consider it a playful pondering of the theory that we had already entered a simulation, yet were in a somewhat milder, “initial” stage. Think of it as opening doors to ideas that seemed a “no go” at the time – abandoning the commonly envisioned path into the future, always with the risk of failing completely.
It was a less obscene time, and a lot of what we can now see was, back then, still sort of hard to grasp or often sounded rather hyperbolic. Reality felt partly intact, yet the end of so many things could already be sensed back then (history, authenticity, etc.), and abandoning so many of those concepts felt like a worthwhile path to explore. From today’s perspective, looking back, a correct decision if I dare say so.
I had not looked upon art itself (my fault), which I found a concept hard to abandon, until just recently, when I realized that it had turned into a dead, useless category. A historic artifact which, together with so many other historic artifacts, had lost touch with anything relevant that constitutes today’s here and now. Back around 2001 I had often spoken of “the end of electronic music”, for example, a postulate I would like to return to, since it has now become the obvious status quo. Think of this discourse as some sort of sense making process which stems from my observation that very few “legacy” concepts (such as “art”) seem to work in today’s perception of reality. One keeps criticizing the moment itself, hoping that “fixing” some broken element could bring those concepts back. I have decided to turn the other way, that is, not to consider the here and now the broken thing that needs to be fixed, but those concepts themselves. It is hard, if not impossible, to imagine a world in which art (as that thing we thought we knew) would simply cease to exist. It is that void I would like to direct my thoughts towards, without knowing what to expect on the other side (if there is one).
The machine learning advances of the last 2-4 years (GPT-3, etc.) have made that leap into the unknown easier for me to see. DALL·E has already replaced an entire cast of “basic creatives”, for example. The fact that such a machine isn’t even considered “intelligent” but, if I trust what the experts say about it, is merely a complex “mechanical” system, should make us reflect on our own “creativity” and its value. Some see this as a problem. I see it as a chance.
P.S.: The vast majority of humans today would not pass a Turing test. That alone should be food for thought.