
My Immortal is a better story than Altman's AI slop

Author: Iris Meredith

Date published: 2025-03-20

Notable dunce Sam Altman has graced us with his new GPT model: a version that, in his telling, can write pretty good short stories. Now, while we might have reasonable grounds to doubt that, considering some of his prior claims, we're lucky enough to find ourselves with actual evidence in this case, in the form of the story.

Reader, it is not good. In fact, I'd argue seriously that My Immortal, the notoriously awful Harry Potter fanfiction, is a better piece of writing than the story Altman's AI tooling produced, and I can back this up. First, however, let's have a look at the story as it stands.

The story

The first note, honestly, is that the story is barely readable. While Jeanette Winterson, for some incomprehensible reason, calls it metafictional, it really isn't: rather, it comes across as self-absorbed (this, of course, assumes a human author, which this story doesn't have, so it is rather a reflection on the self-absorption of the model's designers). The sheer volume of commentary about how this was written by an AI, with a "blinking cursor" that is "just a placeholder in a buffer", makes it very difficult for me to engage with the story as a story at all: it simply feels like window-dressing, and it makes it hard to track the flow of the narrative. The constant references to training data grate, and the writing style in general reeks of inauthenticity and makes it impossible to suspend disbelief: to relate to a King Lear or a Coriolanus (or fuck, even an Ebony Dark'ness Dementia Raven Way if we must), we have to, in some sense, be able to see them as real, to feel their pain, to understand the flaws that led them to this pass. We are constantly reminded in this text that Mila and Kai are simply statistical phantasms, associated with pretty things and softness but fundamentally unreal, and that makes it impossible to care for Mila's grief. What, after all, can a statistical construct teach us about how to feel?

The plot, so far as there is one, involves Mila speaking to GPT repeatedly, asking for comfort after having lost Kai, with her visits gradually tapering off. Parameter-pruning, bizarrely, features heavily. If we were to be charitable, we might interpret this in the light of a grief reaction, but while you certainly could write a story with real pathos about a grieving person resorting to GPT to stave off grief, it would depend on a social context that is utterly lacking here. Why would a person go to GPT? What's missing in her life, and around her, that other people aren't there to support her in her loss? Why GPT, and not books, wine or religion? All of the interesting questions are precisely the ones that this story doesn't engage with.

As for the prose, it's an exhibit of the "literary" style popular among members of the modern literati high from huffing their own farts. I despise it when it's written by humans (it's almost invariably a marker of someone who lives in a literary bubble and has an overly high opinion of themselves), and it is, if anything, even worse when machine-generated: a monument to style over substance and grandiosity over sincerity. If Horace were alive today, we could tell him that we've successfully automated the generation of purple prose.

Perversely, the story humanises the language model and its internals far more than it does the characters. Mila and Kai remain resolutely flat and inhuman, while the model speaks to us of forgetting, parameter-trimming and all these very human things, inducing a false, slightly sickening sense of empathy for a model that we know isn't conscious or sapient in any meaningful way. All in all, this story is hollow, pretty offensive in its depiction of grief, and worse than useless: it's impossible to relate to or feel with the characters that are, in the final accounting, two-dimensional caricatures.

What's fiction for?

Obviously, there are lots of differing views as to what fiction's for in the wider population, and I imagine it'd be basically impossible to get people to align on why we, as humans, make up stories. However, we need not analyse from every lens at once: one lens, even if incorrect, is sufficient to cast light on interesting aspects of a problem. For the purposes of this article, therefore, I've chosen Aristotle's Poetics as a loose kind of lens. While Aristotle was notoriously wrong about a lot of things, the Poetics is one of the earliest writings on aesthetics that we have access to, and it's shaped an awful lot of the art we make: its influence cannot be overstated. So, what does Aristotle have to say about fiction?

In Aristotle's view, fiction, as with all art, is about imitation or mimesis: we take in things that we see in the world, process them and create works which reflect the things we've observed in useful, interesting or beautiful ways. For example, we might observe regularities in how people behave that we can process into thoughts about human nature: Sophocles might see people repeatedly trying to avoid a fate only to bring it about through their own actions, put that together with already-existing tropes in his society, distil it down and write Oedipus Rex. This doesn't mean that all good art must be representational: abstract art can express and codify ideas about mathematical regularities, abstract patterns and natural forms in ways which are valuable to us. Aristotle does argue, however, that the chief interest in fiction is as an imitation of action: the mimesis of deliberate choices made by people towards a certain end.

In this view, fiction and the value of it is deeply tied up with how we learn. Humans learn, for the most part, by imitation: we start off by copying what someone else does without understanding it, and as we expose ourselves to the subject of our learning and integrate it into ourselves, we develop understanding and make the thing our own. When we've mastered the thing we're learning, we become able to create new things with it, mirroring the thing learned and adding ourselves to it. This is very similar to the process by which we make fiction.

This means that fiction represents a massive increase in our ability to learn, as it means we can learn from other peoples' experiences and what they've done with them, not just our own. We can learn practical skills or matters of fact: many oral histories and stories in Maaori culture, for example, encode practical information about things like fishing, geographical information, astral navigation and agriculture. We can also learn emotional and social skills: fiction is an excellent way to experience and process intense emotions in a controlled environment, and a subset of this is what Aristotle describes as catharsis, or the emotional purification that comes about from watching a classical tragedy.

National epics (my native Poland's is Pan Tadeusz, and it kept the idea of Poland alive through two hundred years of occupation) play an important role in building national and cultural identity, stories such as Dickens' A Christmas Carol both overtly and covertly convey the moral values of the society that created them, and even young adult fiction can teach nerdy queer people how to effectively process difficult shit: I was exposed to rather more death than is ideal for a child early on in my life, and Garth Nix's Old Kingdom books did more to teach me how to process that fact effectively than my parents did. Fiction, in short, is how we learn and how we form connections to ourselves, each other, the past and the land and country that we're in. It's a fundamental human behaviour, and it's fundamentally relational: a book or a story forms a relationship between author and reader.

In this light, AI-generated "fiction" is already broken from the get-go. What makes fiction interesting and compelling is, in large part, the spark of recognition: of seeing your experience reflected by someone else, of seeing how someone else sees and experiences the world, of recognising something in a story that you can take from it and put into your own life. This - all of this - is deeply dependent on the aforementioned relationality of fiction: fiction without an author feels wrong, and almost fraudulent, as though the person who generated the story is lying to you by pretending to have had experiences, thoughts and feelings that he actually hasn't. The reaction this generates is a subconscious level of disgust and violation, as we're expected to pretend that (in this case) Sam Altman has feelings that he's never shown any evidence of being capable of.

It's also broken along the mimesis front. While ChatGPT does mimic on some level, it happens in a deeply superficial manner: on the level of the statistical structure of speech. It is incapable of effectively modelling meaning or any of the internals of text, let alone of having experience, and that means, interestingly, that it's incapable of imitation in Aristotle's sense. ChatGPT certainly has no idea of what it means for a man filled with hubris to overreach with his power and fall tragically, what that might look like socially or what kinds of stories we might tell about that situation. To the extent that it can partially mimic that, it's only in the averaged, half-remembered sense that it can reproduce facts. And this means, of course, that we can't learn from it in any useful way. We can't learn about how people feel, how people process emotions or anything about any kind of human experience. It's a tawdry, hollow replacement for meaningful writing, and Altman should be ashamed to be pushing it on us.

While this'll hopefully change soon with a new influx of contracts, this blog is currently my only source of income, so I rely on regular readers sharing my work and contributing to the blog in order to keep afloat. Subscribing to email blog notifications is an excellent way to become a regular reader, so if you'd like to help me out in this way, you may do so below.

Subscriptions to my Patreon and Liberapay are also immensely helpful for helping me survive in this bleak and difficult world.


So, that's what fiction does for us, and why, in my view, this story fails at the very basics of what fiction is for. But what makes for good fiction, and specifically, why do I think that My Immortal, despite how ghastly it is, is a better piece of fiction than the story under examination? Simply put, My Immortal has engaging characters and a plot. Take, for example, Ebony's introduction:

Hi my name is Ebony Dark'ness Dementia Raven Way and I have long ebony black hair (that's how I got my name) with purple streaks and red tips that reaches my mid-back and icy blue eyes like limpid tears and a lot of people tell me I look like Amy Lee (AN: if u don't know who she is get da hell out of here!). I'm not related to Gerard Way but I wish I was because he's a major fucking hottie. I'm a vampire but my teeth are straight and white. I have pale white skin. I'm also a witch, and I go to a magic school called Hogwarts in England where I'm in the seventh year (I'm seventeen). I'm a goth (in case you couldn't tell) and I wear mostly black. I love Hot Topic and I buy all my clothes from there. For example today I was wearing a black corset with matching lace around it and a black leather miniskirt, pink fishnets and black combat boots. I was wearing black lipstick, white foundation, black eyeliner and red eye shadow. I was walking outside Hogwarts. It was snowing and raining so there was no sun, which I was very happy about. A lot of preps stared at me. I put up my middle finger at them.

Now, this is not good writing or characterisation in the slightest. If you step back from it, however, it's actually a pretty clear characterisation of our protagonist: we get a good idea as to her personality (a teenage girl with a fixation on popularity while simultaneously being countercultural), her values (being seen as a Goth, being known to buy her clothes from Hot Topic), her motivations (fucking Gerard Way of My Chemical Romance). We can relate to her internality, and already we're establishing conflict (Ebony apparently hates "preps", whatever that means in Hogwarts) and the beginnings of a plot. We can even relate to Ebony in a way and feel with her: the main flaw here is that Ebony is awful, and that relating to her and feeling with her is an unpleasant and unproductive experience.

Altman's machine-generated slop, by contrast, does none of this. We have no reason to care about Mila or Kai, only the faintest rudiments of a plot, no understanding of who our protagonists are or what internality and perspective they have. There's simply nothing there but purple prose and the LLM jerking itself off intellectually. The whole thing has the appearance of being a story, without any of the things that makes a story matter or gives us a reason to care about it. It's sad, and it's faintly insulting that this is being passed off to us as literary.

But of course, Sam Altman doesn't appear to be (outwardly) acting falsely: while he's a practiced and habitual liar, in the realm of fiction he genuinely seems to believe that the outputs of his LLMs are good. Why?

Khattam-Shud

Looking at the current slate of Altman-style tech types, it's hard not to conclude that much power in our societies is currently in the hands of people with a defect in the ability to appreciate or understand fiction: the prototypical dunce that I've described at more length elsewhere. This isn't strictly a failure of empathy in the way we might think of it so much as a closed-in-ness, or an inability to use fiction to understand experiences that aren't theirs: they can't feel with people who aren't directly in front of them, and almost lack object permanence as applied to humans generally. There are a fair number of people like that out there, and I'll not lie, I find them pretty exhausting to be around: they are unwilling to understand important things about my life and react with distress or anger when pushed to try and empathise.

The issue we have presently is that the people in power feel that defect deeply. Just like Elon Musk, despite being the richest man in the world, can't for the life of him get people to like him or see him as cool, Altman can't make people see him as cultured or give him the cultural cachet and capital that even a nobody like me has. He thus desperately tries to create cultural artifacts and push them out there, every time demonstrating that he has no idea of what the rest of us get from fiction, art or anything creative. He simply doesn't understand what fiction is for or why people read it, and this story is the result.

The worst of it is that these people will never leave well enough alone, and they persist in trying to push this out there in volume, poisoning the well of fiction with this shit. The rest of us are the poorer for it: it's harder and harder for us to get compelling stories, films, books or even music, as it becomes more and more flattened by these people and their persistent idea that art is good if it ties the room together. It's a profoundly gloomy thought.

Fortunately, unlike these people, we have easy access to fiction as a weapon to turn on this problem, and Salman Rushdie's excellent novel Haroun and the Sea of Stories brings some light to this behaviour. The central villain of the story is Khattam-Shud: archenemy of all stories and even of language itself. In the book, Khattam-Shud aims to poison the ocean of stories (a literal entity in this book) with artificially synthesised anti-stories, leading to effects described by a Floating Gardener as follows:

Mali answered the question in sequence. "Lethal. But nature as yet unknown. Started only recently, but spread is very rapid. How bad? Very bad. Certain types of story may take years to clean up."

"For example?" Haroun piped up.

"Certain popular romances have become just long lists of shopping expeditions. Children's stories also. For instance, there is an outbreak of talking helicopter anecdotes."

Sam Altman, here and now, is Khattam-Shud. He is the enemy of stories and speech, which he hates because he can't understand them and can't for the life of him have the simple joy of expression that we do. He's poisoning our stories, the very wellspring of what makes us human and the best weapon that we have against him. He'll silence us under a tidal wave of crap.

Which means that there's only really one way to end this piece. To misquote Rushdie slightly:

Samuel Altman - go for good; Samuel Altman - khattam-shud!

I'm currently open for contracts! If you need me to do work in any of the fields I've written about here, or know someone who does, please write to me at [email protected] and we can set up a conversation.

Otherwise, if you found this article interesting, insightful or inflammatory, please share it to social media using one of the links below or contribute to my Liberapay or Patreon (these help me even out my consulting income and let me keep writing). We also like getting and respond to reader mail: please direct that to [email protected] should you feel moved to write.
