Opinion: As a writer I don’t despair about AI — it can’t replicate our imaginations

A mobile phone in front of a computer screen displaying output from ChatGPT. (Michael Dwyer / Associated Press)

When I first encountered generative AI at a law seminar two years ago, I had the passing thought, as a novelist and critic, that it could actually be good for fiction written by humans — at least the kind of fiction I like.

Just think: Machines would potentially standardize prose in a contemporary fiction landscape that already tends to value minimalist writing, and the ease with which robots might reproduce simple sentences and formulaic plots would mean that the stylists, the eccentrics, the strangely passionate human authors would finally stand out. This optimism was quickly defeated by the realization that one of my day jobs, as a ghostwriter, would almost certainly become automated.

At my mother’s house that evening here in Northern California, I ranted about this development in a society that already devalues art and the printed word. I preached about how ridiculous it was that artificial intelligence companies and startups were creating meaningless applications that reinforce layers of racial bias and leave a heavy carbon footprint, rather than investing that money and time into, say, solutions to mass incarceration and climate change.

My mother has had a long, successful career in tech and remains involved in the field of AI. She’s one of the wisest and most moral people I know. Unfazed by my argument, she said, thoughtfully, “Yes, ChatGPT will probably change things.”

“It will put writers out of business,” I said, and continued talking about how precarious living in America already is for most people whose lives center on writing.

“But machines will always need people to control them,” she explained. “The job will shift a little to be more like editing and supervising machines. Maybe 60% of people will go into those jobs instead.” Why couldn’t you be one of those people? she asked, saying people have always found a way to do new types of work when old jobs are gone.

I originally thought of generative AI as a threat to writers with a more generic or untextured style — people writing for businesses. But after I thought about what my mother said, the shift seemed part of a larger, timeless conversation about the new crushing the old and leaving behind those who didn’t simply go along. A larger fear began to gnaw at me. As writing looks increasingly sanded-down and generic — the way logging and other older types of work look to many of us now: repetitive, unstimulating and rote, better for machines than for humans to do — the expectations people, including publishing gatekeepers, have around literary writing will change, too.

Right now, the publishing world is experimenting with how AI performs, not only at “writing” but at marketing, distribution and the production of audiobooks with synthetic voices. Most writers I know who have been exposed to AI tools feel anxiety about losing potential streams of income. Authors have filed copyright infringement lawsuits against OpenAI for training ChatGPT on their books. But when I interviewed Vauhini Vara, a tech reporter and Pulitzer Prize finalist for fiction, she said she’d produced one of her best passages of writing while using AI as a tool to write about her sister, who had passed away. The literary agent Andrew Wylie has wholly dismissed the effect of AI on the literary authors he represents, claiming that only bad, popular books are susceptible to replication.

I don’t believe popular writing equates to “bad writing” — usually popular writing features transparent prose that is more straightforward for the reader to interpret — but in terms of a unique style being difficult to replicate, I’m with Wylie. Literature is typically authored by people who are receiving the world with a rare openness and finding their own voice, their own breaths and their own hard stops. It’s slow.

Take the complexity in sentences from the first page of Toni Morrison’s “Jazz,” which sets up so much: “When the woman, her name is Violet, went to the funeral to see the girl and to cut her dead face they threw her to the floor and out of the church. She ran, then, through all that snow, and when she got back to her apartment she took the birds from their cages and set them out the windows to freeze or fly, including the parrot that said, ‘I love you.’”

Morrison could have approached this prosaically, more like the way ChatGPT would, something like: “When Violet went to the funeral to cut the dead girl’s face, they threw her out of the church. It was snowing outside. When she returned to her apartment, she freed a parrot who said, ‘I love you.’” Instead, Morrison pulls off a high-wire act. We traverse the unexpected in images tinged with magic — from funerals to parrots to declarations of love.

An algorithm trained on well-known books can’t find what’s both moving and surprising the way Morrison does. But it will quickly figure out three-act structure, cliffhanger chapter endings and which events will titillate readers enough to hold their attention. Literature doesn’t work solely to satisfy a reader’s prefabricated idea of how they should be made to feel. As Elena Ferrante said in a Paris Review interview, “Literature that indulges the tastes of the reader is a degraded literature. My goal is to disappoint the usual expectations and inspire new ones.”

One argument, however, goes like this: Like AI, a writer’s own consciousness is trained on past material, including past books. If a robot, like a human in a vast library, were trained on a wide range of literature’s finest, it’s conceivable that we’d wind up with knock-off Morrison or Ferrante books the way we have knock-off Prada. Still, the human desire for authenticity, for the rare and the real, might balance that out.

More worrying, perhaps, is that we could see a new ocean of robotic titles that mimic books written by the dominant group of people with roughly similar experiences, swamping the viewpoints of writers from underrepresented backgrounds. Generative AI, unchecked by the needs of humans, could become saturated with mechanistic, unsensual and conservative literatures. It could gravitate to what’s already been done frequently through human history — the cruel bigotry, the brutal wars, the crushing oppression — rather than to what we might otherwise imagine in languages of our own.

Perhaps loggers and potters from a past age also expressed talent and nuance in their work. I hope literature doesn’t meet the same fate. I think of writing my last novel, a family saga set in Silicon Valley. In the still hours before dawn, I tried to excavate an initial, shallow thought about a fictional technology that allows users to experience others’ memories, in order to eventually express a deeper thought about the aggressive forward movement of technology and how it can misrepresent memory. The time spent imagining the repercussions of that thought and rendering it in the particular language it needed was more valuable to me, as a human, than the quick production of formulaic work.

Anita Felicelli is a novelist and serves on the board of the National Book Critics Circle.
