I've been getting more and more manuscripts written by AI. Here's why I'm alarmed about this. And why you should be too.
They're boring and say nothing.
I want to read and edit something a human wrote. I get that not everyone feels like a strong enough writer to write a book. So. Maybe those people shouldn't be writing books. If you wouldn't write a book without relying 100 percent (or even 50 percent) on ChatGPT, then maybe don't write a book with ChatGPT.
Now, I also get that sometimes writers need a little bit of juice for their creativity. They want to find another way to say something. They have writer's block. They need someone (or something) to help them brainstorm. In the old days, those people would call a friend or fellow writer to ask for ideas. They'd talk it out. Go for a walk to let the ideas flow naturally. These days, they use AI.
Okay. I can understand that. AI as a tool for writing, not a replacement for a writer. That I can, with grumbling, tolerate.
But when someone writes a memoir entirely with AI, how is that serving anyone? I'm halfway through a short one right now, and I can say that I know nothing about the author. It's all just word salad. Perfectly constructed and punctuated. But it has no soul. It's just a bunch of pretty words that a program poured out. They say absolutely nothing meaningful.
I'm bored out of my mind.
They have no soul.
That's because computers have no soul. I'd rather edit a manuscript that's messy and human and has something to say. I want to know a writer's ideas, thoughts, feelings, stories. And readers do too. If you, as a writer, think it's more important to give your editor a perfectly punctuated ChatGPT manuscript than a messy one that's real, stop worrying about that. Give us, and your readers, something that came out of your own imagination. We can—and want to—work with that.
They're not original. And people can tell.
Don't think you're fooling anyone. After working on a number of AI-assisted manuscripts, I can tell which parts AI wrote and which parts a person wrote. AI uses the same vocabulary over and over again. (If you see "it's a testament to" a lot, it's probably AI.) I can also see the skeleton under the fluff. AI uses the same series of sentence structures over and over again. It also produces copy with an unnecessarily high reading level. And for some reason, it tends to summarize content and tell the reader what they just read—and what they're about to read.
Over time, book buyers are going to avoid them.
People tend to flock to novelty. But when the novelty doesn't offer anything, people drift away. As book buyers come to see what I'm seeing (AI-written books are boring, have no soul, say nothing that resonates with a human, and are predictable), they're going to stop buying them. And if no one buys them, writers will stop churning them out. At least I hope that happens.
Writing is an art and a craft. It does and should take skill, effort, practice, talent, and hard work. Because that's what it takes to be human.
Obviously, I'm not the only one who feels this way:
https://aibusiness.com/responsible-ai/the-ai-generated-books-phenomenon-is-getting-worse#close-modal