Are you a centaur or a cyborg? That might seem like an odd question, but a study conducted in partnership with Harvard University found that people who utilize AI in their businesses fall into one of those two categories.
A centaur is someone who has clearly defined boundaries about what the human’s job is and what the computer’s job should be. They allow each to work according to their strengths.
A cyborg, on the other hand, muddies the waters, allowing the lines between the AI’s job and the human’s to blur.
Let me share something else with you. A study by Boston Consulting Group (I couldn’t find a link), related to the one done with Harvard, showed that overall AI increased the creative output of 90% of their consultants by 40%. On the surface that sounds amazing, doesn’t it? But it also reminds us that 10% of the people were MORE creative than AI, and it found that the output was less diverse, less creative, and often wrong. It was…just output.
As authors, and heck, as humans, it’s easy to fall into a knee-jerk reaction that says all AI is bad or all AI will change the world. First of all, AI consumes a tremendous amount of resources, especially water, so looking at how, and when, it is used is the responsible thing to do. It’s also important to work within your comfort zone. Personally, I tried AI for the first time a couple of weeks ago, asking it to rewrite some social media posts, because that’s a tough job for me. Sometimes I was pleased with the output, sometimes I wasn’t. Most of the time I tweaked what was written, but it at least gave me a rough draft I could build upon.
I haven’t used it since. A friend of mine was talking about a system they have in place and said, “So I can create a post, schedule it to all my pages, and make sure it goes live, and that’s the sort of thing AI is good for.” I tend to agree.
Let’s return to an earlier point. According to that Harvard study, AI’s output was LESS diverse and MORE prone to be wrong the further it went beyond the boundaries of its job.
So let me be blunt here: when it comes to writing a book, AI is going to get it wrong. Unless that book is regurgitating facts, but even then, as I’ve seen in the witchy/pagan space, there are some “authors” using AI to release books at a mind-blowing rate. Most of those books are handwavy fluff at best and downright wrong at worst.
Identifying poisonous mushrooms or telling people not to make ammonia gas in their washing machines? That should be something a computer could handle, but nope, AI fails there too.
So what is AI good for?
If you can outsource the menial, mind-numbing tasks that you don’t like, but which come with being an author, then I think that’s something to consider as AI’s job. But anything creative, or important to life and death? Let’s let a human handle that, okay?
Be a centaur, not a cyborg.