In a continuation of my out-of-order impromptu series on creativity, I have a brief rumination that came out of the AI discussion from last week.
Should AI/robots be restricted to doing only things that are dangerous and undesirable for humans to do? I’m thinking repetitive and easily replicable tasks, mining, idk—things that have a high likelihood of killing or maiming a person, or things that would be made much more productive by the use of a robot.
To me, even that looks like lost jobs and livelihoods for the people who do those jobs. The argument, then, is that not having to work those jobs would free people up for more creative endeavors. To which I respond, “But most people don’t get paid a living wage for their creative endeavors.”
In Viola Davis’ book, Finding Me, she said something like 95% of actors don’t work and only 1% make more than $50,000 per year. That’s not a living wage for a family in Los Angeles or New York, where most acting opportunities are. The same goes for writing, art, you name it. And we are living in an age where people want to consume art, but they don’t want to pay for it. Most creators can’t make a living on their creativity alone. They might take on freelance work or commissions; many, even non-writers, turn to writing (like here on Substack) to supplement their income.
Of course, this is where a universal basic income might come into play…if we, as a society, were progressive enough and valued our citizenry enough to guarantee that everyone living under our umbrella wouldn’t have to worry about going hungry or choosing between back-breaking labor and personal and intellectual fulfillment. I don’t see that happening, at least in the U.S., anytime soon.
What do you think? Is there a future for us where creators and our creations are valued?
I’m a little shook about AI at the moment.
I’ve seen different reactions around the internet and IRL about the latest AI writing technology. Some are amazed, others are nonplussed, many are entertained. And it is entertaining, especially when you read my friend Alex Dobrenko’s experiments with ChatGPT in Both Are True.
But it’s also terrifying. A friend of mine tried out a few prompts, and they looked like the kind of vanilla thing I would have turned in for a high school (or college, let’s be real) writing assignment. No, the prose wasn’t beautiful and artful. But it was coherent enough to get the point (Yes, but whose point?) across, and unless I had personal experience with a person’s writing beforehand, I probably wouldn’t question it if they turned it in as an assignment.
My daughter, in fact, wanted to write an essay about humans’ impact on the environment over break (for fun, so if you were wondering if she’s really mine, you can rest easy), and she stumbled across an AI writing website. She didn’t know what it was, but damned if the computing machine didn’t write the whole thing for her.
“That’s not your writing,” I said.
“But I don’t know any of these things,” she said.
“That’s why you research and learn those things!” was my reply. She did end up researching and writing her own five-paragraph essay, which was pretty impressive for a nine-year-old, if you ask for my perfectly unbiased opinion.
But, not so secretly, I’m worried this kind of thing will become commonplace. Don’t forget, technologically speaking, AI is still in its infancy. It’s still learning, and I have no doubt it will learn to be more poetic, more coherent, and more undetectable.
Writers like me sometimes take jobs writing for companies’ websites. A business will post a request for a blog post about, for example, different kinds of snowboarding equipment, and hire someone to write it (usually for not much money, but some writers cobble together a decent income writing for many different websites). If an AI can generate an 85% usable post instantly, for cheap, complete with search engine optimization, why would a business pay an expensive, potentially unreliable human to do it? I see this as a pretty big threat to the writing gig economy, which is one of the few ways beginning writers can make an income.
As for novels, I have to assume people would rather read one written by a human than a robot, but you never know.
“My kids don’t need to learn to write anymore,” said a friend of mine, but I think it’s much worse than that. Artificial intelligence can already replace some forms of writing. But, for kids growing up with AI, it has the potential to replace deep thought.
And that scares me.