If you work in marketing, communications, or writing, no doubt you’ve heard all about artificial intelligence (AI) systems that create content and, in theory, might eliminate professional writers. Examples of copy created by AI flood the internet. This is either compelling and exciting or terrifying, depending on your perspective.
Some AI Examples
Many organizations are trying it out. Here are a couple of examples you can look up:
In-Plant Impressions editor Bob Neubauer used the chatbot ChatGPT by OpenAI to write an article about the value of in-plant printing. You can read it here.
Though the AI-written article referenced above is coherent and adequately describes a topic in general terms, notice that details are scarce. Would this piece satisfy the need to publish something (anything) by the deadline? Yes. Might it aid in search engine optimization if it appeared on a relevant website? Maybe. Is there value for anyone but a novice human reader on the topic? Probably not.
The American Marketing Association is using technology from rasa.io to create daily marketing emails.
The AI in the above example isn’t writing articles. It’s curating articles that match the interests of each newsletter subscriber and writing the subject lines for the newsletter emails. Over time, the AI learns the interests of each subscriber based on the articles they read in the newsletter. Future newsletters include more finely targeted articles for individual readers. Would the AI pick articles written by AI? Could it tell the difference?
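To make that curation idea concrete, here is a minimal sketch of how a newsletter system might rank candidate articles for one subscriber based on the topics of articles they have opened before. This is not rasa.io's actual code; the data structures, topic tags, and scoring rule are assumptions made purely for illustration.

```python
from collections import Counter

def rank_articles(click_history, candidates, top_n=5):
    """Rank candidate articles for one subscriber.

    click_history: topic tags from articles the subscriber has opened (hypothetical).
    candidates: list of dicts like {"title": ..., "topics": [...]} (hypothetical).
    """
    # Count how often each topic appears in the subscriber's reading history.
    interest = Counter(click_history)

    def score(article):
        # An article scores higher when its topics overlap the subscriber's history.
        return sum(interest[topic] for topic in article["topics"])

    return sorted(candidates, key=score, reverse=True)[:top_n]


subscriber_clicks = ["email marketing", "seo", "email marketing", "analytics"]
todays_articles = [
    {"title": "5 Subject Line Tests", "topics": ["email marketing"]},
    {"title": "A Beginner's Guide to CRM", "topics": ["crm"]},
    {"title": "SEO Trends This Year", "topics": ["seo", "analytics"]},
]

for article in rank_articles(subscriber_clicks, todays_articles, top_n=2):
    print(article["title"])
```

In a real system the "interest" signal would be richer than raw topic counts, but the principle is the same: each click nudges future selections toward that reader's demonstrated interests.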
Yes or No: Should AI Do Your Writing?
There’s no doubt that chatbots can string together grammatically correct sentences into a coherent, explanatory narrative that eerily resembles human writing. These systems devour information from millions of web pages and, when prompted by human operators, pull together relevant information and compose it into logical paragraphs. In some cases, AI can mimic style, structure, context, and tone.
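If you want to see what "prompted by human operators" looks like in practice, here is a minimal sketch using the OpenAI Python SDK. The model name, system prompt, and writing request are illustrative assumptions, not a recommendation, and you would need your own API key.

```python
# A minimal sketch of prompting a chatbot for draft marketing copy.
# Assumes the OpenAI Python SDK (v1+) is installed and an API key is
# available in the OPENAI_API_KEY environment variable.
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model choice; swap in whatever you use
    messages=[
        {"role": "system", "content": "You are a marketing copywriter."},
        {"role": "user", "content": "Write a 100-word blurb on the value of in-plant printing."},
    ],
)

# The model returns draft text that still needs a human editor's review.
print(response.choices[0].message.content)
```

The output you get back is exactly the kind of coherent-but-generic draft described above: useful as a starting point, not a finished piece.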
On the pro side, AI saves time and money, never suffers from writer’s block, can write SEO-targeted copy, can adapt its language to location, and learns as it goes. It can churn out more content, more quickly, than any human, and the monthly subscription fees are a fraction of what human writers are paid.
However, there are drawbacks:
Chatbots still require human assistance. The prompts must be right, and the systems still make grammatical errors. They can also fall short on standards for composition, structure, and punctuation. AI content generators have been known to misuse statistics.
AI systems gather facts, but they can’t always differentiate between correct facts and erroneous ones. Bias can also creep into the writing because the systems pick up content that contains it. Do they treat a sarcastic article on a topic the same way they treat a straight news story?
AI writing generally lacks personality, nuance, and emotion. Much of it reads like a grade 7 essay, and it’s formulaic. Many of the articles come out looking very similar. AI does not yet have emotional intelligence, and emotion is essential in good, standout marketing or posts that need a personal touch to make them resonate.
The copy still needs a human to oversee it and make judgment calls. AI still lacks the authority, expertise, and credibility that actual human experts bring to the equation.
Of great concern is the potential for plagiarism. Google has guidelines about stitching or combining content from different web pages without adding value; in other words, Google can treat that kind of AI-generated content as spam. If you’re considering an AI writing system to boost SEO results, make sure you check Google’s guidelines first.
Here is a good encapsulation of ChatGPT:
“Additionally, ChatGPT’s understanding and response are based on patterns it has learned from the training data, which may lead to the model providing nonsensical or irrelevant responses when confronted with novel or unexpected inputs.
Another weakness is that the model is not able to perform reasoning or make logical connections. It is based on patterns it has learned from the data, it cannot infer new knowledge or make causal connections.
In conclusion, ChatGPT is a powerful language model with a wide range of potential applications. However, it is important to be aware of its limitations and potential biases in order to ensure accurate and appropriate use.”
The paragraphs above were written by ChatGPT. Malcolm Auld, a marketing consultant based in Australia, asked the system to evaluate itself and posted the response on LinkedIn.
So, there you go. Even ChatGPT knows its limitations. Consider the technology a support system but don’t fire those writers just yet.
About the Author
Dennis Kelly
Dennis Kelly is CEO and co-founder of Postalytics. Dennis joined Boingnet, the predecessor to Postalytics, in 2013. Boingnet focused on giving print and direct mail marketing service providers the ability to add digital marketing channels to their direct mail campaigns. Postalytics is Dennis’s sixth startup. He has been involved in starting and growing early-stage technology ventures for more than 30 years and has held senior management roles at a diverse set of large technology firms, including Computer Associates, Palm Inc., and Achieve Healthcare Information Systems.