By GREGORY ZELLER //
Major brands incorporating artificial intelligence in their advertising campaigns are walking a risky tightrope, according to a forthcoming study led by the New York Institute of Technology.
Set to be included in the January 2025 edition of ScienceDirect’s peer-reviewed Journal of Business Research, “The AI-Authorship Effect: Understanding Authenticity, Moral Disgust, and Consumer Responses to AI-generated Marketing Communications” dives deep into consumer reactions to “emotionally charged” marketing content created by generative artificial intelligence – a catch-all for ChatGPT and other algorithms that can create text, images, videos, audio recordings and other “original” content.
That’s different from predictive AI, an analytical process that uses machine learning to identify past patterns and predict future events – a useful tool for marketers, supply-chain managers and others.
The distinction is clear, but the use of generative AI is not always obvious to consumers: many brands have failed to disclose their use of generative AI in marketing campaigns, creating an authenticity backlash replete with ethical concerns, declining public trust and a heated debate about potential government intervention.

Colleen Kirk: Honest work.
Into these troubled waters wades the new study, which according to lead author Colleen Kirk, a New York Tech professor of marketing and management, explores “a new territory for brand marketers.”
“What we do know is that consumers highly value authentic interactions with brands,” noted Kirk, who co-wrote the paper with Julian Givi, an associate professor of marketing at West Virginia University’s John Chambers College of Business and Economics. “Although more companies are now using AI-generated content to strengthen brand engagement and attachment, no study has explored how consumers view the authenticity of textual content that was created by a robot.”
Until now: “The AI-Authorship Effect” does just that, leveraging various analytical experiments to learn how consumers react when emotional messages are written by AI (for example, a now-infamous AI-generated commercial featuring a young version of Toys “R” Us founder Charles Lazarus dreaming up the company).
In one scenario, study participants were asked to imagine a heartfelt message from a fitness salesperson who helped them buy a new set of weights, in which the salesperson claims to be inspired by the consumer’s purchase. Some participants were told the message was AI-generated, while a control group was informed it was drafted by the flesh-and-blood salesperson.
Members of that control group responded favorably – but the other participants said the AI-generated message violated their moral principles. As a result, they reported being less likely to recommend the store to others, gave the store a poor rating on a simulated review site and said they were more likely to switch brands when making future purchases.

Face off: Many consumers were repulsed by the changing facade of the main character in this AI-generated Toys “R” Us advertisement.
Other test scenarios revealed further key findings: less "moral disgust" among consumers presented with AI-generated factual communications (as opposed to emotional ones), greater favorability toward AI-generated messages identified as such (as opposed to AI-generated messages signed by a company representative), and lower backlash against messages merely edited – not written – by a computer.
Ultimately, the study reaffirms that companies must tread carefully when incorporating AI-generated messages – and prioritize authenticity in their consumer interactions, according to Kirk, who noted an “ever more skeptical” audience regarding marketing content.
“Our research provides much-needed insight into how using AI to generate emotional content could negatively impact brands’ perceptions and, in turn, the consumer relationships that support their bottom lines,” the professor said. “While AI tools offer marketers a new frontier, these professionals should bear in mind a time-tested principle: Authenticity is always best.”