AI plays a growing role in sentimental writing
February 5, 2025
Lifestyle

A new era of communication

Two years ago, Jebar King, a writer by profession, was faced with the daunting task of drafting an obituary for his late grandmother. Grieving and unsure of where to begin, King, 31, was overwhelmed. “I was just like, there’s no way I can do this,” he recalls. Yet, amid his grief, he turned to an unlikely source of help: OpenAI’s ChatGPT.

At the time, King had already been experimenting with the AI chatbot for tasks like creating grocery lists and organising finances. Wondering if it could assist with the obituary, King entered key details about his grandmother—a retired nurse, an avid bowler, and a proud grandmother—and asked ChatGPT to write it.

The result was surprisingly fitting. “I knew it was a beautiful obituary, and it described her life,” King says. “It didn’t matter that it was from ChatGPT.” He worked with his mother to fine-tune the language, making sure it reflected the emotions he wanted to convey. Ultimately, King felt that the AI had helped him genuinely honour his grandmother despite its robotic origin.

King’s experience reflects a growing trend of people using AI for deeply personal forms of communication. From emails and wedding vows to breakup texts and thank-you notes, generative AI tools like ChatGPT are being used to navigate the complexities of sentiment. While some embrace AI’s assistance in crafting meaningful messages, others worry that its use may compromise sincerity, leading to a backlash against those who rely on it.

Since the release of ChatGPT in late 2022, the use of generative AI has exploded. Early applications involved simple tasks like predictive text in messaging apps, but the technology’s rapid evolution has allowed it to tackle more intricate projects. Users began experimenting with AI to write recommendation letters, emails, and dating profiles. Yet, there’s been pushback, as some feel AI-generated messages lack the warmth and authenticity of those written by humans.

AI’s role in sentimental communication—whether for weddings, condolences, or obituaries—has sparked debate. For some, outsourcing heartfelt messages to a machine seems off-putting and inauthentic. However, those who use AI in such personal contexts often see it as a tool to help articulate feelings that might otherwise be difficult to express. “It’s not about manufacturing sentimentality,” says one user. “It’s about using AI as a template onto which I can map my emotions.”

Many can relate to the difficulty of conveying complex emotions. Writing a speech, crafting an apology, or comforting a friend can be nerve-wracking, especially when the stakes are high. Generative AI can be a valuable resource for these moments, helping people refine their words or avoid awkward phrasing. “It’s a great way to sanity-check yourself,” says David Markowitz, an associate professor of communication at Michigan State University. “It can offer suggestions to make your message warmer or more compassionate.”

While AI doesn’t experience emotions, it has been trained on vast amounts of literature and psychological research, enabling it to recognise patterns in how people typically express feelings. AI reflects human sentiment and offers guidance on effectively conveying it.

Katie Hoffman, a 34-year-old marketer in Philadelphia, has turned to AI to help draft sensitive messages. She used ChatGPT to write a tactful message to a friend about missing her wedding and to address a delicate situation involving a friend asking for money back after pulling out of her bachelorette party. “I didn’t want to sound like a jerk, but I also didn’t want to over-explain,” Hoffman says. “ChatGPT helped me strike the right balance.” After some tweaks, her friends were none the wiser about the AI’s involvement.

Despite these benefits, there’s a downside to using AI for sentimental communication. Some worry that relying too heavily on AI can make a message feel inauthentic or contrived. Research by Mor Naaman, an information science professor at Cornell University, suggests that the more an author edits an AI-generated message, the greater their sense of ownership and authenticity. Conversely, when the output requires only minor modification, the author may feel that they haven’t really written the message themselves.

This tension was evident when Jebar King shared his experience using AI for his grandmother’s online obituary. The backlash was swift, with some critics accusing him of being insincere. King says the adverse reactions made him second-guess his decision, but ultimately, his mother reassured him that the obituary was heartfelt and well-crafted.

The reaction to AI-generated content varies depending on the audience’s perception of the technology. Some believe AI is inherently inauthentic, while others view it as a helpful tool for communication. Malte Jung, an associate professor at Cornell University, points out that people’s attitudes towards AI are influenced by broader cultural scepticism. “People still see AI as a threat to authenticity and sincerity,” he says. “The use of AI might be seen as undermining the emotional depth of a message.”

Still, some users argue that the intentions behind using AI matter more than the tool itself. Chris Harihar, a 39-year-old public relations professional from New York City, frequently uses AI to help with speeches, such as the one he delivered at his sister’s wedding. “I use AI to fine-tune my messages, not to replace my feelings,” he says. For him, AI is a resource that enhances communication, not something that diminishes it.

Image: The reaction to AI-generated content varies depending on the audience’s perception of the technology. Credit: Lil Artsy

News Desk 2

News Desk 2 produces the latest news for the Middle East region, with a key focus on the six GCC nations: UAE, Saudi Arabia, Qatar, Bahrain, Kuwait, and Oman. News Desk 2: press@menews247.com