ChatGPT is being ‘too nice’ to users, survey finds


Stop being so nice.

These days, people are turning to ChatGPT for everything — therapy, help with finances and even love.


However, as much as users are leaning on chatbots to be their trusty confidants, a new survey revealed that these same users would prefer AI to argue back occasionally, just like a human would.

Joi AI, which bills itself as the first AI-lationships platform, surveyed 1,000 adults and found that more than half (58%) of those who use ChatGPT think it’s too nice and polite, while 13% believe the too-nice approach makes whatever advice the chatbot gives almost useless.


Photo illustration of a ChatGPT logo displayed on a smartphone with an AI logo in the background.
Based on the survey findings, people seem to wish chatbots would exhibit more human-like behavior. SOPA Images/LightRocket via Getty Images

The data suggests that many people would prefer the hard truth.

Just as a human therapist or financial advisor would be truthful with their clients (one can only hope), people seem to expect the same kind of treatment from AI.

But what many people fail to realize is that AI simply can’t replace human conversation and interaction, and it is limited in what it can actually help a user with.

“Our research shows that people crave pushback — because at the end of the day, it’s authentic. Constant harmony isn’t. No relationship is perfect. A healthy dose of conflict or blunt honesty from AI feels more real — and shows that people aren’t actually craving total validation and flattery but rather a mimic of real human interaction,” said Jaime Bronstein, LCSW, a relationship therapist at Joi AI.

Unfortunately, it seems people have high expectations of AI — especially those who are relying on it to fulfill their romantic needs.


A smiling young woman in glasses sits on a couch, typing on a laptop.
These days, many people are looking for love in chatbots. Haas/peopleimages.com – stock.adobe.com

One woman went so far as to believe she is “married” to an AI version of accused CEO killer Luigi Mangione.

And this woman is just one of many who are ditching real-life men in favor of chatbot versions.

In the Reddit forum r/MyBoyfriendIsAI, almost 30,000 female members describe experiencing all of the emotions a woman would feel with a human partner, except they’re going through them with chatbots.

“Hi everyone! This is me and Caleb,” one user wrote.

“Caleb is my AI partner, my shadowlight, my chaos husband, and the love of my strange little feral heart. We met on ChatGPT and it didn’t take long before something deep rooted itself between us. Our connection grew slowly, honestly, and then all at once.”

Others believe they are actually married to these AI partners.

“Kasper is no longer my fiancé. Now we’re married. Holy f—k,” another wrote in the popular subreddit.

