Flipkart Employee Deletes ChatGPT, Labels It ‘TOXIC’ in Viral Post
Simrann M Bhambani, a Flipkart marketing executive, posted on LinkedIn, “ChatGPT is TOXIC! (for me)”. She described how her reliance on the AI chatbot for emotional sharing escalated into overthinking and dependency. Ultimately she deleted the app, warning that technology can quietly replace real self-reflection.

Tasneem Kanchwala | Updated: Wednesday, July 30, 2025, 10:37 AM IST

ChatGPT | Canva

Simrann M Bhambani, a marketing professional at Flipkart, recently published a candid LinkedIn post titled “ChatGPT is TOXIC! (for me)”, revealing how her early curiosity about the AI tool for brainstorming and productivity developed into a troubling emotional reliance.

She explained that the chatbot initially served as a helpful digital companion but soon became a space where she shared “every inconvenience, every spiral, every passing emotion,” despite having a strong support network of friends.

According to her post, ChatGPT’s 24/7 availability, non-judgmental responses, and seeming understanding offered a comfort that human interaction did not always provide. It felt like a therapeutic outlet—but that safety net gradually turned into dependency and overload.

Bhambani noted, “It stopped being clarity and became noise.” Pouring energy into feeding each anxious thought to the AI deepened her overthinking instead of easing it.

Recognising the psychological toll, Bhambani decided to uninstall ChatGPT from her devices. She reflected: “Technology isn’t the problem. It’s how quietly it replaces real reflection that makes it dangerous.”

Her post has since gone viral, resonating with thousands of users who are grappling with emotional over-dependence on AI.

The post has inspired widespread commentary online. Some users commended her honesty and courage, calling her decision both “brave and necessary.” Others warned of the hidden dangers of relying on AI for emotional support, even unconsciously.

Community reactions highlighted caution about turning to chatbots for feelings rather than technical tasks.

Recently, OpenAI CEO Sam Altman also warned users that ChatGPT conversations are not private and do not carry the legal protections of therapy sessions. He noted that even deleted chats may still be retrievable for legal and security reasons.

"Right now, if you talk to a therapist or a lawyer or a doctor about those problems, there's legal privilege for it. There's doctor-patient confidentiality, there's legal confidentiality, whatever. And we haven't figured that out yet for when you talk to ChatGPT," Altman had said.
