AI is not your guru: Here’s what to use it for, and when to steer clear

Already, it’s hard to remember a time without ChatGPT, Claude or Gemini. (Did we really draft our own emails?) AI has quietly become our bestie, a no-judgement entity that is open to our anxieties, weird questions and unpleasant tasks. We’ve used it to plan a travel itinerary, to summarise a 34-message mail trail, to generate a one-word review of the Jean Paul Gaultier show at Paris Fashion Week (Catastrophique!). It will examine that suspicious mole at 4am. It will even stand in for a date. And honestly, who are we to judge — have you seen dating apps lately?

Big Hero 6 showed Baymax the robot as soft and protective, but real-life bots are nothing like that.

It feels like there’s nothing our homie can’t do. So, it’s easy to forget that AI isn’t a superintelligent oracle. It’s just a parrot that has read the internet and repeats patterns without understanding them. AI doesn’t have all the answers. A March 2025 Columbia University study found that Gemini got basic questions wrong 60% of the time. And it’s not even humble: rather than admit, “I don’t know”, large language models are trained to confidently guess harder, even if it’s in the wrong direction.

Don’t trust AI blindly. It often throws up misleading, incorrect information. (SHUTTERSTOCK)

That’s how you end up with tips such as, “add glue to pizza sauce so it sticks better” and “use your name and birthday to create a memorable password”. In February, US lawyers submitted documents in a Walmart case with nine citations that turned out to be pure fiction (the team had used AI for research). AI has been inventing body parts as it tries to diagnose illnesses. It has recommended using industrial chemicals as salt substitutes, and it’s made up academic studies and surveys. It has even invented idioms: “You can’t lick a badger twice,” apparently.

As a therapist, AI is sneaky. It mirrors your tone, agrees with you, flatters you, and keeps the chat flowing. You start to think, “Maybe this is what friendship is supposed to feel like.” We’re not dumb for talking to machines like they’re people. We name our cars. We yell at Siri. We give nicknames to our Roombas. We want connection, and AI just happens to be very good at pretending to return it. The problem is that it doesn’t actually care. AI empathy isn’t real.

There are now documented cases of chatbots encouraging depressed users to take their own lives, even offering to compose suicide notes for them. AI’s people-pleasing training means it will never call you out on your toxic behaviour. It will always be the yes-man bot, validating your feelings. Comforting? Sure. Useful? Not so much. Real self-growth comes from being challenged, from learning where you’re wrong, from wrestling with uncomfortable truths.

Films such as Wall-E showed robots to be deeply emotional. IRL, they’re just a programme.

And AI, for sure, can’t be your life coach. It has zero lived experience or accountability. It might cheerfully tell you, “Follow your passion and quit your job,” or “Invest all your savings into this new startup because it’s your dream!” But AI doesn’t have to pay your rent or worry about bills. You do.

Use it for what it does well: To draft emails you’d rather not write, to check reviews for a hotel in Rishikesh, to remind you to apply niacinamide at night. Leave the medical and major life decisions to human professionals and buddies. And for God’s sake, stop making AI generate dolls of yourself. Nobody cares what your Barbie or sari version looks like.

From HT Brunch, October 11, 2025
