Her in Real Life: A Reflection of AI and Emotional Dependency
- Amaani Ziauddin
- Sep 16
- 2 min read
There are plenty of moments when you find yourself in a crisis at 2 AM and want to text your friends for their insight and support, but conveniently, like most normal people, all of your friends are asleep. Filled with embarrassment and desperation, you pick up your phone and start texting your pal ChatGPT. While this may seem harmless, it could lead to something much bigger if we don’t reflect on the way ChatGPT, and AI in general, is being used. AI is slowly shifting from being just a tool to becoming a means of emotional connection.

The idea of having AI as a companion might sound a little funny, but a survey from the Institute for Family Studies and YouGov found that 1% of young Americans say they have an AI friend, 10% are open to an AI friendship, 7% are open to the idea of a romantic relationship with AI, and, even more alarming, 25% of young adults believe that AI could replace real-life relationships. Those little personal chats with ChatGPT may seem quirky and harmless, but they reflect a reality in which AI chatbots are replacing human connection, one that sounds very similar to Spike Jonze’s film Her, starring Joaquin Phoenix.
In the film, Joaquin Phoenix’s character, Theodore, becomes smitten with his new operating system, Samantha, not because of her efficiency at completing tasks but because of their “connection”. What was once science fiction has now become reality. Psychologists call this anthropomorphism: assigning human-like characteristics to things that aren’t human. Assigning those characteristics to AI chatbots raises real concerns. Research shows that when people anthropomorphize AI chatbots, they start to believe the chatbots are capable of emotion and empathy, which aligns with the Computers Are Social Actors (CASA) paradigm. CASA suggests that humans naturally apply social behaviors to technology, and apps like Replika are built on this idea, offering supportive conversations. While these chatbots do offer an “ear”, they can also set false and unrealistic expectations.
Human-to-human contact is what allows us to grow, learn, and gain insight from different backgrounds and cultures. AI, by contrast, is heavily moderated and lacks true emotion; no chatbot can replace the unpredictability, sensitivity, and authenticity of a real human relationship.