ChatGPT: the route to loneliness?

No comfort from AI

From ChatGPT to Replika, AI is being touted as a tool to solve the loneliness epidemic. But is this just a false marketing promise, or perhaps something more sinister? In these habitually lonely times, wanting someone to talk to, even if they're just an AI chatbot, is understandable. But beyond the emptiness of a conversation with an AI, Nolen Gertz, author of Nihilism and Technology, argues that even the most optimistic story of connection could lead to danger.

There is something disturbing and yet understandable about wanting to talk to a machine.

On the one hand, wanting to talk to a chatbot could seem to demonstrate a preference for programs over people. People can talk back, turning a monologue into a dialogue; a program cannot, not really. Presumably, then, the desire to talk to a program rather than a person stems precisely from the desire to have a monologue rather than a dialogue, to have someone to talk at rather than talk with. A conversation with a chatbot is thus not a conversation but a performance of something that looks like one. Two players are seemingly involved in the conversation game, but really there is only one, like playing catch with a brick wall. The ball bounces back to you, but the wall did not throw it; it merely reflects back what is put into it, a reflection that can give the appearance of a game of catch, especially if you do not understand geometry and so think the ball comes back to you more unpredictably than it really does.

Join the conversation

Markus Padilla 14 February 2024

Thank you for your insightful perspective on the role of AI in addressing loneliness. It's a complex issue that warrants careful consideration. I'm looking forward to exploring this topic further.