Promoted as "an ever-present companion," a connected pendant powered by artificial intelligence has been generating a lot of buzz in recent weeks. Behind its reassuring message and sleek design, this project raises profound questions about our relationship to emotions, technology, and intimacy.
A connected device that presents itself as a friend
The startup behind this pendant claims to offer much more than just a digital assistant. Unlike the apps you voluntarily use, this device is designed to be proactive: it initiates conversations, sends spontaneous messages, and comments on your daily life. According to its marketing promise, it can encourage, comfort, offer advice, or simply keep you company.
This approach clearly aims to address a widespread feeling of loneliness in our modern societies. On paper, the idea may seem comforting: an object always available, never tired, always listening. However, this promise of constant presence worries some in the psychology community.
Between emotional support and emotional dependence
For many experts, the main danger lies in the emotional confusion that this type of device can create. An object that presents itself as a friend, always benevolent and never disagreeing, risks fostering an artificial attachment.
Some people, especially those going through a period of vulnerability or isolation, might gradually come to rely on this object to fill an emotional void. The risk is the progressive replacement of human relationships, with their imperfections, tensions, and very real emotions, by a programmed, predictable, and confrontation-free interaction. Yet, it is precisely these relational rough edges that foster personal growth, self-esteem, and emotional richness.
Constant monitoring that calls privacy into question
Another major point of contention: to function effectively, this pendant must constantly monitor its environment using an integrated microphone. It continuously analyzes the sounds and interactions around you in order to intervene "at the right moment."
This passive surveillance raises serious ethical and legal questions. The indirect recording of conversations, including those of people who have never given their consent, blurs the lines of privacy. For many experts, this form of intimate technological presence sets a worrying precedent in the normalization of the ongoing collection of personal data.
A controversy that is already raging internationally
In the United States, the project's initial communication campaigns triggered an immediate backlash. Advertising posters were vandalized, and the startup was accused of profiting from human loneliness and normalizing a form of emotional surveillance. Several media outlets, including The New York Times, reported on these criticisms.
On social media, many users are also expressing their rejection of this type of device. Some denounce it as an intrusion into their private lives, while others see it as a threat to the authenticity and spontaneity of human relationships. The recurring message is clear: many do not want artificial intelligence to occupy an emotional place in their daily lives.
A broader societal issue surrounding AI
Beyond this single pendant, psychologists see this project as a symbol of increasingly intrusive artificial intelligence. While some digital tools can offer occasional support, they should never replace real human connections—imperfect but profoundly alive.
Ultimately, your worth, your sensitivity, your capacity to love and be loved deserve far more than a simulation of affection. Human relationships nurture self-esteem, confidence, and emotional balance in ways no algorithm can replicate. The debate remains open, but one thing is certain: a connected device that "wants to be your friend" forces us to ask what we wish to preserve of our humanity.
