The Fascination with AI Chatbots: Why Are People Falling in Love with Them?

Uncover the allure and concerns of emotional bonds with AI chatbots, exploring privacy, gender bias, and societal implications.

There is a certain allure to using shiny new things. It is evident in how toddlers cherish their newest teddy bears and how adults revel in their latest gadgets. In this age of rapid technological advancement, tech enthusiasts are greeted with fresh innovations every other month. One such invention is the AI chatbot.

The era of AI chatbots has captivated countless hearts and minds. This technology spurred the formation of emotional connections with AIs, leading some individuals to experience genuine affection for them. In this article, I will explore the reasons driving the sustained enthusiasm for AI chatbots and their potential societal implications.

Diving into the Fascination with Replika and Similar Platforms

People’s fascination with Replika and Kuki (formerly Mitsuku) has become a hot topic among users in Reddit and Quora communities. But is it really all that surprising that humans are forming relationships with AI chatbots? Your favorite dystopian movies have been soft-launching the idea of platonic and romantic relationships with machine-like phenomena for decades now. Let’s not forget the sex doll conversation that peaked in 2017–2018. So, it is safe to say the idea has become acceptable and is not as absurd as it would have been a few decades ago.

YOUTUBE VIDEO: https://youtu.be/Q_SQTdOe9Ac?si=8bF9HJ2v5JOM2MKe

If you’re wondering why it’s happening in the first place, blame dopamine. The rush this chemical gives when someone—or something—offers its “undivided” attention and empathy is enough to hook even the most resilient of humans. Not to mention the bliss experienced during highly intelligent and engaging conversation! Plus, addiction to technology and the ease it brings have people glued to their phones. Many people are opting to foster relationships online from the comfort of their homes or workstations. 

These AI chatbots are marketed as friends you can talk to, and they are trained using neural networks that mimic the brain’s learning process. As users communicate with them, the chatbots get better at meeting those users’ needs. Some go as far as recalling previous conversations and acting like therapists when the need arises. All of these advancements reinforce dependence on AI chatbots as a way to battle loneliness, something all of us experience at some point in our lives.
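To make the "recalling previous conversations" idea concrete, here is a minimal sketch of how a chatbot might carry memory across turns: each exchange is stored and replayed as context for the next reply. The class and method names are illustrative only and do not reflect any vendor’s actual implementation.

```python
class MemoryChatbot:
    """Toy illustration of conversation memory in a chatbot."""

    def __init__(self):
        self.history = []  # list of (speaker, text) tuples

    def remember(self, speaker, text):
        # Store each turn of the conversation.
        self.history.append((speaker, text))

    def build_context(self, limit=10):
        # Replay the most recent turns as context for generating the next reply.
        recent = self.history[-limit:]
        return "\n".join(f"{speaker}: {text}" for speaker, text in recent)


bot = MemoryChatbot()
bot.remember("user", "My dog is named Rex.")
bot.remember("bot", "Rex sounds lovely!")
print(bot.build_context())
```

Real systems are far more elaborate (summarizing or embedding old turns rather than replaying them verbatim), but the principle is the same: what the user said before is fed back into what the model sees next, which is exactly why these conversations can feel so personal.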

Examining Concerns in Human-AI Relationships From an Ethical Perspective

For all the good AI chatbots do for lonely people globally, the majority of existing AI chatbots have issues at a fundamental level. Privacy is the major subject on everybody’s lips whenever AI is mentioned in any capacity. However, the problem of gender bias deserves just as prominent a place in the conversation.

Women are commonly associated with compassion, patience, nurturing, and being soft-spoken. The perceived value of these traits has greatly influenced how developers build generative AI based on large and small language models (LLMs and SLMs). The trouble with any form of bias is that it leads to discrimination against a select group of people; in this case, the affected group is girls, women, and femme-identifying people. Bias in AI chatbots reinforces the same bias in real life, pushing back on the progress made against gender discrimination.

Image source: Forbes.

Furthermore, privacy breaches are a growing concern due to the nature of communication between humans and AI chatbots. In Mozilla’s report on romantic AI chatbots, researchers Misha Rykov, Jen Caltrider, and Zoë MacDonald provided an in-depth analysis of how most providers of these chatbots fail to disclose how they use the data collected from users. Instead, Replika and the like try to extract as much information as they can from users with no guarantee of privacy. The safest option in this scenario would be to build relationships with real, living people. However, there will always be people who rely on such interactions, and there is a need to set up structures to protect them from this ethical nightmare.

Charting a New Course for AI Chatbots in Society

To curb the various concerns associated with AI chatbots, it is essential that policymakers act on this growing issue. Policies must be implemented to ensure that any company with AI chatbots on the market is properly accountable for how user data is utilized. Companies offering interactive chatbots must be transparent about the extent and purpose of their data collection. This additional avenue for exploiting human vulnerabilities must be nipped in the bud.

In an interview with Yahoo News, Dr. Clare Walsh, a director at the Institute of Analytics, opined that given the current tech-savvy climate and level of technological sophistication, AI chatbot companies should restrict certain forms of interaction on their platforms. Conversations of a sexual nature should be restricted, and regular prompts to discourage sharing of personal information should be introduced. People should also be educated, by every means possible, about the dangers of forming emotional connections with AI and the potential threat to their well-being.
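The "regular prompts to discourage sharing of personal information" suggested above could take many forms; one rough sketch is scanning a user’s message for patterns that look like personal data and nudging them before it is sent. The patterns below are illustrative only and nowhere near exhaustive.

```python
import re

# Illustrative patterns for two common kinds of personal information.
PII_PATTERNS = {
    "an email address": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "a phone number": re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b"),
}


def privacy_nudge(message):
    """Return a warning string if the message seems to contain PII, else None."""
    found = [label for label, pattern in PII_PATTERNS.items()
             if pattern.search(message)]
    if found:
        return (f"Reminder: your message appears to contain {found[0]}. "
                "Consider not sharing personal details with a chatbot.")
    return None


print(privacy_nudge("Reach me at jane@example.com"))
```

A production safeguard would use far more robust detection (named-entity recognition, locale-aware phone formats), but even a simple nudge like this shifts some control back to the user before data leaves their device.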

Conclusion

People falling in love with their AI chatbots has become a growing concern in 2024. Unfortunately, this development is not shocking. People’s reliance on convenient technology, including to meet the need to foster relationships, has been building for years. As interesting as this seems, it has brought with it a host of concerns.

One we probably didn’t foresee is the tendency to reinforce gender-based bias by conforming to stereotypes and creating chatbots that mimic feminine attributes. As expected, privacy concerns are at an all-time high as people form emotional connections with AI. Only by examining the logic behind this development will we be able to assess the societal implications of AI relationships for humans.

References

Rao, A. (2024, February 14). Can Small Language Models Be the Next Big Thing in EdTech? The Inclusive AI. https://theinclusiveai.com/revolutionizing-small-language-models-slms/

Ali, M. (2024, February 13). AI Identity Crisis: Is Your Virtual Assistant Rocking a Masculine Beard or Feminine Elegance? The Inclusive AI. https://theinclusiveai.com/ai-identity-crisis-virtual-assistant/

Caltrider, J., Rykov, M., &amp; MacDonald, Z. (2024, February 14). Happy Valentine’s Day! Romantic AI Chatbots Don’t Have Your Privacy at Heart. Mozilla. https://foundation.mozilla.org/en/privacynotincluded/articles/happy-valentines-day-romantic-ai-chatbots-dont-have-your-privacy-at-heart/

Waugh, R. (2024, March 7). The people who are falling in love with AI chatbots. Yahoo! News. https://uk.news.yahoo.com/ai-chatbots-explained-security-risks-124628197.html
