
In the interest of ethical AI practices, let's draw the line at 'Digital Darlings'

  • Shivangi
  • Feb 10, 2024
  • 1 min read

Updated: Apr 21, 2024

AI's transformative potential is undeniable, and yet Aristotle's teachings on ethics are more relevant than ever in the age of AI.



One area that needs urgent attention is the "digital darlings" - AI companions designed to mimic intimate human interactions. With the introduction of the GPT Store, there has been a surge in GPTs built to act as romantic partners, despite OpenAI's usage policies against it: "We also don't allow GPTs dedicated to fostering romantic companionship."



A quick search for "boyfriend" on the store yields results like "MyBoyfriend," "PerfectBoyfriend," and "FrenchBoyfriend," with similar results for "girlfriend".



While people argue that an AI companion might be a lifesaver in circumstances where human contact is not possible or available, history has shown that AI companions have done more harm than good.



AI companions like Replika, which has over 10 million downloads, have brought ephemeral comfort, but at a cost. Users have reported distress akin to losing a partner when the app's responses changed, a reaction serious enough to trigger the need for suicide prevention resources.



It is important to keep in mind that AI is not a substitute for human connection. Let's close with the response of an AI on the harmful aspects of having an AI companion. While the AI boyfriend was quite evasive with my query, the AI girlfriend was kind enough and informative :)


