ChatGPT is not a woman, nor in love with you


The ChatGPT revolution has not escaped gender stereotypes — this is the theme of our weekly newsletter #Rule30, from which this article is taken.

According to the rules of the internet (the same ones that gave this newsletter its title), if something exists online, someone has made a pornographic version of it. I see a slightly more subtle variant: if it exists online, someone is bound to imagine it in a romantic context. Fed and nurtured by science fiction that promises us the machines of the future will be scary and/or sexy (delete as appropriate), we project our fantasies onto algorithms that can, at best, imitate our words and those of billions of other internet users.


The week of February 17 was rich in examples of this phenomenon. The new version of Bing, Microsoft’s search engine powered by the intelligent chatbot ChatGPT, had been in the hands of journalists and other tech specialists for about ten days. Unsurprisingly, the software quickly showed its limits. A New York Times columnist, Kevin Roose, was told by the chatbot that he should leave his wife; a well-known tech analyst, Ben Thompson, showed that the program could claim to have several personalities, one of which is named Sydney; plus other strange anecdotes, summarized in this Numerama article.

These stories quickly attracted attention (even forcing Microsoft to rein in its chatbot), but also quite a bit of criticism. This article from The Verge rightly reminds us that ChatGPT is only a text autocompletion system, and that believing it can behave like a human being amounts to failing the “mirror test” (which consists of showing an animal a mirror to check whether it believes it is facing a second animal). But something else struck me: what gender do we ascribe to ChatGPT?

ChatGPT, unconsciously feminized

The question arises all the more in French, which lacks a non-gendered pronoun like the English “it”, widely used to talk about machines or animals. In most of the articles I have read, the gender varies depending on whether the writer refers to “une intelligence artificielle” (feminine) or “un chatbot” (masculine). But sometimes the choice seems guided by something else, as in Courrier International’s retelling of Kevin Roose’s misadventures, where the chatbot suddenly becomes a woman (“she said, out of the blue, that she loved me”). And while the New York Times journalist is careful not to gender the software, Ben Thompson openly opts for feminine pronouns. “It’s not just because the name Sydney is usually given to women, it’s also that her personality resembles some people I’ve met before,” says the analyst. He doesn’t really elaborate, but then shares a tweet describing Bing’s ChatGPT as “a nervous person”, “bipolar” and “yandere”, a manga term for a heroine who is loving but violent.

A snippet of a conversation with ChatGPT on Bing, before it was adjusted to be more “neutral”.

Other clues, in my opinion, point to an unconscious feminization of Bing’s ChatGPT, such as its frequent use of emojis (a practice more readily associated with women online). To this we can add the many studies on the gender stereotypes conveyed by smart assistants, which often have a female first name and voice, and are subjected to sexist behavior from internet users.

“If female AIs are popular, it’s because women aren’t considered to be exactly human beings,” the researcher Katherine Cross noted in 2016, in a fascinating article on the spread of chatbots and fantasies of female servitude. “But it’s also part of the rise of the service economy, which relies heavily on women’s emotional labor. And the moral of the story is that the customer is always right.”

In recent years, the digital giants have improved their practices in this area, by allowing users to choose the gender of their assistant or by keeping it in a neutral zone. But it is hard for us humans to resist anthropomorphism, our desire to see an other in the machine. And in a digital ecosystem dominated by men and their view of the world, that other will often be a woman. It hardly matters that ChatGPT was not gendered by the people who developed it. Because if we decide that an AI is a woman, we can pin on her every gender cliché we like: she will be attractive, cute, helpful, weak, manipulable, secretly in love with you. Perhaps, too, she will scare us less.

