When the Austrian employment center, boosted by AI, refuses to direct women towards IT…


Benoit Bayle

January 23, 2024 at 5:47 p.m.

2

Job center © © sylv1rob1 / Shutterstock.com

A slightly misogynistic bot… © sylv1rob1 / Shutterstock.com

Recently, employment infrastructure in Austria was able to reveal the use of technology ChatGPT in order to direct the unemployed and students towards new jobs. The problem is that this AI seems to have a grudge against women…

At the beginning of January, the Austrian Ministry of Labor was able to announce the creation of Berufsinfomat, the very first AI-based conversational agent whose aim is to enable career guidance for students and the unemployed. Collaboration with OpenAI via ChatGPT, launch with great fanfare… All the ingredients were there to generate success and praise across Europe, for the implementation of this new technology for social purposes. But then, after a few weeks, the honeymoon gave way to disenchantment.

Technical defects and prejudices

According to the Austrian publication DerStandard, the service presents technical defects from its launch which revolve around prejudices that are harmful to the infrastructure. The most striking example of such a practice is the following: many male candidates were redirected towards jobs in the IT field, while female candidates with identical CVs were redirected towards studies. gender, philosophy, business psychology, even catering. In other words, good old ChatGPT would have certain sexist biases acting to the detriment of women looking for a job.

This is an ultimately logical assertion: language models like ChatGPT have been trained and fed billions of information mainly retrieved from the internet, with their share of sometimes negative preconceptions about this or that category of the population. It therefore did not take long before the information was relayed on X.com (formerly Twitter), and the general director of the public employment service in Austria Johannes Kopf was obliged to recognize the imperfection of the tool:
It is not easy to completely eliminate this idea about the IT sector. Especially if we try to deliberately create bias “, he was able to indicate in response to the controversy.

woman AI artificial intelligence © PeopleImages.com - Yuri A / Shutterstock

Women pushed aside by AI © PeopleImages.com – Yuri A / Shutterstock

Questions about data protection

On the other hand, the Austrians’ concerns do not only lie in these harmful technical defects: this type of tool also raises various questions regarding data protection. Indeed, the information of the various users of the platform is in fact recovered by OpenAI directly. The Austrian Ministry of Labor has itself recommended not sharing personal or private information when looking for a job on this medium. Developer Mario Zechner was able to carry out in-depth research into the software, the results of which he shared on X.com: surprise! They are not good. The specialist thus points to major defects which could once again indicate a partial or complete recovery of data by OpenAI.

In France, the question arises despite everything: at the end of 2023, the Ministry of Transformation and Civil Service was also able to discuss ways to save time on administrative tasks, one of them obviously being intelligence generative artificial. As for an implementation in the brand new France Travail? For the moment, the question does not seem to have arisen.

The 5 best artificial intelligence chatbots (2024)

In 2023, the landscape of chatbots in French has expanded considerably, boosted by the rise of artificial intelligence. In the past, these assistants were rationed to predefined questions and answers. But now, thanks to advancements like ChatGPT, it’s possible to ask any question and get relevant answers generated in real time.
Read more

Source : International mail



Source link -99