Slack also uses your data to train its AI, but not the one you think


More and more Slack users are expressing concern about the chat app: their data is used by Slack to train its artificial intelligence, which raises a privacy problem.

The Slack logo // Source: Montage Frandroid

Many very small businesses and SMEs use Slack, an application specialized in workplace discussions. Data confidentiality is therefore an important requirement it must meet, a responsibility Slack may be failing by using user data to train its artificial intelligence tools.

All your Slack chats, even your private messages, are used to train an AI

The concerns all stem from a privacy page on the Slack site. The company states: “To develop non-generative AI/ML models for features like emoji and channel recommendations, our systems analyze customer data (e.g. messages, content, and files) submitted to Slack as well as other information (including usage information).” This affects all Slack workspaces, including those that do not have the Slack AI add-on installed.

Search results in Slack // Source: Slack

To reassure its users, the service states that it carries out checks and that “data will not be able to move from one workspace to another.” Furthermore, certain access can be blocked. The data is used for channel recommendations, search results, autocomplete suggestions and even emoji suggestions. What may cause concern is the use of data from private messages. Columnist Justin Pot wrote at Lifehacker: “In my previous jobs, I frequently sent private messages full of negativity about my manager and leadership to friends at work. I can imagine Slack recommending certain emojis every time a particular CEO is mentioned.”

How to stop Slack from analyzing your data

There is a way to opt out of having your data used for AI training: the administrator of the Slack workspace in question must send an email to Slack to request that the use of data be disabled. You must contact “the Customer Experience team at [email protected], with the URL of your workspace or organization, and ‘Slack Global model opt-out request’ in the subject line of the email.”

Source: Slack
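
For administrators who prefer to automate the request, here is a minimal sketch in Python of such an opt-out email. Only the recipient address and the subject line come from Slack’s instructions quoted above; the sender address, workspace URL, SMTP server and credentials are placeholders to replace with your own.

```python
# Minimal sketch of the opt-out email described above.
# Only the recipient address and the subject line come from Slack's
# instructions; everything else is a placeholder to adapt.
import smtplib
from email.message import EmailMessage

workspace_url = "https://your-workspace.slack.com"  # placeholder: your workspace or org URL

msg = EmailMessage()
msg["From"] = "[email protected]"                # placeholder: the workspace owner's address
msg["To"] = "[email protected]"                        # Slack's Customer Experience team
msg["Subject"] = "Slack Global model opt-out request"  # exact subject line requested by Slack
msg.set_content(
    "Hello,\n\n"
    "Please opt our workspace out of global model training.\n"
    f"Workspace URL: {workspace_url}\n\n"
    "Thank you."
)

# Send through your own outgoing mail server (placeholder host and credentials).
with smtplib.SMTP("smtp.example.com", 587) as server:
    server.starttls()
    server.login("[email protected]", "app-password")  # placeholder credentials
    server.send_message(msg)
```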

The problem is that the administrator of a company’s Slack workspace may not be aware of data protection issues, which adds an obstacle to refusing the use of said data. However, we are not talking about generative artificial intelligence here: Slack does not, a priori, use all this data to train its own chatbot. The company has also defended itself against the numerous accusations in a blog post. Above all, these questions from users serve as a reminder and raise workers’ awareness about their personal data. Even your “private” conversations on Slack are not really private, since they can be used in the application’s recommendations.




