The CAF algorithm, “a particularly pernicious mass surveillance system” according to La Quadrature du Net


Corentin Béchade

November 28, 2023 at 9:29 a.m.



The CAF algorithm in the sights of Quadrature du Net © sylv1rob1 / Shutterstock

Is the family allowance fund (CAF) making life harder for its most precarious beneficiaries? That, at least, is what a recent post from La Quadrature du Net, an association defending digital freedoms, argues.

After long months of mobilization, La Quadrature du Net obtained the source code of one of the algorithms meant to help the CAF fight social fraud. One small problem: according to the association, this formula “provides definitive proof of the discriminatory nature of the criteria used” and particularly targets the most precarious beneficiaries, amounting to a “double punishment” for “those who […] are going through a particularly complicated period”.

The hunt for overpayments

As part of its campaign against surveillance algorithms, the association has made two of the models used by the CAF to detect social fraud publicly available. The two models (used from 2010 to 2014 and from 2014 to 2018, respectively) assigned a “suspicion score” to each beneficiary in order to decide whether a check was warranted to recover possible overpayments. The current version of the algorithm, however, has not been shared by the CAF.

According to La Quadrature’s analysis, “socio-economic variables have a preponderant weight in calculating the score”. Among the criteria that increase the score (and therefore lead to more frequent checks) are, among others: having a low income, being an RSA recipient, living in a “disadvantaged” area, or “not having a job or stable income”. People who receive the Disabled Adult Allowance (AAH) and who are employed would also be particularly targeted by the algorithm, as the sketch below illustrates.
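To make the mechanism concrete, here is a minimal sketch of how such a weighted “suspicion score” could work in principle. The variable names, weights and threshold below are illustrative assumptions, not values taken from the CAF models published by La Quadrature du Net; the sketch only shows how socio-economic criteria can each add to a score that, once it crosses a threshold, flags a file for a check.

```python
# Minimal sketch of a weighted risk-scoring model of the kind described above.
# All variable names, weights and the threshold are illustrative assumptions,
# NOT values from the CAF models published by La Quadrature du Net.

from dataclasses import dataclass

@dataclass
class Beneficiary:
    low_income: bool               # income below some threshold
    receives_rsa: bool             # receives the RSA minimum income
    unstable_employment: bool      # no job or stable income
    receives_aah_and_works: bool   # AAH recipient who is employed

# Hypothetical weights: each matching criterion adds to the "suspicion score".
WEIGHTS = {
    "low_income": 0.30,
    "receives_rsa": 0.25,
    "unstable_employment": 0.20,
    "receives_aah_and_works": 0.25,
}

CHECK_THRESHOLD = 0.5  # illustrative cut-off above which a file is flagged

def suspicion_score(b: Beneficiary) -> float:
    """Sum the weights of the criteria that apply to a beneficiary."""
    return sum(weight for name, weight in WEIGHTS.items() if getattr(b, name))

def should_check(b: Beneficiary) -> bool:
    """Flag the file for a manual check when the score crosses the threshold."""
    return suspicion_score(b) >= CHECK_THRESHOLD

# Example: a precarious beneficiary accumulates several criteria and is flagged,
# while one matching none of the criteria is not.
precarious = Beneficiary(low_income=True, receives_rsa=True,
                         unstable_employment=True, receives_aah_and_works=False)
stable = Beneficiary(low_income=False, receives_rsa=False,
                     unstable_employment=False, receives_aah_and_works=False)
print(should_check(precarious))  # True
print(should_check(stable))      # False
```

The point of the sketch is structural: when the heavily weighted criteria are themselves markers of precariousness, the beneficiaries who tick several of them mechanically end up above the threshold more often, which is exactly the discriminatory effect the association describes.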

A “neutral” algorithm?

“Structurally disadvantaging people in precarious situations”, the algorithm would thus be part of a “particularly pernicious mass surveillance system” widespread across many administrations, according to La Quadrature. Faced with criticism from the Defender of Rights, a CAF director defended the tool by explaining that “the algorithm is neutral” and would even be “the opposite of discrimination” since “no one can explain why a file is targeted”.

In total, nearly 32 million people, including 13 million children living in a household receiving CAF aid, would be covered by the algorithm put in place by the administration. Checks would be triggered from 600 euros of overpayments over two years, i.e. 25 euros per month.

Source: La Quadrature du Net, GitLab
