Associations demand that the government put "an end to scoring algorithms"

"Putting the human back" in the face of the excesses of dematerialization. In an open letter dated Monday, February 5, around thirty associations defending civil liberties and supporting the most disadvantaged call on the prime minister, Gabriel Attal, over the algorithm used by the family allowance funds (CAF) to target their checks.

This letter, signed notably by the Change course collective and La Quadrature du Net, an association defending digital freedoms, revisits the "discriminatory" practices generated by the algorithm used to target CAF beneficiaries. This system, whose workings Le Monde revealed in December, has the effect of concentrating checks on certain vulnerable groups, such as single parents, some recipients of the disabled adult allowance, or low-income households.


The associations accuse this system of contributing to "institutional mistreatment" by the CAFs, with "multiple consequences on the material and psychological level".

They therefore call on the government to abandon the use of such scoring algorithms, not only in the CAFs but also in other public bodies such as France Travail (which replaced Pôle emploi). They also demand stricter oversight of administrations' IT tools, as well as greater transparency about them.


Persistent opacity of the CAFs

The various journalistic investigations and challenges from associations have not shaken the confidence of the National Family Allowance Fund (CNAF) in its system. "We have nothing to be ashamed of in our action," said the director of the organization, Nicolas Grivel, in an internal message sent in December, assuring that he had "neither fear nor shame about the debate".

Heard by the Senate on January 25, the director nevertheless refused to go into the details of how the algorithm works. "We are not targeting single-parent families, not at all," he insisted, explaining that the more vulnerable groups are checked more often simply because they are more numerous among beneficiaries. A line of defense that obscures the fact that the algorithm directly targets these groups and triggers checks on them in greater proportions than on others.

The director of the CNAF also did not answer the question from Adel Ziane, socialist senator for Seine-Saint-Denis, who wanted to know whether geographical criteria were used to target the checks. The department of Seine-Saint-Denis referred the matter to the Defender of Rights in December, concerned about a possible "breach of territorial equality".

The CNAF has long been reluctant to shed light on the design, content and effects of its system for targeting checks. Le Monde took the matter to the administrative court in December to obtain the disclosure of documents relating to this "risk score". But the public body maintained its refusal in a defense brief, signed by the law firm Veil Jourde, on January 17.

The CNAF's counsel argue that "the CNAF cannot disclose documents it does not have". The organization reportedly never drew up specifications before designing this algorithm, nor evaluated its effects or sought to test for possible biases, after more than ten years of use and hundreds of thousands of checks triggered.
