Dating apps: moderators are at their wits’ end when faced with unbearable content


An investigation among former and current moderators of dating applications shows that the content they have to process puts their mental health in danger.


Dating apps are legion. They usually make headlines when they add a fun feature, such as Tinder letting your mother choose your next date, or when they draw on information you would not expect, like The League, which digs into your LinkedIn profile to find your soul mate. Less often, they come under the spotlight because of controversy, as with Crush, an app that worries parents.

Behind the scenes, however, they all have one thing in common: they expose their moderators to traumatic content. Ana (not her real name) joined Grindr’s moderation teams when she was just over 20. A few years later, she left the job with a diagnosis of depression, anxiety, and post-traumatic stress disorder. “I was so anxious that when I went out to go shopping, I passed out twice,” she says. Ana is not an isolated case. Many moderators, current and former, report similar consequences.

Dating app moderators are regularly confronted with unbearable content

Every time you report content on an app like Grindr, Bumble, or Hinge, the message lands with a moderator, who must decide whether the offending user deserves to be banned and whether to escalate the case to a team in charge of complex situations. These are cases of sexual abuse, homophobic violence, pedophilia, or even murder. One day, several Grindr moderators received a report with photos attached showing a child being sexually abused. “Three of us left that day… They couldn’t stand it,” one of those involved remembers.

Still at Grindr, all 14 former moderators interviewed remember cases of this kind. The psychological distress of team members became palpable. “People could feel it… They noticed the tension, the hostility of the environment. It was horrible.” For at least one of them, the situation led to suicide attempts, both during and after his contract with the company. Employees often suffer in silence, for fear of losing their job and struggling to find another. Management, for its part, most often sidesteps these problems in favor of productivity.

Fewer staff, more pressure: moderators are often alone in difficult situations

Over the years, dating apps, a very lucrative market that brought in $2.6 billion in 2022, have sought to maximize their profits. All are shrinking their local (American) moderation teams and outsourcing these tasks to Honduras, Mexico, Brazil, India, the Philippines and beyond, where the average salary is much lower than in the United States. The number of reports, however, is not falling; quite the opposite. Laura (a pseudonym), a former member of the Bumble teams in charge of cases escalated by moderators, remembers.


While the site’s FAQ states that a report is processed within 48 hours, in practice it takes much longer. Reports sometimes go unexamined for weeks, despite their seriousness. “There weren’t enough staff to cover the amount of things that were happening. Rather than hiring […], they put pressure on us to get better numbers.” Reports are handled in a color-coded queue system: they turn red once they have been waiting too long. “Everything was always red, all the time,” says Laura.

This understaffing is not without consequences. While a report waits to be processed, a potentially dangerous user can still act. On Hinge, a woman reported a sexual assault after a date. For lack of sufficient resources, the man was only banned from the app a year later, once she had taken him to court. The management of the various dating sites and apps nevertheless insists that it takes the problem very seriously.

Psychological support teams have been set up, with particular attention paid to moderators. Dedicated training is also provided at hiring and then throughout moderators’ careers. Grindr spokesperson Sarah Bauer says the company will “continue to invest heavily” in automation processes that shield moderators from difficult content.

Source: Wired


