Instagram is, on the face of it, a safe photo platform. But the algorithm delivers a toxic mix and apparently serves up sexualized content featuring children, who should not even be on the platform in the first place.
In the spring of 2010, two Stanford graduates, Kevin Systrom and Mike Krieger, were working on a photography service in San Francisco. A few months later, the Instagram app launched. The rush was enormous: within just a few hours, thousands of people downloaded it, and the computer systems kept crashing. In 2012, Facebook bought the company for one billion dollars.
Today, two billion people worldwide are logged in on Instagram. The photo app is the fourth-largest social network after Facebook, YouTube and WhatsApp. It is no longer just a photo community: videos can be uploaded since 2013, and in 2020 Reels were introduced, short clips with effects similar to those on TikTok.
Since 2016, Instagram has no longer shown posts chronologically but has used an algorithm instead. That algorithm is now proving to be the platform's undoing, because it washes sexualized clips of children into users' timelines. Journalists from the Wall Street Journal (WSJ) discovered this by setting up test accounts that followed young athletes, teenagers and influencers, as well as users who follow these young girls. The result was a toxic mix of sexual content featuring minors and adult pornography.
Companies stop advertising on Instagram
For Jürgen Pfeffer, Professor of Child Protection on Social Media Platforms at the Technical University of Munich, this is hardly surprising, because that is how the network works: we see what Instagram thinks we want to see. "As long as you are not interested in these topics, it is quite possible to spend years on social media without ever seeing problematic content," says Pfeffer in the ntv podcast "Learned something again".
Facebook removed 16.9 million pieces of content related to child sexual exploitation between July and September alone, twice as much as in the previous quarter. On X, ten to twenty percent of the content revolves around pornography, according to the expert; the former Twitter is the only major platform where such content is not banned.
According to the WSJ's research, ads from big brands such as Disney, Walmart, Tinder and the Wall Street Journal itself appear between the offensive videos, much to those companies' displeasure. Some, like Tinder's parent company Match Group, have since stopped advertising on Instagram Reels because of this.
“Strict guidelines for nudity”
Parent company Meta appears to have been aware of the problem for a long time. A Meta spokeswoman, however, was only willing to answer ntv.de's interview request in writing: "We do not allow content on Instagram that relates to the abuse or endangerment of children, and we have strict guidelines regarding nude depictions of children," she writes in an email. "We remove content that explicitly sexualizes children. More subtle forms of sexualization, such as sharing images of children in combination with inappropriate comments, are also deleted."
To filter out images and content that endanger children, Instagram says it uses photo matching, for example for photos of naked children. All images are scanned automatically and deleted if necessary. Nevertheless, there is plenty of content featuring children on Instagram and other social platforms that, according to child protection rules or the platforms' own guidelines, should not be there.
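The article does not spell out how this photo matching works technically. A common industry approach is to compare uploads against a database of hashes of already-known abusive images (PhotoDNA is the best-known example). The following is a minimal sketch of that general idea, assuming the open-source Pillow and imagehash packages and hypothetical file names; it illustrates hash-based matching, not Meta's actual system:

```python
# Sketch of hash-based photo matching (illustration only, not Meta's system).
# Assumes the open-source Pillow and imagehash packages; file names are hypothetical.
from PIL import Image
import imagehash

# Perceptual hashes of already-known, banned images (in practice a large database).
known_banned_hashes = [
    imagehash.phash(Image.open("known_banned_1.jpg")),
    imagehash.phash(Image.open("known_banned_2.jpg")),
]

def should_block(upload_path: str, max_distance: int = 5) -> bool:
    """Return True if the upload closely matches a known banned image."""
    upload_hash = imagehash.phash(Image.open(upload_path))
    # Hamming distance between hashes; a small distance means a visually similar image.
    return any(upload_hash - known < max_distance for known in known_banned_hashes)

if should_block("new_upload.jpg"):
    print("Match with known material - remove and report.")
```

Matching of this kind only catches images that are already in the database, which is one reason newly created content can still slip through.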
This is partly because automatic systems find video content harder to analyze than text or still images, the WSJ writes. With Reels it is even more difficult: they show content from outside one's own circle of friends, from sources the user does not follow. Meta employees had already raised safety concerns before Reels was introduced in 2020.
Algorithm needs to be trained
Instagram filters out problematic content using AI and machine learning, explains Angelina Voggenreiter, a research associate at the Technical University of Munich, in the podcast. "For that to work, the algorithm first needs a training set with enough data showing which images should not be there." That is particularly difficult with child pornography, because such data is not necessarily available in sufficient quantity, especially at the beginning.
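Voggenreiter's point is that such a filter is only as good as the labeled examples it is trained on. A minimal sketch of that dependence, assuming scikit-learn and hypothetical, placeholder feature vectors and moderator labels; this is not Instagram's actual pipeline:

```python
# Sketch: a content classifier needs labeled examples of what should be blocked.
# Assumes scikit-learn; the feature vectors and labels below are placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Each row is a feature vector extracted from an image; each label says whether
# moderators marked that image as violating (1) or harmless (0).
X_features = np.random.rand(1000, 128)          # placeholder feature vectors
labels = np.random.randint(0, 2, size=1000)     # placeholder moderator labels

classifier = LogisticRegression(max_iter=1000)
classifier.fit(X_features, labels)

# The model can only flag patterns it has seen labeled examples of: if a category
# is rare or missing in the training data, there is nothing for it to learn from.
new_image_features = np.random.rand(1, 128)
print(classifier.predict_proba(new_image_features))
```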
The systems detect nudity well, "but when it comes to children who are half-dressed, it is very, very difficult to filter them out automatically," explains Voggenreiter. The platforms therefore largely depend on other users reporting such content, but they are overwhelmed by the sheer number of reports and cannot delete the content immediately.
On top of the problem that automatic systems find video harder to analyze than text or still images, videos are also what social media platforms can monetize best: the goal is for users to spend as much time as possible on the platform. Experience shows that people stay on the screen longer with moving images than with text or photos, and during that time more advertising can be shown, which brings in more money.
Employees are not allowed to change the algorithm
Instagram has also optimized its algorithm to suggest videos that match a user's interests or previously clicked content. So anyone who often watches young girls or women in bikinis will be suggested similar videos: an endless stream that only stops if the user actively searches for other content or the recommendation algorithms are "significantly" changed. But Meta does not want to do that, current and former Instagram employees told the WSJ, in order not to lose users, as the report explains.
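At its core, the recommendation logic the WSJ describes is similarity-based: suggest items that resemble what the user has already engaged with. A minimal sketch of that feedback loop with NumPy and made-up video "embeddings"; the real Reels ranking system is of course far more complex:

```python
# Sketch of similarity-based recommendation (illustration, not Meta's ranking system).
# Assumes NumPy; the video embeddings and watch history are made up.
import numpy as np

rng = np.random.default_rng(0)
video_embeddings = rng.normal(size=(10_000, 64))   # one vector per candidate video
watched = video_embeddings[[12, 345, 678]]         # videos the user already watched

def recommend(k: int = 5) -> np.ndarray:
    """Return the indices of the k videos most similar to the user's watch history."""
    profile = watched.mean(axis=0)                  # user profile = average of watched videos
    # Cosine similarity between the profile and every candidate video.
    sims = video_embeddings @ profile / (
        np.linalg.norm(video_embeddings, axis=1) * np.linalg.norm(profile)
    )
    return np.argsort(sims)[::-1][:k]

print(recommend())
# Every recommendation that gets watched feeds back into the profile, which is
# why the stream keeps narrowing around the same kind of content.
```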
Technically, this is not a problem, says social media professor Jürgen Pfeffer. "Have you ever tried to watch the goals from your favorite Bundesliga club's last match on YouTube? You probably won't find them, because media companies hold strong copyrights to them. Hiding content that people are not supposed to see works very well." Better regulation at the European level could be one solution.
The age of the user is not checked
Another weak point of Instagram is users' age. Officially, you are only allowed to register from the age of 13; in fact, there are likely millions of younger children on the platform. Users are asked their age when registering, but whether children state it correctly is not verified. Once registered, they can move freely around the platform. Only when they upload content themselves do they have to verify their age, says expert Voggenreiter, but that is difficult and rarely enforced.
Meta is said to have known about this problem for years but ignored it. Dozens of US states have therefore sued the company. Instagram deliberately lured children onto the platform, the lawsuit alleges, and collected personal information such as their locations and email addresses without parental consent, which is actually required.
This is no surprise to Pfeffer, because user numbers are the most important "commodity" for companies like Meta. "An important metric for platforms' stock market valuation is how many new users they gain per month," the social media expert says in the podcast. Checking the hundreds of thousands of new Instagram accounts created every day would be expensive and time-consuming, and any delay in the registration process could affect the company's success.
Despite all the justified criticism of Meta, the experts see another, much bigger problem on Instagram: family influencers. These are parents who show off their children on their accounts, sometimes half-naked in diapers or bikinis. They feed the algorithm with these images and videos and provide pedosexual users with new "material" for their fantasies.