Bright splashes of color and upbeat messages cover the walls, while PlayStations litter the “fun floor.” But instead of frolicking cats and vacation snapshots, the computer screens show rants, violent pornography, and torture videos. Welcome to Facebook’s latest German facility for deleting hate speech and offensive content.
Located in the industrial city of Essen, the branch is Facebook's second Competence Call Center in Germany, after the first opened in Berlin in 2015. Commonly known as "deletion centers," both are run by subcontractors working for Facebook, employing 750 staff each.
Facebook’s efforts come in response, at least in part, to a new German law, the Network Enforcement Act (NetzDG), which requires media platforms to remove illegal content or face hefty fines. The social media giant is also in image-repair mode following revelations that it allowed private data to be harvested for the US presidential election campaign. By cracking down on hate speech and illegal content, Facebook hopes to stave off further outside regulation.
By the end of 2018, the two German centers will each have 1,000 employees sifting through suspicious and offensive content. Facebook ultimately wants to hire 20,000 staff worldwide to perform this tricky task, which requires a particular skill set.
Team members must speak English and another language – preferably German, Arabic or Russian – and have the stomach for extremely disturbing material. The job can be trying, even for experienced content moderators. Anyone dealing with harrowing material is encouraged to take regular breaks. Five psychologists are on call at the Essen facility to counsel staff and advise on recruitment.
Management says most people can handle the pressure, pointing to low rates of sickness and staff turnover. Conditions have evidently improved since 2016, when the press highlighted shortcomings in staff support at the Berlin center. Now there is no shortage of job applicants for this sensitive occupation, which also requires a nose for cultural nuance. Behind the centers' darkened windows, employees work anonymously, their names and faces withheld from the public eye.
That policy, however, has caused other problems. Last month, activists from the right-wing “identitarian movement” climbed onto the roof of the building, lighting flares and hanging banners condemning “Facebook censorship.”
The Essen facility is part of Facebook's global network of specialized locations, some of which are dedicated to combating online bullying. In London, Erin Saltman is the head of Facebook’s anti-terror operations in Europe, Africa and the Middle East, a unit of 200 staff including lawyers, forensic scientists and human rights experts.
To win the battle against content abusers, Ms. Saltman says Facebook weds human expertise with sophisticated technology such as machine learning, in which computers learn patterns from data rather than following hand-written rules. For example, Facebook employs photo- and video-matching to parse ISIS and Al Qaeda content; offensive material is “fingerprinted,” and in the event of republication, an algorithm alerts human supervisors.
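The fingerprint-and-match idea can be illustrated with a minimal sketch. Note the assumptions: the class names and the use of an exact SHA-256 digest are hypothetical stand-ins for illustration only; production systems such as Facebook's use perceptual hashes that remain stable under re-encoding, cropping, and compression, and their internals are not public.

```python
import hashlib


def fingerprint(media_bytes: bytes) -> str:
    """Return a fingerprint for a piece of media.

    An exact SHA-256 digest stands in here for a perceptual hash;
    a real matcher would tolerate small edits to the file.
    """
    return hashlib.sha256(media_bytes).hexdigest()


class FingerprintIndex:
    """Holds fingerprints of content already removed by moderators."""

    def __init__(self) -> None:
        self._known: set[str] = set()

    def register(self, media_bytes: bytes) -> None:
        # Called once human reviewers confirm a takedown.
        self._known.add(fingerprint(media_bytes))

    def is_known(self, media_bytes: bytes) -> bool:
        # Called on each new upload; a hit would alert human supervisors.
        return fingerprint(media_bytes) in self._known


index = FingerprintIndex()
index.register(b"banned-video-payload")
print(index.is_known(b"banned-video-payload"))  # True: flag for review
print(index.is_known(b"harmless-cat-video"))    # False: passes through
```

The design point is the division of labor the article describes: the index only automates recognition of material a human has already judged; novel content still goes to the moderators in Essen and Berlin.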
Facebook has rapid-reaction teams in every time zone, as time is of the essence. Disseminators of dubious content like hate propaganda or fake news swiftly learn to work around existing controls, says Dipayan Ghosh, a former policy advisor to Facebook. Much will depend on the development of stronger algorithms, he says, as well as vigilant monitoring at centers like those in Essen and Berlin.
Johannes Steger is an editor with Handelsblatt's companies and markets desk in Düsseldorf. Brían Hanrahan adapted this article into English for Handelsblatt Global. To contact the author: [email protected]