Facebook moderators ‘develop PTSD because they are exposed to the worst content on the internet’
Anita Singh, The Telegraph
Facebook moderators are developing post-traumatic stress disorder because the company is using them as “human filters” for the most horrific content on the internet, according to a leading cyber expert.
The social media giant announced this month that it is hiring an extra 3,000 moderators to review terrorist material, child exploitation videos and hate speech.
But Dr Mary Aiken, a forensic cyberpsychologist who studies the impact of technology on human behaviour and is academic advisor to the European Cyber Crime Centre at Europol, said the move was highly irresponsible. Moreover, she said, it is a useless number when set against Facebook’s two billion users.
“When Facebook say that they’re hiring 3,000 moderators, I’d say two things,” she told an audience at the Hay Festival.
“You’ve two billion members, one post a day – that’s two billion pieces of data. No good, having 3,000 moderators.
“And secondly, I have a major ethical problem with 3,000 young people coming out of college and being exposed to that extreme content and being used as human filters by any commercial entity.”
In her work studying the effects of exposure to harmful online content, Dr Aiken said that “we see post-traumatic stress disorder, early signs of it, in the content moderators who are looking at extreme content. It’s only a matter of time before we see post-traumatic stress disorder in children who are looking at extreme content.”