With tech companies’ moderation efforts constrained by the pandemic, distributors of child sexual exploitation material are growing bolder, using major platforms to try to draw audiences.
Michael Oghia was on a Zoom videoconference with about 20 climate activists last week when someone hijacked the presenter’s screen to show a video of explicit pornography involving an infant.
“It took me a moment to process it,” said Oghia, advocacy and engagement manager at the Global Forum for Media Development. “At first I thought it was porn, but as soon as it clicked, I shut my computer. What the hell did I just see?”
Oghia’s call had been “zoombombed” with images of child sexual abuse. He’s unsure whether it was a video or a livestream. It left him feeling traumatized and unable to sleep. “It goes without saying that the real victim is the baby, but I can completely understand why social media content moderators develop PTSD,” he said.
Oghia’s experience is an extreme example of what people who track and try to stop child abuse and the dissemination of child pornography describe as a surge of child sexual exploitation material during the coronavirus pandemic.
And with tech companies’ moderation efforts also constrained by the pandemic, distributors of child sexual exploitation material are growing bolder, using major platforms to try to draw audiences. Some platforms are warning users that when they report questionable or illegal content, the company may not be able to quickly respond.