Tech firms say their reporting mechanisms are robust, but new report says child porn not removed fast enough
By Karen Pauls, Cameron MacIntosh
She was only 14 when she was groomed to have virtual sex with an older man she met on social media...
Some of that work is being done by Hany Farid, a professor who specializes in the analysis of digital images at the University of California, Berkeley. He co-developed a Microsoft program called PhotoDNA, which creates a digital signature of an image and compares it to others to find copies.
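The idea behind tools like PhotoDNA can be illustrated with a much simpler technique. PhotoDNA itself is proprietary and far more robust; the sketch below uses a basic "average hash" (an assumption for illustration, not Microsoft's algorithm) to show how an image can be reduced to a compact signature that still matches near-identical copies:

```python
# Illustrative perceptual-hash sketch. PhotoDNA is proprietary; this is a
# simplified average-hash analogue, not Microsoft's actual algorithm.

def average_hash(pixels):
    """Return a bit string: 1 where a pixel is above the mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return ''.join('1' if p > mean else '0' for p in flat)

def hamming_distance(h1, h2):
    """Count differing bits; a small distance flags a near-duplicate image."""
    return sum(a != b for a, b in zip(h1, h2))

# Two tiny 4x4 "grayscale images": the second is the first with slight noise,
# as might result from re-compression or re-uploading.
original = [[ 10,  20, 200, 210],
            [ 15,  25, 205, 215],
            [220, 230,  30,  40],
            [225, 235,  35,  45]]
altered  = [[ 12,  18, 198, 212],
            [ 14,  27, 207, 213],
            [222, 228,  32,  38],
            [227, 233,  33,  47]]

h1 = average_hash(original)
h2 = average_hash(altered)
print(hamming_distance(h1, h2))  # 0 -- signatures match despite pixel noise
```

Because the signature depends on broad brightness patterns rather than exact pixel values, a known abusive image can be recognized even after minor edits, which is what lets platforms compare new uploads against a database of signatures rather than the images themselves.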
But he said buy-in has been slow because the removal of child sexual abuse images has been mostly left to the discretion of the industry.
"Content moderation is never made easy, and there's a really simple reason for it — because they don't want to do content moderation," Farid said.
"All social media sites profit by user-generated content. Having to take down and review material — it's bad for business ... It creates a liability. And so, they don't want to do it."
Hany Farid is a professor at the University of California, Berkeley, with a joint appointment in the School of Information and department of electrical engineering and computer sciences.