ConnectSafely CEO Larry Magid speaks with Michelle DeLaune, senior VP and COO of the National Center for Missing & Exploited Children (NCMEC), about the center's work and the new technology Facebook is using to quickly identify and remove child sexual abuse images, commonly referred to as "child pornography."
The technology uses artificial intelligence and machine learning to identify images that likely violate Facebook's child nudity or sexual exploitation of children policies. In a blog post, Facebook Global Head of Safety Antigone Davis wrote that the company is using "artificial intelligence and machine learning to proactively detect child nudity and previously unknown child exploitative content when it’s uploaded." Davis told Reuters that the technology "helps us prioritize" and "more efficiently queue" problematic content for the company’s trained team of reviewers. She said the company removed 8.7 million pieces of content that violated those policies in the previous quarter, 99 percent of it before anyone reported it.
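To make the "prioritize and queue" idea concrete, here is a minimal illustrative sketch in Python; it is not Facebook's actual system, and the names `classifier_score`, `triage`, and `ReviewItem` are assumptions for illustration. A hypothetical classifier assigns each upload a likelihood of violating policy, and a priority queue surfaces the highest-scoring items to human reviewers first.

```python
import heapq
from dataclasses import dataclass, field
from typing import Any, Callable, Dict, List, Tuple


@dataclass(order=True)
class ReviewItem:
    # Store the negated score so the most likely violating items pop first.
    priority: float
    content_id: str = field(compare=False)
    metadata: Any = field(compare=False, default=None)


def triage(uploads: List[Dict], classifier_score: Callable[[Dict], float]) -> List[ReviewItem]:
    """Score each upload with a (hypothetical) classifier and build a review queue.

    classifier_score(upload) is assumed to return a probability in [0, 1]
    that the image violates policy; higher means more likely violating.
    """
    queue: List[ReviewItem] = []
    for upload in uploads:
        score = classifier_score(upload)
        heapq.heappush(queue, ReviewItem(priority=-score,
                                         content_id=upload["id"],
                                         metadata=upload))
    return queue


def next_for_review(queue: List[ReviewItem]) -> Tuple[str, float]:
    """Pop the item human reviewers should look at first."""
    item = heapq.heappop(queue)
    return item.content_id, -item.priority


# Example usage with a stand-in scoring function:
uploads = [{"id": "img_1"}, {"id": "img_2"}]
queue = triage(uploads, classifier_score=lambda u: 0.1 if u["id"] == "img_1" else 0.9)
print(next_for_review(queue))  # ('img_2', 0.9) is queued for human review first
```

The point of the sketch is simply that machine learning does the triage while trained human reviewers still make the final call, which matches the "more efficiently queue" role Davis describes.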