Facebook thinks it has come up with a way to stop revenge porn from being posted on its platform. The social network announced Friday that it has developed a new tool that uses artificial intelligence to detect revenge porn before a post has even been reported.

Using machine learning and artificial intelligence, Facebook says it can detect nude or near-nude images and videos shared on Facebook or Instagram and flag them for review by its Community Operations Team, the company explained in a blog post. If a post violates Facebook’s community standards, it will be removed, and in many cases Facebook will also disable the account that posted it (though it offers an appeals process if someone believes Facebook has made a mistake).

The tool is in addition to a new pilot program that Facebook is running with victim advocate organizations, which gives them an emergency option to securely report a photo to Facebook. That program allows Facebook to create a “digital fingerprint” of the image and prevent it from being shared on the platform to begin with. Facebook says that so far the pilot program has been successful, and …read more
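The “digital fingerprint” approach described above can be sketched in a few lines: hash a reported image once, then reject any future upload whose hash matches. This is a minimal illustration only; the function names are made up, and Facebook’s actual system almost certainly uses perceptual hashing (robust to resizing and re-encoding), whereas a plain cryptographic hash like the one below only catches byte-identical copies.

```python
import hashlib

# Fingerprints of reported images that must not be re-shared.
# Assumption: a simple in-memory set stands in for whatever
# database a real system would use.
blocked_fingerprints = set()

def fingerprint(image_bytes: bytes) -> str:
    """Derive a stable fingerprint from raw image bytes.

    A real system would use a perceptual hash so that slightly
    altered copies still match; SHA-256 is used here only to
    keep the sketch self-contained.
    """
    return hashlib.sha256(image_bytes).hexdigest()

def report_image(image_bytes: bytes) -> None:
    """Register a reported image so future uploads are blocked."""
    blocked_fingerprints.add(fingerprint(image_bytes))

def allow_upload(image_bytes: bytes) -> bool:
    """Reject any upload matching a reported fingerprint."""
    return fingerprint(image_bytes) not in blocked_fingerprints
```

Once an image has been reported, every subsequent byte-identical upload is blocked without anyone having to see or re-report it, which is the point of fingerprinting the image up front.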

Source:: Fortune.com – Tech
