(CNN) – Meta did not remove an explicit AI-generated image of an Indian public figure until it was questioned by its Oversight Board, the Board said Thursday in a report calling on the tech giant to do more to address non-consensual deepfake nudity on its platforms.

The report is the result of an investigation Meta’s Oversight Board announced in April into the company’s handling of deepfake pornography, including two specific cases in which explicit images of an American and an Indian public figure were published.

The threat of AI-generated pornography has garnered attention in recent months, with celebrities such as Taylor Swift, as well as American high school students and other women around the world, falling victim to this form of online abuse. Widely accessible generative AI tools have made the creation of these types of images faster, easier and cheaper. Social media platforms, such as Meta, where these images can spread quickly, are facing increasing pressure to combat the problem.

In the case of the AI-generated image of the American public figure published on Facebook, in which she appeared naked and being groped, the company immediately removed the photo, which had already been added to a match bank that automatically detects images violating its rules. But in the case of the Indian public figure, although the image was reported to Meta twice, the company did not remove it from Instagram until the Oversight Board took up the case.

“Meta determined that its original decision to leave the content on Instagram was erroneous and the company removed the post for violating the Community Standard on Bullying and Harassment,” the Oversight Board said in its report. “Later, after the Board began its deliberations, Meta deactivated the account that posted the content.”

The report suggests that Meta is not systematically enforcing its rules against non-consensual sexual images, even as advances in artificial intelligence have made this form of harassment increasingly common. The report also notes that Meta continues to have problems moderating content in non-Western or non-English speaking countries, something for which the company has previously been criticized.

Meta said in a statement that it welcomed the Board’s decision. It added that while the specific posts identified in the report have already been removed, the company will “take action” on images of the Indian public figure that are “identical and in the same context” as those flagged by the Oversight Board “when it is technically and operationally possible to do so.”

In its report, the Oversight Board, a quasi-independent entity made up of experts in areas such as freedom of expression and human rights, made additional recommendations on how Meta could improve its efforts to combat sexualized deepfakes. It urged the company to make its rules clearer by updating its ban on “derogatory sexualized Photoshop” to specifically include the word “non-consensual” and to clearly cover other photo-manipulation techniques such as AI.

Meta told the Board that it had not added the image of the Indian public figure to its database of rule-violating photos because there had been no media reports about it, whereas the media had covered the images of the American public figure, according to the report. “This is concerning because many victims of fake intimate images are not in the public eye and are forced to either accept the dissemination of their non-consensual depictions or search for and report each instance,” the Board said, adding that Meta could consider other factors, including whether an image was generated by AI, when determining whether to add it to the bank.

The fight against non-consensual deepfakes is part of Meta’s efforts to prevent the sexual exploitation of its users. On Wednesday, the company said it had removed about 63,000 accounts in Nigeria that were engaged in financial sextortion scams, in which people (often teenagers) are tricked into sending nude images and then extorted.
