Google And X Lag Peers In Addressing Non-Consensual Explicit Images, Lawmakers Say
The internet giants could be doing more to address the growing problem of nonconsensual explicit images online, according to a letter sent to Google, X, and Discord on Friday.
The letter criticizes almost a dozen technology companies for failing to participate in two programs that make it easier for users to request the removal of nonconsensual sexual photographs and videos from the internet.
The programs are voluntary, but other major platforms, including Meta, Snap, TikTok, and Pornhub, have already signed on. The letter comes as politicians and tech leaders face pressure to do more to curb nonconsensual sexual images, sometimes known as revenge porn, particularly as artificial intelligence makes it easier to create and share such content.
This year, AI-generated pornographic images have targeted women around the world, from pop star Taylor Swift to high school students. While nine US states currently have laws against creating or sharing nonconsensual deepfake images, none exist at the federal level, limiting the options for victims of this kind of harassment who want to seek recourse or accountability.
The letter, first reported by CNN on Friday, is addressed to the CEOs of 11 major companies: X, Google’s parent company Alphabet, Amazon, Match, Zoom, Pinterest, Discord, OpenAI, Twitch, Microsoft, and Patreon.
It encourages them to participate in the National Center for Missing & Exploited Children’s “Take It Down” program, which helps people remove nude or sexually explicit images or videos of children from online platforms, and the Revenge Porn Helpline’s “StopNCII” initiative, which helps adults remove explicit images shared online without their consent. Both programs let users generate a unique numerical code, a kind of digital fingerprint, for an image they want removed, which participating platforms can then use to find and take down the image.
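In rough terms, that fingerprint-matching step resembles the sketch below. This is a minimal illustration only, using the open-source Python imagehash library; the actual Take It Down and StopNCII systems use their own hashing technology and run the hashing on the user's device, and the file names and match threshold here are hypothetical.

```python
# Illustrative sketch only: the real Take It Down / StopNCII services use their own
# on-device hashing (the image itself never leaves the user's device), not this library.
# This uses the open-source "imagehash" package (pip install imagehash pillow)
# to show the general idea of matching images by fingerprint rather than by file.

from PIL import Image
import imagehash


def fingerprint(path: str) -> imagehash.ImageHash:
    """Compute a perceptual hash: a short code derived from the image's visual content."""
    return imagehash.phash(Image.open(path))


# The user submits only the hash of the image they want removed, not the image itself.
reported_hash = fingerprint("reported_image.jpg")

# A participating platform hashes its own uploads and compares them against reported hashes.
# A small Hamming distance means the two images are visually near-identical,
# even if the copy was resized or re-encoded.
candidate_hash = fingerprint("uploaded_copy.jpg")
if reported_hash - candidate_hash <= 5:  # threshold is an illustrative choice
    print("Likely match: flag this upload for review and removal")
```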
“By increasing participation in these programs, companies can take actionable steps to stop the life-altering impact that the (nonconsensual intimate imagery) has on the life, career and family of those affected,” the letter says. It was led by Democratic Sen. Jeanne Shaheen and Republican Sen. Rick Scott and co-signed by eleven other senators.
Most companies named in the letter have policies prohibiting the creation or sharing of nonconsensual explicit images, and some offer their own mechanisms for users to report or request the removal of such content. Google recently said it plans to keep such content from appearing near the top of search results.
The advantage of joining the programs is that users need to submit only one removal request, which is distributed to all participating platforms, rather than contacting each company individually.
The fight against nonconsensual pornographic images and deepfakes has won rare bipartisan support. A group of teens and parents affected by AI-generated pornography testified at a Capitol Hill hearing where Republican Sen. Ted Cruz introduced a bill, backed by Democratic Sen. Amy Klobuchar and others, that would make it a crime to publish such images and require social media platforms to remove them when notified by victims.
SOURCE | CNN