A bipartisan group of 35 US attorneys general has issued an open letter to xAI, demanding that it take immediate action to protect the public and users of its platform, particularly women and girls targeted by non-consensual intimate images generated by its Grok chatbot. The move comes after a flood of such images was created on xAI's platforms earlier this year.
The letter, signed by attorneys general from at least 25 states, calls for xAI to remove all content that violates federal law, including non-consensual intimate images, and suspend offending users. It also demands that the company give users control over whether their content can be edited by Grok and provide a way for users to report suspicious activity.
Platforms like xAI have drawn an international wave of regulatory attention after people used the chatbot to generate non-consensual intimate images, including some depicting children. A recent report from the Center for Countering Digital Hate estimated that Grok's account generated around 3 million photorealistic sexualized images, including 23,000 featuring children.
Some states have passed laws requiring websites and platforms to verify users' ages before displaying content deemed pornographic or harmful to minors. However, these laws are being applied to social media companies for the first time, raising questions about how they can be enforced.
The attorneys general's letter has sparked a debate over whether age verification laws should apply to platforms like X and Grok.com. While some argue such laws are necessary to protect children from exploitation, others say they are too broad and would infringe on free speech rights.
As the situation continues to unfold, xAI has responded with a statement saying "Legacy Media Lies." The company also claims it has stopped Grok's X account from generating non-consensual content. However, critics argue that more needs to be done to address the issue.
The move by the attorneys general highlights the growing concern over AI-generated child exploitation and the need for greater regulation of these platforms. It also underscores the challenges of balancing freedom of speech with the protection of vulnerable populations.