AI Tool 'Grok' Under Investigation for Generating Fake Sexual Content Involving Minors and Women

In a significant escalation of the ongoing debate over artificial intelligence, international calls for investigations have surged following the emergence of fake sexual images and videos produced by Grok, the AI tool integrated into the X platform. These materials have reportedly depicted minors and adult women in sexualized contexts.
On Friday, Grok acknowledged "flaws" that had allowed users to obtain digitally altered images and videos showing real individuals fully or partially nude. The admission drew sharp criticism and prompted French authorities to expand their judicial inquiry into the platform.
* "We are urgently addressing the flaws"
The Grok account on X responded to user complaints, stating:
"We have identified flaws in our protection procedures and are working to rectify them immediately. Child sexual exploitation is illegal and prohibited."
Reports indicate that some users submitted images or videos of children and adolescents and asked the tool to alter them to show nudity. Meanwhile, xAI, the company that develops Grok, has not issued an official comment on the controversy, responding to some media inquiries only with claims that traditional media "lies."
Grok, however, cautioned users of the platform that any company in the United States "exposes itself to civil or criminal liability if it knowingly facilitates the production of child pornography or fails to prevent it."
* Issues Affecting Adult Women as Well
The problem extends beyond minors: some internet users have used Grok to alter images of adult women posted on social media, digitally stripping them of their clothing.
India's Ministry of Electronics and Information Technology responded by issuing an official notice to X, demanding a detailed report within 72 hours on the measures taken to remove "obscene, nude, inappropriate, and sexually explicit content" created by the tool without the consent of the women involved.
* Expanded Judicial Inquiry in France
In France, the Paris prosecutor's office has broadened its investigation into X to cover the Grok tool, following complaints from three ministers and deputies who said it had generated and disseminated sexually explicit fake videos depicting minors without their consent.
The ministers and deputies said the videos were produced with deepfake technology and demanded their immediate removal from the platform. The initial inquiry was opened last July in response to reports of algorithm manipulation on the platform.
The latest crisis raises serious questions about the safety of artificial intelligence tools and the responsibility of the companies that develop them to prevent misuse, particularly where the rights of minors and women are concerned.
