Elon Musk, the billionaire owner of X (formerly Twitter), and the platform's chief executive Linda Yaccarino failed to appear for scheduled police questioning in Paris on April 20, 2026. Both had been summoned by French authorities for voluntary interviews related to an investigation into AI-generated sexualized images hosted and disseminated on the X platform.
French police had requested the interviews as part of an ongoing inquiry into whether X bears legal responsibility for the proliferation of AI-generated non-consensual intimate imagery (NCII) and sexualized deepfakes on its platform.
## What French Authorities Are Investigating
The French investigation centers on the hosting, amplification, and failure to remove AI-generated sexualized imagery on X — a category of content that has exploded in volume as generative AI image tools have become widely accessible.
The case touches on several intersecting legal frameworks France and the broader EU have deployed to address AI-generated harmful content:
| Legal Framework | Relevance |
|---|---|
| EU Digital Services Act (DSA) | Requires very large online platforms (VLOPs) to proactively assess and mitigate systemic risks, including non-consensual intimate imagery |
| French criminal law | France has specific prohibitions on the non-consensual dissemination of intimate images, increasingly applied to AI-generated content |
| CSAM and minor protection laws | Any AI-generated content depicting minors sexually is treated identically to real CSAM under French and EU law |
| Platform liability | Post-DSA, platforms can no longer claim blanket immunity — active knowledge of illegal content and failure to remove triggers liability |
## Significance of the No-Show
A failure to appear for voluntary police questioning in France, whether an outright refusal or an unexplained absence, carries legal and diplomatic weight. While the summonses were voluntary rather than compulsory at this stage, declining to cooperate with an active investigation in a G7 jurisdiction escalates the legal exposure for X and its executives.
French authorities retain the ability to escalate voluntary questioning to compulsory judicial summons (convocation by an examining magistrate), which would carry far more serious legal consequences for non-compliance.
The development comes against a backdrop of heightened scrutiny of X's content moderation practices across Europe:
- The EU opened formal DSA proceedings against X in 2024 over concerns about disinformation and illegal content
- France's audiovisual regulator (ARCOM) has been among the most active European bodies scrutinizing platform compliance
- Multiple EU member states have pursued independent investigations into harmful AI-generated content on major platforms
## The Broader AI-Generated Imagery Problem
The French investigation reflects a rapidly expanding policy challenge: generative AI has dramatically lowered the barrier to producing photorealistic sexualized imagery of real people without their consent. What previously required technical expertise and significant effort can now be accomplished in seconds with widely available tools.
For platforms like X, moderating this content presents several compounding challenges:
- Volume: The scale of AI-generated NCII content far exceeds human moderation capacity
- Detection: AI-generated images are increasingly indistinguishable from photographs without forensic analysis
- Speed: Content can go viral before moderation systems identify and remove it
- Policy gaps: Many platforms' content policies were written before this category of harm existed at scale
Regulators across the EU, UK, Australia, and the United States are actively developing legal frameworks to hold both AI tool providers and hosting platforms accountable for the downstream harms of AI-generated sexualized content.
## What Happens Next
With Musk and Yaccarino absent from the April 20 interviews, French authorities are likely to consider their next steps, which could include:
- Issuing formal judicial summons through France's examining magistrate system — these carry legal obligations to appear
- Pursuing a formal complaint against X as a legal entity under French or EU law
- Coordinating with the European Commission's DSA enforcement team for cross-border action
- Requesting mutual legal assistance if formal charges require testimony from US-based executives
The case will be closely watched as a test of whether EU member states can effectively assert jurisdiction over the conduct of large social media platforms — and their billionaire owners — in the era of AI-generated harmful content.