The Big, Obvious Reason Why Elon Musk’s Anti-Media Lawsuit Will Backfire
Musk's lawsuit against Media Matters could reveal embarrassing secrets about the company's handling of far-right content.
Elon Musk’s defamation lawsuit against Media Matters is shaping up to be one of the worst business decisions the Tesla CEO has made in a while—which is a kind of accomplishment, given his disastrous time at Twitter/X.
In the three days since Musk filed the suit in a U.S. District Court in Texas, legal experts have openly dismissed the challenge as an effort to silence the press and to deflect criticism of Musk’s behavior and acumen. But few have brushed off the pressure of the suit more completely than the man on the receiving end of it, Media Matters President Angelo Carusone.
In an interview with The Washington Post, Carusone said that if the lawsuit doesn’t get dismissed, the media watchdog will pursue discovery, the wide-ranging legal process by which evidence and information are exchanged between the parties and, once used in court, could become public record—or fodder for another Media Matters report.
Carusone said if it comes to that, Media Matters would seek communications regarding whether executives at the social media company “knew internally” about the failed safeguards meant to keep major brand advertisements from appearing alongside white supremacist, pro-Nazi content. Carusone also told the Post that the organization would seek other internal communications regarding Musk’s overt antisemitism on the platform.
Media Matters’s investigation revealed that X, formerly known as Twitter, was placing ads from reputable companies alongside antisemitic, pro-Nazi posts. The ensuing fallout resulted in the hemorrhaging of some of X’s biggest and most brand-safety-conscious advertisers, such as Apple, IBM, Disney, Lionsgate, and Paramount.
X claimed that the watchdog’s report misrepresented its algorithm, arguing that Media Matters had artificially manipulated the report’s results by following just 30 accounts on the platform and refreshing pages at a higher rate than average. Yet Carusone said that wasn’t the point of the investigation—instead, Media Matters proved that the safeguards X touts as preventing this from happening either don’t exist or are completely ineffective.
“The point that we’ve been making is that the filters that they say exist are not working the way that they claim,” Carusone told the Post. “Ads can and do run alongside extremist content.”