Auschwitz Memorial Excoriates Elon Musk's X for Holocaust Denial Content

The Nazi death camp memorial asks if X is engaged in “protection of hate speech.”

The Auschwitz Memorial has excoriated X for failing to act on Holocaust denial posts on the platform. On Wednesday last week, BBC reporter Shayan Sardarizadeh posted a screenshot of a Holocaust-denying post on X by far-right influencer Angelo John “Lucas” Gage.

Gage’s post casts doubt on the reality of the Holocaust, deploying a specious argument about photographs of those murdered in Nazi death camps.

A spokesperson for the Auschwitz Memorial emailed the Hill Reporter on Thursday about Gage’s post and more broadly about Holocaust denial posts that appear on X: 

“The only thing we can do on X is to report denial posts — as every other user. We do this whenever we encounter denial. In recent months we have noticed many cases when reporting a Holocaust denial post does not give results. ‘Violent event denial’ is officially forbidden on the platform.” 

Sardarizadeh wrote about Gage’s post: “There’s an overwhelming body of evidence about the Holocaust, including Nazi gas chambers and extermination camps, that anyone can easily access. But far-right influencers like Angelo John Gage don’t care about evidence. No amount of evidence will ever convince them.”

Gage’s post on X is followed by a user-sourced Community Note, headed “Readers added context they thought people might want to know,” with a link to concentration camp photos from the United States Holocaust Memorial Museum. The note says, “The Holocaust of WWII took place before the invention of smart phones, internet & social media. Nevertheless, there is ample photographic evidence (mostly taken AFTER liberation) that it did take place and that victims included minors.”

As of this writing, Gage’s post remains fully viewable on X and open to reposting. It has more than 222,000 views and has been reposted 1,600 times. X’s rules against “Violent Event Denial” prohibit denial of “events like the Holocaust.” Last August, X removed a Holocaust-denying post after the Auschwitz Memorial criticized it.

The full statement from the Auschwitz Memorial is available here.

Earlier this month, the Auschwitz Memorial posted screenshots of X users denying the Holocaust, along with X’s automated response email saying such posts didn’t violate X’s rules. The Memorial posted:

“The hatred grows slowly — from ideas, prejudice, and words of dehumanization. It’s worrying that such expressions of Holocaust #denial, #antisemitism, and hatred do not violate rules on X that ‘prohibit content that denies that mass murder or other mass casualty events took place.’ Holocaust denial is a dangerous & hideous carrier of antisemitism & hatred. Deniers hate. By distorting and rejecting facts, they insult the memory of the victims.”

In January, X owner Elon Musk visited Auschwitz after facing an uproar over antisemitic content on X, months after he endorsed an antisemitic conspiracy theory. “I’m aspirationally Jewish,” Musk said at the site of the Nazi death camp. “So I was like, ‘What are people talking about with this antisemitism?’ Because I never hear it at dinner conversations.”

Seth Abramson, a professor, attorney, and journalist, posted on X last week in response to the Auschwitz Memorial’s statement:

“The refusal of Elon Musk and Linda Yaccarino to abide by their own policies and take down Holocaust-denial content is a scandal — especially as Musk just visited a concentration camp as a photo op and is trying to make far-right political hay out of antisemitism in grotesque ways.”

The European Commission’s Investigation Into X Is Ongoing

In December, the European Union announced it was launching a formal investigation into X over potential violations of the bloc’s social media law, the Digital Services Act (DSA). Holocaust denial is a crime in several EU member states; the German penal code, for example, prohibits publicly denying the Holocaust and disseminating Nazi propaganda.

On Friday, a spokesperson for the European Commission confirmed in an email to the Hill Reporter that the EU’s investigation into X under the DSA is “still ongoing” and that one topic under investigation is the “risk of disseminating illegal content.”

The spokesperson also wrote, “if such [illegal] content (holocaust denial) is flagged to a platform, the platform would need to take it down expeditiously where it is illegal, hence at least geofence it,” and said the DSA requires “uniform enforcement” of the platform’s terms and conditions.

X does, presumably, have geofencing capability: the ability to block content based on a user’s location. A Twitter blog post from 2012 announced the geoblocking capability, citing countries that “restrict certain types of content, such as France or Germany, which ban pro-Nazi content.”
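Conceptually, that capability is straightforward. Below is a minimal sketch, in Python, of how country-based content withholding can work; every name in it is a hypothetical illustration for this article, not X’s actual implementation.

```python
# Minimal sketch of country-based content withholding (geofencing).
# All names here are hypothetical illustrations, not X's implementation.

# Jurisdictions where a category of content is restricted, e.g.
# pro-Nazi content and Holocaust denial in Germany and France.
WITHHELD_CATEGORIES = {
    "nazi_propaganda": {"DE", "FR"},
    "holocaust_denial": {"DE", "FR"},
}

def is_withheld(post_categories: set[str], viewer_country: str) -> bool:
    """Return True if a post should be hidden from a viewer in this country."""
    return any(
        viewer_country in WITHHELD_CATEGORIES.get(category, set())
        for category in post_categories
    )

# A post flagged as Holocaust denial would be withheld for a viewer
# in Germany but remain visible to a viewer in the United States.
assert is_withheld({"holocaust_denial"}, "DE") is True
assert is_withheld({"holocaust_denial"}, "US") is False
```

The hard part, as the EU’s comments suggest, is not the lookup itself but reliably flagging the content in the first place, which is exactly where the Auschwitz Memorial says X’s moderation is failing.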

Last August, X CEO Linda Yaccarino told CNBC: “So if you’re going to post something that’s illegal or against the law, you’re gone, zero tolerance.” She did not, however, address posts that are illegal only in Europe. How effectively X uses its geoblocking capability to restrict illegal content by location is unknown.

Free Speech v. Brand Safety for Advertisers

Yaccarino continues to post on X in defense of “free speech” on the platform. This week, X prevailed in an Australian court when a judge refused to extend an order blocking video of a bishop being stabbed in a church.

But advertisers have freedom of choice about where to place their ads. Last year, several large brands, including Disney, Apple, and IBM, suspended advertising on X after Musk backed an antisemitic conspiracy theory. Last month, carmaker Hyundai suspended its advertising on X after its ad appeared next to antisemitic posts from a user who has posted pro-Hitler content, as NBC News detailed, following my post with screenshots on X.

In November, in a live interview at The New York Times’ DealBook Summit, Musk told advertisers who had stopped advertising on the platform, “Go fuck yourself.” In February, the Hill Reporter cited comments from several X advertisers in its reporting, asking, “Does Brand Safety Exist for Advertisers on Elon Musk’s X?”

In April, a new report from the Center for Countering Digital Hate (CCDH), a nonprofit watchdog group, found that X is potentially profiting from a spike in anti-Jewish and anti-Muslim content exploiting the Israel-Gaza conflict, serving ads from brands like Oreo, the NBA, the FBI, and Elon Musk’s Starlink, and even from X itself, next to hateful posts.

Imran Ahmed, CEO of the CCDH, told the Hill Reporter, “the site is rife with hate and disinformation.” He said X has a “fundamental misunderstanding” of advertisers’ needs. “It’s not just about ad adjacencies. They [advertisers] don’t want to be in this environment,” he said.

Musk sued the CCDH last year in an attempt to muzzle the group’s research on hate and disinformation. But in March, the CCDH won dismissal of the “baseless and intimidatory” lawsuit.

In March, the nonprofit watchdog group Check My Ads announced that X “has lost its most coveted seals of approval in the ad industry, likely making it much harder to lure back advertisers who fled.” The organization reported:

“Late last summer, we learned that Trustworthy Accountability Group had quietly renewed X’s Brand Safety Certified seal — and immediately filed a formal complaint. Anyone who’s been on the platform since Elon Musk took it over knows the cesspool it’s turned into. Musk’s own antisemitic comments drove away advertisers like Disney, Lionsgate, and Apple.”

When Musk acquired Twitter in 2022, he tweeted an open letter to advertisers, saying “Twitter obviously cannot become a free-for-all hellscape.”

Last year, the Global Alliance for Responsible Media (GARM), an initiative of the World Federation of Advertisers, responded to Musk’s open letter with a “Dear Elon” open letter, posted on LinkedIn by GARM co-founder Rob Rakowitz, calling on Musk and Twitter to uphold the platform’s existing commitments on brand safety.

Stephan Loerke, CEO of the WFA, emailed the Hill Reporter on Monday, saying, “GARM is a cross-industry initiative to set voluntary industry standards and frameworks. X [is] still in GARM membership based on that it is a voluntary initiative.”

But Nandini Jammi, co-founder of Check My Ads, emailed the Hill Reporter, saying, “GARM charter very clearly requires members to commit to protect everyone from online harms, including ‘hate speech, bullying and disinformation.’ It has not been encouraging to watch [GARM Director] Rob Rakowitz writing public love letters to Elon Musk and turn the other cheek when Musk started bringing Nazis back to the platform over the last year. To me, it suggests the organization’s mission has been compromised.”

Earlier this month, Musk reinstated the X account of neo-Nazi Nick Fuentes. The white nationalist often praises Adolf Hitler and questions whether the Holocaust happened. The Texas Tribune reported that Fuentes “has called for a ‘holy war’ against Jews and compared the 6 million killed by the Nazis to cookies being baked in an oven.”

In March, Media Matters, another nonprofit watchdog that Musk has sued, reported: “Holocaust denier Nick Fuentes celebrates Candace Owens: ‘She has been in a full-fledged war against the Jews.’”

Board members of publicly traded corporations have a fiduciary duty to shareholders to protect the company from reputational harm. Among the public companies still advertising on X is entertainment giant Netflix ($NFLX), which resumed its ads in December after halting nearly $3 million worth, as The New York Times reported in November. No one from Netflix has responded to inquiries.

In January, in a seeming effort to win back advertisers, X announced plans to hire 100 content moderators for a new Trust and Safety center in Austin, Texas. Bloomberg reported that the center would focus primarily on child sexual exploitation content. In April, X announced two appointments: Kylie McRoberts was promoted internally to head of safety, and Yale Cohen was hired as head of brand safety.

But the Auschwitz Memorial wrote last week that X’s failure to act on Holocaust denial posts “demonstrates that the moderation system does not work properly. In the post in question [by Lucas Gage], we also reported it and we did not even get a confirmation that they ‘received our case’ and will ‘investigate.’ The question remains — is it a sign of a broken system, or something much more dangerous and worrying — protection of hate speech?”

No one from X has responded to inquiries.