The following statement will be read at Meta’s annual shareholder meeting on 5/28/2025:
“My name is Dani Nurick, and I am the Director of Advocacy at JLens, a network of Jewish investors. I’m here to present Proposal 8, which asks Meta to disclose an annual report assessing the effectiveness of its policies and practices in addressing hate speech—particularly antisemitic content and hate targeting other vulnerable communities, like the LGBTQ+ community and people with disabilities.
With over 3 billion daily users and a market cap of over one trillion dollars, Meta has outsized influence—and an equally outsized responsibility.
We’ve all seen the powerful ways social media can bring people together. But we’ve also seen it used to spread violent hate, radicalize users, and even livestream mass shootings, such as the mass casualty events in Christchurch, New Zealand, and Buffalo, New York. And we’ve seen it fuel a wave of antisemitism following the October 7th terrorist attacks in Israel.
In 2024, ADL research showed that 41% of Jewish adults reported altering their online behavior to avoid being recognized as Jewish. 49% of LGBTQ+ respondents and 45% of people with disabilities also experienced online harassment. And 61% of adults who were harassed experienced incidents on Meta’s Facebook, far more than on any other social media platform.
These spikes occurred when Meta’s content moderation policies were stronger. Yet since then, the Company has opted to scale back its content moderation policies, not strengthen them, including its January decision to replace fact-checking with user-generated “community notes.”
In April 2025, Meta’s own Oversight Board questioned whether the rollback of policies had gone too far, urging the company to assess the human rights impact of these changes.
This hands-off approach to content moderation is not only a moral failure; it poses serious business risk. Reputational damage, regulatory scrutiny, legal exposure, and a loss of user trust can all undermine platform engagement, shrink ad revenues, and erode shareholder value.
As it says in the Book of Exodus, “Do not follow the majority to do harm.” That verse, first directed at judges in ancient Israel, is a reminder that leadership demands moral courage. Proposal 8 calls on Meta to lead—not follow—and to act with integrity rather than expediency.
Meta may have hate speech policies on paper, but policies without proof of impact are not enough. Investors deserve transparency into whether those safeguards are effective.
Meta has the scale, the data, and the global influence to lead, but leadership demands accountability.
Supporting this proposal is a chance to restore trust with users and shareholders alike.
The risks are real. The harm is measurable. And the remedy—greater transparency—is clearly attainable. It’s not just good policy; it’s good governance, and it’s in shareholders’ best interest.
Thank you.”
PLEASE NOTE: THIS IS NOT A PROXY SOLICITATION AND NO PROXY CARDS WILL BE ACCEPTED. The Anti-Defamation League and JLens are not asking for your proxy card and cannot accept your proxy card. Please DO NOT send us your proxy card.