NBC senior reporter on all things internet, Brandy Zadrozny, joined MSNBC’s Stephanie Ruhle on Thursday’s edition of The 11th Hour to discuss Meta CEO Mark Zuckerberg’s decision to revise the company’s rules of what constitutes acceptable speech. Zadrozny was not there for subtlety or nuance as she suggested the door is now open for a Myanmar-style genocide.
Ruhle began by declaring, “The internet is your world. You don’t just cover this, you have faced extraordinary abuse online. Tell us what this is going to look like now that we’re not going to have guardrails.”
Zadrozny started by claiming conservative cries of censorship are false, “Well, it’s going to be worse for all the people it’s already bad for: women, minorities, immigrants, gay people, trans people, all the people who are already attacked online. Mark Zuckerberg and Joel Kaplan are now saying, ‘We’ve gone too far protecting you.’ Sometimes conservatives say they are censored, which is not true, according to the data and research. That is not actually happening. But they feel that it is, and their feelings are more important than your truth, so we are going to roll back the things that we have instituted to try to keep those users safe. We don’t care about those people.”
Zuckerberg himself has said the censorship was real, recalling the dreaded joke police who came for satirical articles. Nevertheless, Ruhle brought out the air quotes and wondered, “So, will this ‘community notes’ system do anything?”
Like everyone else who has defended the previous fact-checker program, Zadrozny could not bring herself to believe that the fact-checkers were unreliable guardians of the truth compared to partisan social media users, “It’s unclear how they are going to roll out community notes. Like, the fact is, it’s not really an effective tool on X, even though Zuckerberg said that is why we’re instituting it, because it worked so well on X. That’s just not true. Community notes take a really, really long time. Research suggests that those community notes are driven by partisan actors, and the most important thing, they are very, very, very slow.”
Zadrozny then got really crazy as she shifted back to concerns about hateful speech, “And incitement to violence happens very, very quickly. We’ve seen what happened in Myanmar, we have seen what happened with Stop the Steal. These things happen like lightning, and with whatever community notes thing they’ll roll out, what will happen, Zuckerberg said yesterday, ‘We’re no longer going to check those things automatically.’ Machine learning, when it comes to harassment and hate speech, no more. Instead we are going to rely on users to report that. I don’t know if you ever tried to report anything to Facebook. You’re not getting a response.”
By referencing Myanmar, Zadrozny is referring to the massacre and genocide of the Rohingya people that began in 2017. However, Facebook’s fiercest critics argue the problem was more related to the company’s algorithm.
Fortunately for everyone, there is nothing in Meta’s new guidelines that suggests calling for genocide or race massacres is acceptable conduct. As for dumping machine learning, Zuckerberg has said the system simply made too many mistakes.
Here is a transcript for the January 8 show:
MSNBC The 11th Hour with Stephanie Ruhle
1/8/2025
11:36 PM ET
STEPHANIE RUHLE: The internet is your world. You don’t just cover this, you have faced extraordinary abuse online. Tell us what this is going to look like now that we’re not going to have guardrails.
BRANDY ZADROZNY: Well, it’s going to be worse for all the people it’s already bad for: women, minorities, immigrants, gay people, trans people, all the people who are already attacked online. Mark Zuckerberg and Joel Kaplan are now saying, “We’ve gone too far protecting you.” Sometimes conservatives say they are censored, which is not true, according to the data and research. That is not actually happening. But they feel that it is, and their feelings are more important than your truth, so we are going to roll back the things that we have instituted to try to keep those users safe. We don’t care about those people.
RUHLE: So, will this “community notes” system do anything?
ZADROZNY: It’s unclear how they are going to roll out community notes. Like, the fact is, it’s not really an effective tool on X, even though Zuckerberg said that is why we’re instituting it, because it worked so well on X. That’s just not true. Community notes take a really, really long time. Research suggests that those community notes are driven by partisan actors, and the most important thing, they are very, very, very slow.
And incitement to violence happens very, very quickly. We’ve seen what happened in Myanmar, we have seen what happened with Stop the Steal. These things happen like lightning, and with whatever community notes thing they’ll roll out, what will happen, Zuckerberg said yesterday, “We’re no longer going to check those things automatically.” Machine learning, when it comes to harassment and hate speech, no more. Instead we are going to rely on users to report that. I don’t know if you ever tried to report anything to Facebook.
RUHLE: Sister, please.
ZADROZNY: You’re not getting a response.