Facebook CEO Mark Zuckerberg returned to Capitol Hill on Wednesday for the first time since April 2018, answering a litany of questions about Facebook’s digital currency project and how it balances freedom of expression with demands it prevent the spread of false information. One exchange, on its approach to the controversial anti-vaccination movement, underlined the many ways its strategy can get muddled.
The hearing, held by the U.S. House Committee on Financial Services, was billed as an opportunity for lawmakers to probe the company’s plan to launch a global digital currency, called Libra. The agenda derailed within minutes, however, when Chairwoman Maxine Waters (D-California) ripped into Zuckerberg for what she called his inability to adequately govern the platform he created.
“As I have examined Facebook’s various problems, I have come to the conclusion that it would be beneficial for all if Facebook concentrates on addressing its many existing deficiencies and failures before proceeding any further on the Libra project,” she said. Zuckerberg’s response: “While we debate these issues, the rest of the world isn’t waiting. China is moving quickly to launch similar ideas in the coming months.”
Waters’ opening remarks set the tone for the more than four hours of testimony that followed. Legislators questioned Facebook’s decision to continue running political ads containing false information and its failure to stop foreign governments from interfering on the platform. One revealing moment came from an outspoken anti-vaccination supporter, Congressman Bill Posey (R-Florida), who wanted assurance Facebook would “support users’ fair and open discussions and communications related to the risk as well as the benefits of vaccinations.”
“We do care deeply about giving people a voice and freedom of expression,” Zuckerberg said. “At the same time, we also hear consistently from our community that people want us to stop the spread of misinformation. So we try to focus on misinformation that has the potential to lead to physical or imminent harm, and that can include misleading health advice.”
Facebook has tried to tackle the spread of misinformation by lowering its value in News Feed and making it easier for users to report false posts. Independent third-party fact-checking organizations review those posts—if they determine a story is false, it will be flagged as disputed and accompanied by a link to an article explaining why. But Facebook fact-checkers have described the process as “playing a doomed game of whack-a-mole.” These approaches have been widely criticized for not doing enough to stamp out the spread of false information across the platform.
In 2014, the Centers for Disease Control estimated that vaccinations have prevented more than 21 million hospitalizations and 732,000 deaths among children born in the last 20 years. Scientists have yet to find any evidence for claims that vaccines can cause illnesses like autism. But anti-vaccine sentiment, which has flourished on Facebook and other social platforms, has led some parents to forgo vaccinations, leading to the rebound of some childhood diseases like measles. In March, Facebook rolled out a new policy on anti-vaccination content, including the decision to reject ads with false information.
Zuckerberg, who told the congressman that his “understanding of the scientific consensus” is that people should get their vaccines, said Facebook won’t stop its users from posting information that’s wrong.
“If someone wants to post anti-vaccination content, or if they want to join a group where people are discussing that content, we don’t prevent them from doing that. But we don’t go out of our way to make sure our group recommendation systems try to encourage people to join those groups.”
In other words, Facebook won’t prevent one of its 2 billion users from posting false information—it may not even flag it as wrong. The Facebook algorithm just won’t help it gain traction. If users can spread that information on their own, then, in Zuckerberg’s words, that’s “freedom of expression.”
I’m an associate editor at Forbes covering Facebook and social media. I previously worked as an editor for Popular Science, Gizmodo, and Mashable, leading investigations and spotting emerging trends. In 2016, I authored an investigative series that pried open the inner workings of Facebook’s Trending Topics and news operation, prompting a global debate on how the social network curated the news for its readers.