US surgeon general on tech companies’ steps to fight Covid misinformation: ‘It’s not enough’
By Chandelis Duster, CNN
US Surgeon General Dr. Vivek Murthy on Sunday said social media platforms must recognize that they played “a major role” in the spread of misinformation about the Covid-19 pandemic, and that while they’ve taken some steps to fight back, those efforts haven’t been enough.
“Some of them have worked to try to, you know, up-promote accurate sources like the CDC and other medical sources. Others have tried to reduce the prevalence of false sources in search results,” Murthy told CNN’s Dana Bash on “State of the Union.” “But what I’ve also said to them, publicly and privately, is that it’s not enough. That we are still seeing a proliferation of misinformation online. And we know that health misinformation harms people’s health. It costs them their lives.”
Murthy’s comments come on the heels of his warning last week that such misinformation is “a serious threat to public health.” The Biden administration has recently turned up the heat on Facebook and other social media and technology companies amid increasing concern over misleading claims about coronavirus vaccines.
On Friday, President Joe Biden charged that social media platforms like Facebook are “killing people” with misinformation, saying that “the only pandemic we have is among the unvaccinated.”
A Facebook official told CNN that the White House is looking for “scapegoats” to blame for the country falling short of the administration’s goals. The White House had aimed to have 70% of adult Americans receive at least one Covid-19 vaccine shot, and 160 million Americans fully vaccinated, by July Fourth.
Murthy doubled down Sunday on the administration’s stance that the spread of health misinformation has played a key part in the slowdown in vaccinations.
“Here is a key thing to remember: Health misinformation takes away our freedom and our power to make decisions for us and for our families. And that’s a problem,” he said. “And the platforms have to recognize they played a major role in the increase in speed and scale with which misinformation is spreading.”
Lawmakers have criticized Facebook as failing to stop the spread of misleading claims and conspiracy theories, including debunked claims that mail-in voting is untrustworthy and the 2020 election results were illegitimate. Facebook CEO Mark Zuckerberg, who has testified multiple times before Congress over misinformation and related issues, has struggled to portray the platform as a safe space for users while taking an expansive view on free speech — a position that has allowed fringe claims and theories to take hold on the platform.
Minnesota Sen. Amy Klobuchar, a Democrat, on Sunday said action should be taken regarding vaccine misinformation.
“Social media has greatly contributed to this misinformation. There’s no doubt,” Klobuchar told Bash in a separate interview. “When we have a public health crisis and people are dying every day, enough is enough. These are the richest companies in the world. … There’s absolutely no reason they shouldn’t be able to monitor this better and take this crap off of their platforms that are basically telling people, ‘Oh, hey, there’s problems,’ when we know science proves there isn’t.”
In June, Washington, DC, Attorney General Karl Racine subpoenaed Facebook in a recently disclosed investigation into the social media giant’s handling of Covid-19 vaccine misinformation. The subpoena sought internal records showing how the company has dealt with Covid-related anti-vaccine content. It also called for Facebook to produce documents about all the groups, pages and accounts that have violated Facebook’s policies on the matter; information about how much Covid-19 vaccine misinformation Facebook has removed from its platform; and how much Covid-19 vaccine misinformation is being subjected to third-party fact-checking.
In early 2019, Facebook announced it would take action against vaccine misinformation after a measles outbreak that started in Washington state. Earlier this month, the company acknowledged it is testing prompts that notify users when they may have been exposed to extremist content on Facebook.
This story has been updated with additional details Sunday.
CNN’s Donald Judd, Maegan Vazquez, Donie O’Sullivan and Brian Fung contributed to this report.