Facebook CEO Mark Zuckerberg just announced he's giving the entire company the week of Thanksgiving off, and former Facebook security chief Alex Stamos somewhat agrees: after fighting the online misinformation threat to the 2020 elections, technology workers have earned a break.
But the break should be brief, Stamos thinks, because those workers soon need to turn their attention to what he sees as the next big social media disinformation battle: Covid-19 vaccine information.
The good news, from his point of view, is that the work technology companies and the government put into fighting misinformation ahead of Election Day can be transferred to the fight against Covid-19 vaccination lies. The mistake would be failing to make that transition fully, and quickly.
“A huge amount of work has gone into this election and we can’t let that work go to waste on Nov. 4 and no longer be making progress on disinformation,” Stamos, who now directs the Stanford Internet Observatory, said at the CNBC Technology Executive Council Summit this week. “And in the U.S., the most critical will be around Covid and vaccines, which we’ll start to see hopefully come out next year. The most important disinformation campaigns will be about Covid.”
Given the potential severity of the problem, news organizations need to help by getting headlines right and not unintentionally spreading misinformation, he said. He was referring to a tweet he recently sent that attracted attention for taking the Washington Post to task over a story about a person dying in a vaccine trial, which created confusion over the cause of death — the subject had been given a placebo, not the experimental vaccine.
“We need to allow scientists to do their jobs and measure the risk, and look at all of the details, and the vaccine issue has become a geostrategic issue,” Stamos said.
Several vaccine consortiums are tied to governments, including, for example, major companies in China backed by the Chinese Communist Party, which has been positioning its vaccine candidates as chess pieces in the battle for global influence. Russia has multiple vaccine projects underway, including one developed by a biotech company that was once a Soviet-era bioweapons laboratory.
“There could be a great amount of interest in saying other companies’ vaccines are bad,” Stamos said.
“We need the same kind of cooperation … to go into vaccine safety, and we already have a sub-culture in the U.S. very skeptical and will harass people who push vaccines,” Stamos said. “We’re in a very dangerous place,” he added, referring to the opportunity for a foreign adversary to use misinformation and more targeted propaganda and disinformation to threaten the health of the U.S.
Declining trust among Americans for a vaccine
In fact, recent Pew Research survey data shows there is reason to be concerned about vaccine distrust among a growing segment of the American public, not just a sub-culture.
A September report from Pew showed that Americans who say they would get vaccinated for the coronavirus declined by a significant amount over the course of 2020. Half of U.S. adults (51%) told Pew in September they would “definitely or probably” get a vaccine to prevent Covid-19 if it were available, but nearly as many (49%) say they definitely or probably would not get vaccinated. Overall intent to get a vaccine fell from 72% in May, a 21 percentage point drop. And the share who would “definitely” get a coronavirus vaccine dropped by half to 21%.
“Everyone at Facebook can take the day off after the election and then on Nov. 5, they need to get back to work at deploying the exact same responses we saw to election disinformation,” Stamos said, adding that a Covid war room is a necessity similar to the election war rooms that companies like Facebook have now.
Alexis Wichowski, Deputy CTO for Innovation in the New York City Mayor's Office of the CTO, who spoke on the CNBC TEC virtual summit with Stamos, said that while federal agencies have the largest reach, the current absence of trust in the federal government requires technology companies to engage with state and local governments as well. "The more local we get the better chance we have to combat vaccine disinformation," she said.
Stamos worries that while it is clear who is in charge of the election disinformation effort within the federal government — including the Department of Homeland Security's CISA unit, created after 2016, and the military's Cyber Command — there is no clear lead agency on Covid misinformation in Washington, D.C.
One advantage in fighting Covid-19 vaccine misinformation, compared with the 2020 election version, is that scientific claims are easier to label as fact or fiction than political speech.
“We have scientific experts with generally accepted truths they can reach,” he said.
But Stamos cautioned that even there, the issue is complicated. He cited the early days of the pandemic outbreak in March, when the CDC was not advising the public to wear masks, versus a "truly crazy idea" such as the claim that wearing masks increases the chances of getting Covid-19.
“It’s a fast-moving situation and while there are experts … the opinions of those experts change as research changes.”
Technology companies have policies in place to label misinformation, but applying them is not easy when there is no direct, fixed set of truths. As a society, we need to be careful about asking intermediaries to censor speech when the "absolute truth" in a situation is not yet well known.
“When you talk about vaccines … there will be very complicated, conflicting information and we need information centers equivalent to what we had running for the election,” Stamos said. “Facebook should set the goal of four million people getting vaccinated that wouldn’t otherwise, just like they registered four million,” he said.