Sheryl Sandberg reveals the 2 lessons Facebook learned from the 2016 election, and how the company is dealing with Trump and misinformation during the coronavirus outbreak (FB)

  • Facebook COO Sheryl Sandberg says the company has learned two important lessons about misinformation since the 2016 election.
  • In a recent interview with Business Insider, Sandberg said Facebook has learned to deploy experts like the World Health Organization to define which posts are directly harmful to users. 
  • Sandberg also said Facebook has learned it has a responsibility to keep people safe, which is coming into play during the coronavirus outbreak. 
  • In the case of President Trump, who has made comments containing misinformation, Sandberg said Facebook is relying on the WHO to tell it when a post is "imminently harmful." 
  • "One of the things we learned the hard way is we shouldn't make judgments on truth," Sandberg said. "We are not global health experts. The WHO is, so we are relying on the WHO to tell us what information they believe will lead to imminent harm and that information comes down, and it comes down if anyone in the world says it."
  • Visit Business Insider's homepage for more stories.

It's been nearly four years since the 2016 election, meaning it's been nearly four years since a targeted campaign of misinformation spread like wildfire on Facebook's platform. 

In those four years, Facebook has learned two important lessons, according to the company's chief operating officer, Sheryl Sandberg.

In a recent interview with Business Insider Editor-in-Chief Alyson Shontell, Sandberg clarified Facebook's current policy on misinformation: When Facebook spots misinformation on the platform, it will mark the post as false, "dramatically reduce" its distribution, and provide the other side of the story in the form of related articles. In the event the post could lead to what Sandberg called "imminent harm," it's taken down entirely. 

"That was something we didn't do years ago, but one thing we learned is it's very hard to define 'imminent harm,'" Sandberg told Business Insider. "If you look at something like Myanmar, we didn't have voices on the ground. We didn't know what imminent harm was, and a post that would look pretty benign outside, if you understood the situation on the ground, absolutely was much more harmful."

Sandberg was referring to a campaign of Facebook posts that targeted Myanmar's population of Rohingya Muslims. Human rights experts said in 2018 that the propaganda was directly responsible for fueling violence against the Rohingya.

Sandberg said that in the face of the coronavirus outbreak, Facebook kept this lesson in mind, immediately turning to the World Health Organization for help.

"Right away, we knew we can't be making these decisions," Sandberg said. "Right away we went to the WHO and said to them, 'You tell us. Whatever you think is imminent harm comes down.' We're relying on third-party experts."

In the case of President Trump, who has made several comments in the last few weeks that are not grounded in science, such as appearing to call concerns over the coronavirus a hoax, Sandberg said Facebook's process is "very clear": any posts that the WHO believes will cause imminent harm will be taken down. Otherwise, those posts will stay up if they are deemed to be "part of the conversation." 

"One of the things we learned the hard way is we shouldn't make judgments on truth," Sandberg said. "We are not global health experts. The WHO is, so we are relying on the WHO to tell us what information they believe will lead to imminent harm and that information comes down, and it comes down if anyone in the world says it."

Facebook's efforts may be an uphill battle: about 50% of the posts on Facebook's newsfeed are now about the coronavirus, according to the Washington Post, and Facebook has put its 15,000 content moderators, who are third-party contractors, on paid leave for the time being. The company is currently relying on algorithms, as well as full-time employees in other roles, to decide whether content is harmful. 

A responsibility to keep people safe

The other lesson Facebook learned from 2016, Sandberg said, is that Facebook has a responsibility to keep people safe during the pandemic. 

"That's getting down harmful misinfo, but it's also proactively getting the right messages to the right people," Sandberg said. "WHO, CDC — we're putting a lot of messages at the top of the newsfeed."

She pointed to two recent examples: when the UK government wanted people to start staying home, Facebook put a message at the top of the newsfeed for UK users; and when the WHO wanted to encourage people to wash their hands more thoroughly, Sandberg and her fiancé, Tom Bernthal, were some of the first to make a video demonstrating the proper technique. Sandberg said she asked other users with large followings to make a video as well, including NBA player Steph Curry and tennis superstar Serena Williams. 

Beyond Sandberg's own work, Facebook CEO Mark Zuckerberg has been hosting regular videos with health experts like Dr. Anthony Fauci, the leading infectious disease expert in the US. 

"We want to be the place that people distribute information," Sandberg said. "Everyone in this crisis has a responsibility to do what they can do. We are really actively trying to get the right information to the right people."
