But while he can get away with shrugging off criticism, the CEO of Meta Platforms Inc, Facebook’s parent company, must reassess his priorities over the coming months as the US approaches potentially tumultuous midterm elections. He needs to turn his attention back to Facebook, or risk a proliferation of misleading videos about election fraud, which could once again disrupt the democratic process.
Zuckerberg could start by doing what countless managers have done before him: reassessing his priorities.
The metaverse is still in its infancy: While Facebook has about 3 billion active users, Horizon Worlds, the VR platform that serves as the basis for the metaverse experience, has only 200,000, according to internal documents revealed by the Wall Street Journal.
Zuckerberg himself has been outspoken in saying that the metaverse won’t be fully realized for five years or more. All the more reason, then, that his pet project can afford to lose his attention for a few months, or at least during democracy’s most crucial moments.
So far he hasn’t shown any sign of shifting his focus. Facebook’s core election team no longer reports directly to Zuckerberg as it did in 2020, when he made that year’s US election a top priority, according to the New York Times.
The company has also eased the reins on the executives tasked with dealing with election disinformation. Nick Clegg, head of global affairs, now splits his time between the UK and Silicon Valley, and Guy Rosen, the company’s head of information security, has moved to Israel, a company spokesperson confirmed via email.
Researchers who track misinformation on social media say there is little evidence that Facebook is better at stopping conspiracy theories now than it was in 2020. According to Melanie Smith, who heads disinformation research at the Institute for Strategic Dialogue, a London-based nonprofit, the company has improved data access for outside researchers trying to measure the prevalence of misleading posts, but anecdotally, misinformation is still spreading. Smith said she found groups on Facebook that were recruiting election observers for the apparent purpose of intimidating voters on Election Day.
She also pointed to a video posted by Florida Rep. Matt Gaetz on his Facebook page, in which he said the 2020 election had been stolen. The video had been viewed more than 40,000 times at the time of writing. Although it was published a month ago, it carries no warning label fact-checking its claims.
Smith also cited recent Facebook posts, shared hundreds of times, inviting people to events to discuss how “Chinese Communists” are conducting local elections in the United States, or posts stating that some politicians should “go to prison for their role in stolen elections.” She said posts from candidates tend to travel especially far.
Meta has said its main approach to dealing with such content during the 2022 midterms will be warning labels. But warning labels are not very effective. For more than 70% of misinformation posts on Facebook, labels are applied two or more days after publication, once the post has already had a chance to spread, according to a study by the Integrity Institute, a nonprofit research organization run by former employees of the big tech companies. Studies have shown that misinformation posts get 90% of their total engagement in less than a day.
The problem, at its root, is the way Facebook shows people the content most likely to keep them on the site, what whistleblower Frances Haugen called engagement-based ranking. A better approach might be “quality-based ranking,” similar to Google’s PageRank system, which favors consistently reliable sources of information, according to Jeff Allen, a former data scientist at Meta and co-founder of the Integrity Institute.
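To make the contrast concrete: PageRank-style quality ranking scores a source by how much other sources link to it, rather than by how much engagement its posts generate. The sketch below is purely illustrative, a toy power-iteration version of the general PageRank idea with a made-up link graph; it is not Meta’s, Google’s, or Allen’s actual system.

```python
# Toy PageRank via power iteration: score flows from linking pages to
# the pages they link to, so widely cited sources rank higher.
def pagerank(links, damping=0.85, iters=100):
    nodes = sorted({n for pair in links for n in pair})
    rank = {n: 1.0 / len(nodes) for n in nodes}
    out = {n: [t for s, t in links if s == n] for n in nodes}
    for _ in range(iters):
        new = {n: (1 - damping) / len(nodes) for n in nodes}
        for n in nodes:
            targets = out[n] or nodes  # dangling nodes spread evenly
            for t in targets:
                new[t] += damping * rank[n] / len(targets)
        rank = new
    return rank

# Hypothetical web: two outlets cite a wire service, which cites one back.
scores = pagerank([("outlet_a", "wire"), ("outlet_b", "wire"),
                   ("wire", "outlet_a")])
# The widely cited "wire" node ends up with the highest score.
```

The key design difference from engagement-based ranking is that a post’s reach here depends on the standing of its source in the citation graph, not on how many clicks or shares the post itself attracts.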
Facebook’s increasing focus on video only makes the problem worse. Citing a recent Integrity Institute study, Allen said that in September 2022 far more misinformation was shared via video than via regular Facebook posts. He added that false content generally gets more engagement than honest content, and so tends to be favored by an engagement-based system.
In 2020, Facebook deployed “break-glass” measures to counter a wave of posts claiming the election had been stolen by then-President-elect Joe Biden, a wave that eventually led to the storming of the US Capitol on January 6.
Meta should never have to resort to such drastic measures again. If Zuckerberg is serious about connecting people, and doing so responsibly, he needs to step out of his virtual-reality bubble and re-examine the ranking system that keeps eyeballs glued to Facebook’s content. At the very least, he can tell his staff and the public that he is once again prioritizing election integrity. The metaverse can wait.
More from Bloomberg Opinion:
If Musk’s Ownership of Twitter Is a Security Risk, What About Tesla?: Liam Denning
Musk Gutting Twitter May Be a Threat to All of Us: Tim Culpan
Zuckerberg’s $1,499 Headset Won’t Help Meta: Parmy Olson
(1) Allen’s study found that video content on Facebook in September 2022 had a “misinformation amplification factor” of 14, compared with 4.2 for regular posts.
This column does not necessarily reflect the opinion of the editorial staff or Bloomberg LP and its owners.
Parmy Olson is a Bloomberg Opinion columnist covering technology. A former reporter for the Wall Street Journal and Forbes, she is the author of “We Are Anonymous.”
More stories like this are available at bloomberg.com/opinion