Mark Zuckerberg, beware! Researchers who examined the spread of misinformation on the social media platform Facebook during the COVID-19 pandemic found that the platform's design makes it unable to control lies.

New research published today in the prestigious journal Science Advances suggests that the COVID-19 vaccine misinformation policies of Facebook, still the world's largest social media platform, have not been effective in combating misinformation. The study, conducted by researchers at George Washington University (GW) in Washington, D.C., found that Facebook's efforts were undermined by fundamental design features of the platform itself.

The study is titled “The efficacy of Facebook’s vaccine misinformation policies and architecture during the COVID-19 pandemic.”

Online misinformation is dangerous because it fosters distrust in science and medical recommendations, undermines public health, and may even lead to civil unrest.

“There is a lot of interest today in the governance of social media platforms and AI. However, this discussion largely focuses on content or algorithms. To effectively address misinformation and other online harms, we need to go beyond content and algorithms and also focus on design and architecture,” said David Broniatowski, the study’s lead author. “Our findings show that removing content or changing algorithms can be ineffective if they don’t change what the platform is designed to do: enable community members to connect over shared concerns, in this case vaccine hesitancy, and find information that motivates them.”

Vials labeled with the Pfizer-BioNTech and Moderna coronavirus (COVID-19) vaccine are shown in this illustration taken on March 19, 2021. (Credit: REUTERS/DADO RUVIC/ILLUSTRATION/FILE PHOTO)

Facebook is designed to build communities around the things people care about. To do this, it uses several architectural features, including fan pages that promote brands and community celebrities, enabling a relatively small group of influencers to reach large audiences. These influencers can then form groups explicitly designed to build communities in which members can exchange information, including how to find misinformation or other persuasive content off-platform. Group members, and especially group admins (who are often also the pages’ content creators), can leverage Facebook’s newsfeed algorithms to make sure this information is available to those who want to see it.

Efforts to remove misinformation do little to stop it

The researchers found that although Facebook made a significant effort to remove anti-vaccine content during the COVID-19 pandemic, overall engagement with anti-vaccine content did not decline relative to previous trends; in some cases, it even increased.

This finding, that people were equally likely to share vaccine misinformation before and after Facebook’s massive takedown efforts, is incredibly troubling, said Lorien Abroms, a professor of public health at GW’s Milken Institute School of Public Health, who contributed to the study. “It shows how difficult it is for us as a society to remove health misinformation from public spaces.”

She added that within the content that was not removed, there was an increase in links to off-platform, low-credibility sites and to misinformation on “alternative” social media platforms such as Gab and Rumble, especially in anti-vaccine groups. “In addition, the anti-vaccine content remaining on Facebook became more, not less, misleading, containing sensationalist false claims about vaccine side effects that were often too new to be fact-checked in real time.”

The researchers emphasized that there was also “collateral damage”: pro-vaccine content may have been removed as a result of the platform’s policies, and vaccine-related content in general became more politically polarized. Additionally, producers of anti-vaccine content used the platform more effectively than producers of pro-vaccine content. Although both had large page networks, anti-vaccine content producers coordinated content delivery more effectively across pages, groups, and users’ news feeds.

Even as Facebook tweaked its algorithms and removed content and accounts to combat vaccine misinformation, the researchers say, the platform’s architecture pushed back.

“Think of Facebook’s architecture like a building. An airport is designed to direct people so they can get to and from gates easily and safely, and a stadium is designed to bring a large group of people together safely for a show,” Broniatowski said. “If these buildings were not designed to balance travel and entertainment against safety and security, people could be routinely harmed. Now think of Facebook’s structure the same way: it is designed to let motivated people build communities and easily exchange information about any topic.”

He pointed out that Facebook’s architecture sets such policies up to fail. “Individuals who are highly motivated to find and share anti-vaccine content are using the system the way it was designed to be used, and that makes it hard to balance those behaviors against public health or other public safety concerns. You have to change the architecture if you want to create that balance.”

The study suggests that social media platform designers could promote public health and safety by working together to develop a set of “building codes” for their platforms, informed by scientific evidence, to reduce online harms.

He explained that building engineers have to balance a building’s design goals against compliance with codes that protect the people who use it. “Buildings must be designed to promote public safety by complying with fire, flood, pollution, and earthquake codes. They must promote security by incorporating design features such as well-marked, accessible entrances, exits, and turnstiles to prevent vandalism, panic, terrorism, and riots. They must promote public health by complying with plumbing, sewage, and ventilation codes, and they must respect municipal codes such as noise and other nuisance ordinances. These codes are usually developed through partnerships between industry, government, and community organizations, and they are informed by science and established practice. Governance strategies could facilitate these partnerships and support that science.”

According to the researchers, theirs is the first and only scientific evaluation of the effectiveness of the world’s largest social media platform’s attempt to systematically remove misinformation and misleading accounts.


