Calls for social media platforms to remove misleading content – such as false claims about vaccines, climate change and 5G technology – should be rejected, according to the Royal Society, the UK's national academy of sciences.
After researching the sources and impact of online misinformation, the Royal Society concluded that removing false claims and offensive accounts would do little to limit their harmful effects. Instead, bans could drive misinformation "to harder-to-address corners of the internet and exacerbate feelings of distrust in authorities", the report said.
In the UK, there have been calls from across the political spectrum for Twitter, Facebook and other platforms to remove anti-vaccine posts. However, "clamping down on claims outside the scientific consensus may seem desirable, but it can hamper the scientific process and force genuinely malicious content underground", said Frank Kelly, a professor of mathematics at Cambridge University who chaired the Royal Society's investigation.
He added that removing content and banning users from mainstream platforms makes it harder for scientists to engage with people such as anti-vaxxers. "A more nuanced, sustainable and focused approach is needed," he said.
While illegal content that incites violence, racism or the sexual abuse of children must be removed, legal material that runs counter to the scientific consensus should not be banned, the report said. Instead, there should be broader action to "build collective resilience" so that people can detect and respond to harmful misinformation themselves.
"We need new strategies to ensure that high-quality information can compete in the online attention economy," said Gina Neff, a professor of technology and society at Oxford University and a co-author of the report. "This means investing in lifelong information-literacy programmes, technologies that establish the provenance of information, and mechanisms for sharing data between platforms and researchers."
A well-informed majority can act as a "collective intelligence" that guards against misinformation and calls out inaccuracies when they are encountered, said Sir Nigel Shadbolt, executive chair of the UK's Open Data Institute and another co-author. "Many eyes can provide a powerful check on content, as we see on Wikipedia," he added.
Some fears about the spread of misinformation online – such as the existence of "echo chambers" and "filter bubbles" that lead people to encounter only information that reinforces their existing beliefs – have been exaggerated, the report said.
Although the internet has led to a huge proliferation of information of all kinds, the vast majority of people in the UK hold attitudes close to those of mainstream science, according to a YouGov poll commissioned for the report. Of the 2,000 participants, 7 per cent agreed that the BioNTech/Pfizer Covid vaccine is unsafe and 11 per cent said the same of the Oxford/AstraZeneca jab, while 90 per cent said human activity is changing the climate.
Opponents of vaccination will eventually have to confront evidence that their opposition to Covid jabs is wrong, Shadbolt said: "A large natural experiment on the effectiveness and safety of vaccination is under way, and that is the best evidence we have. For [anti-vaxxers] the evidence is not good."