
Saturday, June 18, 2016

Why Facebook Is A Hotbed For Conspiracy Theories

Facebook provides a near-perfect echo chamber for even the wildest conspiracy theories, a new study confirms



Facebook is a conspiracy theory hotbed. There’s The Conspiracy Archives, a community 200,000 strong united in their hatred of President Obama and rampant 9/11 skepticism. There’s CancerTruth, home of 250,000 anti-vaxxers and a host of alternative cancer “therapies.” Now, a forthcoming study explores conspiracy theories on Facebook and tries to explain the phenomenon scientifically—social media is an echo chamber where users come together to rally around their own opinions, rather than explore outside ideas or acquire new ones.

“Users show a tendency to seek out and receive information that strengthens their preferred narrative…and to reject information that undermines it,” the authors write. “Confirmation bias operates to create a kind of cognitive inoculation.”

Although we tend to think of conspiracy theorists as isolated loners with fringe opinions, most rely on networks of like-minded skeptics for support and reinforcement. One of the main draws of joining a coven of conspiracy theorists is what psychologists call confirmation bias: the tendency to seek out and accept information that confirms what you already believe. Inside such a group, you are virtually guaranteed conversations and ideas that reinforce your conspiratorial suspicions. For instance, GMO skeptics who believe that Monsanto is trying to give us cancer will find comfort in surrounding themselves with other GMO skeptics who constantly talk about the evils of Monsanto.
That’s where social media comes in. On Facebook, you can join a group of thousands of other conspiracy theorists who will chime in with the sort of wild ideas that match your own skepticism. At least, that’s the theory. To test whether social media indeed functions as an echo chamber that supports budding skeptics and reinforces their odd ideas, researchers surveyed several Facebook groups that peddled conspiracy theories.
And then they started messing with them.

Researchers joined one Italian group to share “troll posts,” intentionally satirical articles written in the style of earnest conspiracy theory writing. One post, for example, claimed that chemtrails (what conspiracy theorists call the condensation trails left behind by airplanes flying in certain weather conditions) contain Viagra. They also joined an American anti-vaccine group to drop in scientifically accurate information about vaccine safety. They then repeated this approach in the other groups, either posting ridiculous conspiracies to conspiracy groups or attempting to debunk them using science. In both scenarios, it quickly became clear that Facebook groups served as ideal echo chambers, and that confirmation bias was hard at work on social media.
“Intentionally false claims are accepted and shared,” the authors write, “while debunking information is mainly ignored.” That is, even the scientists’ wildest attempts at trolling were absorbed into regular conversation. Sure, chemtrails could have Viagra in them. Why not? As for the debunks, virtually every attempt was met with loud, angry reactions as the group ganged up on the researchers, who also found that users who engaged with debunking content actually became slightly more engaged with the conspiracy community. This tells us two things about Facebook: users jump on information that strengthens their narrative, and they reject any ideas that fly in the face of their general belief in grand conspiracies.
The findings reinforce prior studies showing that attempts to debunk committed skeptics rarely work, and perhaps now we know why. Conspiracy theorists don’t join social networks to learn; they join them to hear comforting reinforcement of their own ideas. And if you disrupt their happy place with scientific facts, you’re not going to get very far. The study also suggests that social media, especially Facebook, has provided a particularly fertile breeding ground for extreme ideas by giving once-isolated believers a forum to build large communities that echo the same strange beliefs.
“We provide empirical evidence that because they focus on their preferred narratives, users tend to assimilate only confirming claims and to ignore apparent refutations,” the authors write. “People are using Facebook to create enclaves of like-minded people, spreading information in strikingly similar ways.”
