Doubtful News would be useless if people weren’t interested in finding out what might or might not be true and correct. This site appeals to people who are willing to listen to a narrative framed in that way. But, SURPRISE (not), most people consult the internet guided by beliefs they hold dear – whether those beliefs tend towards assuming that scientific consensus will give us the best answers or that the government is covering up the truth.
A recent study in the Proceedings of the National Academy of Sciences by Del Vicario, et al. confirmed what we already suspected – people exist in online “echo chambers” where they share information that supports their existing biases. The researchers looked at 67 public Facebook pages focused on science news, conspiracy theories, and “troll” sites (those that are deliberately sarcastic and mocking) to see how content diffused from the initial online posting and how this related to communities that shared similar ideas. The paper calls this “cascade dynamics,” and cascade lifetime is measured in hours.
From the significance statement:
The wide availability of user-provided content in online social media facilitates the aggregation of people around common interests, worldviews, and narratives. However, the World Wide Web is a fruitful environment for the massive diffusion of unverified rumors. In this work, using a massive quantitative analysis of Facebook, we show that information related to distinct narratives – conspiracy theories and scientific news – generates homogeneous and polarized communities (i.e., echo chambers) having similar information consumption patterns. Then, we derive a data-driven percolation model of rumor spreading that demonstrates that homogeneity and polarization are the main determinants for predicting cascades size.
The data show that sharing peaks at 1–2 hours after the first post, with a second peak at ~20 hours. This pattern appears in both categories, so it may be information we can generalize. For my uses, I take from this that getting information out related to the story is critical in the first hour after it reaches social media (a near impossible task for anyone except a dedicated news outlet). Science news was found to spread more quickly but then dropped off and stabilized, whereas conspiracy cascades climbed a more gradual curve upwards and became more popular through time. What this tells me is that science news is consumed (and digested and used) faster, but conspiracy ideas grow and get passed along slowly and steadily. The graph ends at 400 hours (about 16 days), so we don’t know whether these shared posts continued to propagate. And we don’t know from this study whether they gave birth to new posts.
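The study’s actual percolation model is data-driven and far richer than anything I can reproduce here, but the core idea – that homogeneity drives cascade size – can be illustrated with a toy simulation. This is only a loose sketch under made-up parameters (a rumor only “takes” when it lands on a like-minded friend), not the authors’ model:

```python
import random

def simulate_cascade(n_users, n_friends, homophily, transmit_p, seed=None):
    """Toy percolation sketch: a rumor spreads outward from one seed user.
    Each sharer exposes a few random friends; a friend re-shares only if
    they happen to be like-minded (probability = homophily) and the share
    'takes' (probability = transmit_p). Returns the cascade size."""
    rng = random.Random(seed)
    shared = {0}        # user 0 makes the first post
    frontier = [0]
    while frontier:
        user = frontier.pop()
        for _ in range(n_friends):          # expose random friends
            friend = rng.randrange(n_users)
            if friend in shared:
                continue
            like_minded = rng.random() < homophily
            if like_minded and rng.random() < transmit_p:
                shared.add(friend)
                frontier.append(friend)
    return len(shared)

# Averaged over repeated runs: in a homogeneous "echo chamber" most
# neighbors are like-minded, so the rumor percolates much further than
# in a mixed audience, even with identical transmissibility.
echo = sum(simulate_cascade(5000, 5, 0.9, 0.3, seed=s) for s in range(50)) / 50
mixed = sum(simulate_cascade(5000, 5, 0.3, 0.3, seed=s) for s in range(50)) / 50
print(echo, mixed)
```

The point of the sketch is the threshold effect: with a like-minded audience the expected number of new sharers per sharer rises above one and cascades can take off, while in a mixed audience the same rumor fizzles after a handful of shares.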
An article on the study by the Washington Post contained some additional quotes from experts that prompted thoughts from me about promoting corrections, including the quote, “Continued preaching to the choir is not going to work.” That is, a Facebook page with a science focus does not extend its reach past that “choir”. The same almost certainly occurs with pages that promote critical thinking and debunking of false claims.
Damn, it’s not going to be easy. Human nature is against it. As Robert Brulle of Drexel University was quoted:
Individuals want to maintain their self-identity and self-image. They’re not going to read something that challenges their values, their self-worth, their identity, their belief system.
Those of us who try to reach people with strong beliefs in conspiracies, the paranormal, or alternative treatments know exactly how this goes. Changing minds is near impossible. Therefore, this study didn’t tell us anything we didn’t already know – confirmation bias rules.
How to reach the largest audience, outside of the “choir”, has always been a priority of DN, but we acknowledge we are hamstrung by our decision to reject sensationalizing and prioritizing clicks over content. We don’t play games with the reader, so we are necessarily less entertaining to the average site viewer than those sites that provide content based on apocalyptic destruction, social or political outrage, or mystery mongering. As we’ve noted before, people tend to share the stories that are gruesome, shocking, and ridiculous even though, if asked, they may SUSPECT that they are lies. They will also share those stories that fit in with their preferred worldview, not ones that poke holes in topics they are invested in. In fact, try that in the echo chamber and you will get booted out right quick.
Additional research cited in the Del Vicario, et al. study tells us that mere exposure to unsubstantiated rumors increases the likelihood that you will give them credence. Facebook has cornered the market on that process. False beliefs, once you have them in your head, even if you don’t recall where they came from, are extremely difficult to discard. The best we can do is try to gradually change thinking and eventually replace the misinformation with better stuff. We may be able to do that by considering the framing of the story to appeal to people to change THEIR OWN minds.
We will always have fringe beliefs. Counteracting misinformation spread and false beliefs will require a shift in presenting these ideas, not just a tweak to what exists. It also will likely require a shift in online behavior. How do you slow down sharing, increase critical thinking, and avoid the backfire effect?
How to catalyze the needed shift is a conundrum. Ideas welcomed.