Kissed by a Twister
{January 29, 2016}   New Year’s Detox Blast


{January 23, 2016}   Galaxy Slime



40 Clean Eating Recipes in 30 minutes

Nom Nom Nom



{January 19, 2016}   Creative Ideas



{January 19, 2016}   The Workbox 2.0



13 Easy and Useful Curling Iron Hacks Every Girl Should Know




Teach Kids Chemistry With This Homemade Periodic Table Battleship Game




{January 16, 2016}   How to draw perfect feet



{January 16, 2016}   Amazing Hairstyles!



How Facebook spreads falsehoods and paranoia

By Cass Sunstein, Bloomberg View

Why does misinformation spread so quickly on social media? Why doesn’t it get corrected? When the truth is so easy to find, why do people accept falsehoods?

A new study focusing on Facebook users provides strong evidence that the explanation is confirmation bias: people’s tendency to seek out information that confirms their beliefs, and to ignore contrary information.

Confirmation bias turns out to play a pivotal role in the creation of online echo chambers. This finding bears on a wide range of issues, including the current presidential campaign, the acceptance of conspiracy theories and competing positions in international disputes.

The new study, led by Michela Del Vicario of Italy’s Laboratory of Computational Social Science, explores the behavior of Facebook users from 2010 to 2014. One of the study’s goals was to test a question that continues to be sharply disputed: When people are online, do they encounter opposing views, or do they create the virtual equivalent of gated communities?

Del Vicario and her coauthors explored how Facebook users spread conspiracy theories (using 32 public web pages); science news (using 35 such pages); and “trolls,” which intentionally spread false information (using two web pages). Their data set is massive: It covers all Facebook posts during the five-year period. They explored which Facebook users linked to one or more of the 69 web pages, and whether they learned about those links from their Facebook friends.

In sum, the researchers find a lot of communities of like-minded people. Even if they are baseless, conspiracy theories spread rapidly within such communities.

More generally, Facebook users tended to choose and share stories containing messages they accept, and to neglect those they reject. If a story fits with what people already believe, they are far more likely to be interested in it and thus to spread it.

As Del Vicario and her coauthors put it, “users mostly tend to select and share content according to a specific narrative and to ignore the rest.” On Facebook, the result is the formation of a lot of “homogeneous, polarized clusters.” Within those clusters, new information moves quickly among friends (often in just a few hours).

The consequence is the “proliferation of biased narratives fomented by unsubstantiated rumors, mistrust, and paranoia.” And while the study focuses on Facebook users, there is little doubt that something similar happens on other social media, such as Twitter — and in the real world as well.

Striking though their findings are, Del Vicario and her coauthors do not mention the important phenomenon of “group polarization,” which means that when like-minded people speak with one another, they tend to end up thinking a more extreme version of what they originally believed.

Whenever people spread misinformation within homogeneous clusters, they also intensify one another’s commitment to that misinformation.

Of the various explanations for group polarization, the most relevant involves a potentially insidious effect of confirmation itself. Once people discover that others agree with them, they become more confident — and then more extreme.

In that sense, confirmation bias is self-reinforcing, producing a vicious spiral. If people begin with a certain belief, and find information that confirms it, they will intensify their commitment to that very belief, thus strengthening their bias.

Suppose, for example, that you think an increase in the minimum wage is a sensational idea, that the nuclear deal with Iran is a mistake, that Obamacare is working well, that Donald Trump would be a fine president, or that the problem of climate change is greatly overstated.

Arriving at these judgments on your own, you might well hold them tentatively. But after you learn that a lot of people agree with you, you are likely to end up with much greater certainty — and perhaps real disdain for people who do not see things as you do.

On the basis of all the clustering, that almost certainly happened on Facebook. Strong support for this conclusion comes from research from the same academic team, which finds that on Facebook, efforts to debunk false beliefs are typically ignored — and when people pay attention to them, they often strengthen their commitment to the debunked beliefs.

Can anything be done? The best solution is to promote a culture of humility and openness. Some people, and some communities, hold their own views tentatively; they are interested in refutation, not just confirmation.

In the midst of World War II, a great federal judge, Learned Hand, said that the spirit of liberty is “that spirit which is not too sure that it is right.” Users of social media are certainly exercising their liberty.

But there is a real risk that when they fall prey to confirmation bias, they end up compromising liberty’s spirit — and dead wrong to boot.

Cass Sunstein, a Bloomberg View columnist, is director of the Harvard Law School’s program on behavioral economics and public policy.

Reference:

Sunstein, C. (2016, January 8). How Facebook spreads falsehoods and paranoia. Bradenton Herald. Retrieved January 8, 2016, from http://www.bradenton.com/opinion/national-opinions/article53729070.html


