Saturday, December 3, 2016

Here's Why Stamping Out Fake News Is a Lot Harder Than You Think

Yes, there are plenty of made-up stories out there, but they are readily checked by merely going to a mainstream source. We have simply become more aware of them. More to the point, such stories are targeted at an active audience least likely to be conned by them.
Are there any naive readers left?

In the end, it joins other sources of noise, and we need not be too exercised about it. These stories most certainly had nothing to do with Hillary's loss. She did that all by herself.

Here's Why Stamping Out Fake News Is a Lot Harder Than You Think

November 17, 2016, 1:55 PM EST

http://fortune.com/2016/11/17/fake-news-problem/

The rise of “fake” news, and the role that services like Facebook played in that rise, has become one of the defining issues of the 2016 election. Did hoaxes and misinformation help Donald Trump win? And if so, what—if anything—should Facebook do about it? Who is to blame?

What makes this issue so difficult is that firm answers to those questions are hard to come by, if not impossible. And finding solutions is not likely to get any easier.

Let’s take the first of those questions: Did fake news help Trump win? We simply don’t know. Some believe that false stories about Hillary Clinton murdering people or similar hoaxes definitely swayed the electorate, while others believe all these stories did was confirm biases that voters already had.

Those who argue that fake news did play a role point to evidence that these kinds of stories were hugely popular on Facebook, and were spread far more widely than “real” news stories.

A recent BuzzFeed investigation by Craig Silverman, for example, showed that the 20 most shared fake stories were far more popular with users than the top 20 real news stories from mainstream media outlets.


The top viral fake, entitled “Pope Francis Shocks the World, Endorses Donald Trump for President,” came from a site called Ending the Fed and was shared or interacted with on Facebook almost a million times. The top real news story from the Washington Post, which was about Donald Trump’s alleged history of corruption, fell well short of that mark.

As both New York magazine and journalism researcher Mark Bunting have pointed out—and Silverman has admitted—the BuzzFeed sample is a relatively small one, and it doesn’t mean that all fake news was more popular than all real news. It also doesn’t measure the true reach of either the fake or real news stories, because Facebook doesn’t make it easy to do that (which is a big part of the problem when it comes to determining the size of the fake news phenomenon).

Nevertheless, fake news is clearly spreading far and wide on Facebook, and the company’s response to it has been relatively weak. In effect, Facebook says it is concerned, but that it doesn’t believe it’s a big problem, and doesn’t think it affected the election.

As a counterpoint to that, the Washington Post recently spoke to someone who makes a business out of creating fake news. Paul Horner told the newspaper that he created dozens of fakes and saw them shared widely by Trump fans, including the presidential candidate’s campaign manager.

“My sites were picked up by Trump supporters all the time. I think Trump is in the White House because of me. His followers don’t fact-check anything—they’ll post everything, believe anything. His campaign manager posted my story about a protester getting paid $3,500 as fact.”

Since Horner essentially admitted that he lies for a living, it’s difficult to know how much credence to give his report on the impact of fake news. Did it really help get Trump elected?

Even if it’s true that fake news stories just confirmed what people already thought—a phenomenon that psychologists call “confirmation bias”—it’s still possible that having those beliefs confirmed every day on Facebook made people less likely to change their votes, or less likely to listen to competing arguments about a different candidate.

But when it comes to Facebook actually doing something about that, the issue gets even more complicated. For one thing, as journalist Jessica Lessin and others have pointed out, determining what is fake and what isn’t would give Facebook even more power and control over news content than it already has.


The biggest problem when it comes to solving this, however, comes down to human nature. The fact that a news story is shown to be untrue—either by being labeled a fake, or being fact-checked by mainstream news organizations—doesn’t seem to affect whether people choose to share or believe it.

New York University professor and author Clay Shirky said recently that “people trade untrue stories that encapsulate things they believe about the world all the time,” and there is a lot of truth to that. People don’t share news stories on Facebook because they are true or factual. They share them because they feel true, or because sharing them is a way of signaling membership in a specific cultural group.

As Walter Quattrociocchi, the author of a study on social echo chambers, put it in a recent interview with the New York Times about fake news and Facebook, this “creates an ecosystem in which the truth value of the information doesn’t matter. All that matters is whether the information fits in your narrative.”

Even the act of fact-checking a fake news story has been shown to actually reinforce the belief system of those who want it to be true. Mike Masnick of the technology commentary site Techdirt has said that he often gets responses from people claiming that Snopes.com—one of the Internet’s longest-surviving and most active fact-checking sites—is a liberal front.

This aspect of human nature is something that Facebook and other social networks have “weaponized,” as Josh Benton of the Nieman Journalism Lab put it recently.

So what can we do about this kind of “post-truth” environment? It’s clear that Facebook could take some kind of action to identify hoaxes and block fake sites—although it is not in the company’s economic interests to do so, as Shirky noted. But even that won’t be enough.

It’s possible that the only way to stop people from sharing fake news is to try to understand why they share it, and to spend some time getting at the root causes of that behavior. Unfortunately, that’s a much more time-consuming and difficult task than fact-checking a news story. And it’s something that the fragmented media industry of today isn’t particularly adept at doing.

Could Facebook help on that score? Could it find some way of introducing people to alternate viewpoints or promoting understanding rather than creating a cultural battleground? Only Facebook CEO Mark Zuckerberg knows the answer.

1 comment:

Chumgrinder said...

That's because snopes.com IS a liberal front. As a 35-year veteran of the firearms culture war, I not only know it, but I can prove it.

Their tactic is misdirection -- to spin an original true claim into a different claim, and then mark that one false on some technicality; or they'll take an overall true premise supported by four pieces of evidence, debunk the faulty one, never examine the other three, and mark the whole thing false.

Compare snopes' "debunking" at: http://www.snopes.com/doctors-kill-more-people-than-guns/
...with this Washington Post article supporting the truth of the topic: https://www.washingtonpost.com/news/to-your-health/wp/2016/05/03/researchers-medical-errors-now-third-leading-cause-of-death-in-united-states/

Why does snopes say it is "false"? Because it wasn't officially a "study," just some official federal statistics proving the same premise. Lame. But the reader sees what they want him to see, which is "false."