Getting Rid Of The Bullsh*t On Your Facebook Feed

Taking command of your news feed on Facebook.

Facebook has been under fire amid accusations that its news feed may have affected the outcome of this year’s election. CEO Mark Zuckerberg has batted back the accusations as “a pretty crazy idea.”
The case against Facebook points to the way it presents articles in its “news” feed: visually identical whether the source is a reputable media outlet or a week-old blog run by a teenager in Macedonia. Stories that are satire or outright lies are presented the same way as articles written by real journalists using facts.

Facebook has always run the risk of becoming an echo chamber, a place where users surround themselves with others who share their point of view. We pick our friends and the news we like – even the outlets that lie to us. In most cases, we unfriend those who disagree with us and are left, in the end, with our little petri dish of ignorance. And while Facebook may encourage its users to keep an open mind by seeking out posts that don’t appear in their feed, it doesn’t make that process easy. We see only a small fraction of what any individual posts, all determined by an algorithm that’s more about marketing than about your interests. You can take control of what you see with some effort, but who wants to put effort into anything?

The “Wall Street Journal” recently compared posts from users with opposing political views. It created two feeds, a “blue” one and a “red” one, and displayed them side by side, showing the conversation from a different, and usually opposite, perspective.

According to the WSJ:

If a source appears in the red feed, a majority of the articles shared from the source were classified as ‘very conservatively aligned’ in a large 2015 Facebook study. For the blue feed, a majority of each source’s articles aligned ‘very liberal.’

The results, which are fascinating, can be found on “Blue Feed, Red Feed,” where you can compare for yourself.

Zuckerberg defended the site and its news feed, saying:

I think the idea that fake news on Facebook—of which it’s a very small amount of the content—influenced the election in any way is a pretty crazy idea.

Apparently, after six days to reflect on that theory, both Facebook and Google have taken a small step by taking aim at fake news sites, or at least at their revenue.

According to the “New York Times”:

 Google kicked off the action on Monday afternoon when the Silicon Valley search giant said it would ban websites that peddle fake news from using its online advertising service. Hours later, Facebook, the social network, updated the language in its Facebook Audience Network policy, which already says it will not display ads in sites that show misleading or illegal content, to include fake news sites.

While neither of the companies will ban, block or slow down traffic to these sites, they are going to make an effort to not reward them with advertising revenue.

That doesn’t mean those sites won’t be out there, however. It simply means that Google and Facebook won’t be paying them for advertising. There are other ways for those sites to make money and still use Google and Facebook as a means to garner traffic and readers. There are other advertisers, and there are ways for marketers and site owners to use the search algorithms to their advantage.

From the same NYT piece:

 On Sunday, the site Mediaite reported that the top result on a Google search for “final election vote count 2016” was a link to a story on a website called 70News that wrongly stated that Mr. Trump, who won the Electoral College, was ahead of his Democratic challenger, Hillary Clinton, in the popular vote.

With nearly 60 percent of Americans getting their news through social media, according to Pew Research, this is a significant issue that isn’t going away anytime soon.

“If you are in an echo chamber where you’re only speaking to like-minded people, it tends to heighten the extreme voices and marginalize the moderate ones,” WNYC’s Brooke Gladstone says. “It’s creating an atmosphere that’s incredibly intolerant online.”

We all live in a bubble, and we should all make an attempt to pop it. The risk is Donald Trump as president, and we’re already too late as far as that goes.

You can start by informing yourself and getting the facts from non-partisan, unbiased sources.

Check out the latest installment from the We the Voters web video series, where several top media critics give advice on how to tackle this problem.

A recent article from “Mother Jones” offers these tips:

  • Try reading something you violently disagree with once a week or checking a news source that’s completely different from what you usually read.
  • Find “deputy curators” who are experts in areas that you care about and see what they’re suggesting you read. Click a little further to read sources that aren’t your usual go-to’s.
  • Go past the headlines to read the full story.
  • Think before you retweet—is it worth it to amplify an extreme or hateful voice?

There are also journalism nonprofits you can support, such as ProPublica, Mother Jones, The Marshall Project, The Hechinger Report, The Trace, and The Center for Investigative Reporting, for starters.

And be sure to keep an eye on sites like Fake News Watch and others like it that are acting as internet watchdogs.

While Facebook and Google won’t warn you that a site is questionable, fake, or outright making things up, the Chrome extension “B.S. Detector,” released on Tuesday, claims to identify and flag fake news online. The database it uses seems rather sparse for now; hopefully its maintainers will keep adding to it as time goes on.

The extension claims to identify articles on Facebook that are from a “questionable source.” The warning appears when users scroll over the article on a feed.
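The core idea behind a flagger like this is simple: check each link’s domain against a list of known questionable sources. Here’s a minimal sketch in JavaScript of how that check might work. The domain list and function name are hypothetical, not taken from the actual extension’s code:

```javascript
// Hypothetical list of flagged domains; the real extension ships its own database.
const FLAGGED_DOMAINS = new Set([
  "70news.wordpress.com",
  "example-fake-news.com",
]);

// Return true if the link's hostname, or any parent domain, is on the list.
function isQuestionableSource(url) {
  let hostname;
  try {
    hostname = new URL(url).hostname.replace(/^www\./, "");
  } catch (e) {
    return false; // not a valid URL, nothing to flag
  }
  // Check the hostname and each parent domain (e.g. sub.site.com -> site.com).
  const parts = hostname.split(".");
  for (let i = 0; i < parts.length - 1; i++) {
    if (FLAGGED_DOMAINS.has(parts.slice(i).join("."))) return true;
  }
  return false;
}
```

In a real extension, a content script would run a check like this over every outbound link in the feed and attach a warning element when the user hovers over a flagged article.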

“I built this in about an hour yesterday after reading [Mark Zuckerberg’s] BS about not being able to flag fake news sites. Of course you can. It just takes having a spine to call out nonsense,” Daniel Sieradski wrote. “This is just a proof of concept at this point, but it works well enough.”

The extension is a few days old and was written in an hour, so who knows how effective it’s going to be. If you install it, let us know in the comments how it works.

Listen to Richard Zombeck (the author of this article) & Tony Trupiano on the T&Z Talk podcast.

Richard Zombeck

Richard Zombeck is a freelance writer, featured blogger at Huffington Post, and co-host of the T&Z Talk Podcast.

He’s much older and angrier than he looks.

You can listen to his podcast, follow him on Twitter, and find him on Facebook.
