Over the past two weeks, Republicans — and by Republicans I mean Donald Trump and his surrogates — have amplified claims that this election is being rigged. Rigged by whom, you ask? Well, just about everyone (who doesn’t support Trump).

The mainstream media? Just a bunch of liars colluding with the Hillary Clinton campaign to block Trump’s path to the White House. Want proof? “EXPOSED — Anti-Trump Media Caught Red-Handed Trying to LIE About Trump Campaign” on The Angry Patriot is a must-read.

The polls that show Trump losing? Totally illegitimate. Don’t believe me? Check out this article from The Common Sense Show called “All Polls Favoring Clinton’s Candidacy Are Fake — Here Is How To Tell.”

OK, let’s be real. There’s almost no evidence to suggest this election is actually being rigged, and those articles I referenced come from right-wing blogs, not legitimate sources of news.

But let’s be real about something else, too. “Legitimate” is a loaded word. In this context, you hopefully sensed my sarcasm and concluded that those weren’t reliable articles. Plus, if you’re a student at the University of Michigan, there’s less than a 10 percent chance you support Trump, and you may have read the first few paragraphs through your own skeptical lens.

Let’s change up the context. Imagine you’re seeing these headlines on Facebook. Except instead of two, there are 10. Some have been shared by conservative family members, some by Facebook pages you’ve at some point “liked” and some by Facebook itself through sponsored content.

The outlets are similar to the ones above. They’re not exactly The Wall Street Journal, but rather InfoWars.com, a source you would discount if you came across it through a Google search. But all of these articles are saying the exact same things: The election is rigged, the media can’t be trusted and the polls are bought and sold by liberals seeking to depress Republican voter turnout. We all learned the three-source rule in elementary school. If three different sources say something’s true, there’s a strong chance it is!

I’ve been picking on Team Trump, mostly because online interest — measured in Google searches — is higher on average for Trump than for Clinton. Social networks like Facebook infer your political preferences from the way you engage with political content on their sites and from third-party data about you. They use that information to target ads and filter content to match your preferences.

Back in May, Jon Keegan, a visual correspondent for The Wall Street Journal, published “Blue Feed, Red Feed,” a dynamic news story that simulates two Facebook news feeds — one conservative and one liberal — on topics like guns, abortion and the presidential debate.

Once upon a time, the majority of Americans got the majority of their news from, well, news outlets. Journalists served as the “gatekeepers” of public awareness, deciding what was worth reporting and how to report it. Reporters and editors made these decisions free from pressure to appeal to social media algorithms that would determine the success of their stories by showing them on users’ newsfeeds — or not.

Obviously, that’s all changed. The majority of American adults now get their news from social media. Facebook aggregates news stories from all over the web, using complex and ever-changing algorithms to determine which stories users read. In that sense, Facebook has assumed the gatekeeper role that professional news publications formerly dominated.

In some ways, this might be a positive. Today, consumers access a broader range of stories and news sources than they ever did in the past. In theory, this should have a democratizing effect on news media, allowing consumer demand to drive coverage at unprecedented levels.

But in practice, the way social media sites filter news stories actually narrows the types of stories — and more importantly, the range of political perspectives — users experience.

In June, Facebook changed its algorithm to give greater preference to content posted by the friends users most frequently engage with, effectively ensuring that friends’ posts wind up higher on users’ newsfeeds than news stories do. To the extent that our closest friends tend to be like-minded, this only reduces the likelihood that users will encounter shared news content or political posts that contradict their own views.
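To see the mechanics at work, here is a toy model of that kind of ranking change. This is an illustrative sketch only, not Facebook’s actual code; every author name, post type and weight below is invented for the example:

```python
# Toy model of a newsfeed ranker that boosts friends' posts over
# publisher content. All names and numbers here are hypothetical --
# this is NOT Facebook's algorithm, just an illustration of the idea.

def rank_feed(posts, engagement):
    """Order posts so frequently-engaged friends outrank news pages.

    posts: list of dicts with "author" and "type" ("friend" or "news")
    engagement: dict mapping author -> how often the user interacts
    """
    def score(post):
        base = engagement.get(post["author"], 0)
        # Hypothetical flat boost for friend posts over publisher posts
        boost = 10 if post["type"] == "friend" else 0
        return base + boost

    return sorted(posts, key=score, reverse=True)

feed = rank_feed(
    [
        {"author": "news_site", "type": "news"},
        {"author": "close_friend", "type": "friend"},
        {"author": "acquaintance", "type": "friend"},
    ],
    engagement={"news_site": 12, "close_friend": 8, "acquaintance": 1},
)
# Even though the news page has the most engagement, the close
# friend's post lands at the top of the feed.
```

The point of the sketch: once friend posts get a fixed boost, a news story needs disproportionately high engagement to reach the top of the feed at all.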

Between friends’ posts and targeted news articles, Facebook has become a political echo chamber that reinforces its users’ pre-existing beliefs. Over the course of the semester, just about everyone — including University President Mark Schlissel — has extolled the benefits of open dialogue and exploring opportunities for personal growth by encountering beliefs that challenge our own.

Is that possible when the lion’s share of the information we get about current events is carefully selected to confirm what we already think? I seriously doubt it.

As a general principle, humans prefer when new information confirms what they already believe. Psychologists call that confirmation bias. I call it Facebook’s advertising strategy.

While both Facebook users and shareholders might like this strategy in the short term, I highly doubt either group is prepared to deal with the consequences if online groupthink contributes to political polarization so severe that it threatens the stability of the U.S. government.

That may sound a bit dramatic. But according to Facebook, its 1.65 billion monthly active users spend an average of 50 minutes per day on its websites and mobile applications. That’s quite the network. Just by tweaking its secret algorithms, Facebook could easily tilt election outcomes — something Facebook employees are well aware of. The company would be well within its First Amendment rights to do so.

When people express concerns about the role of Facebook in politics, those concerns tend to center on the potential for Facebook to “rig” our elections. But let’s not pretend that Facebook isn’t impacting politics already. It is — and to the benefit of extreme, highly polarized candidates and policies. You know, the type we all claim to despise.

Victoria Noble can be reached at vjnoble@umich.edu.
