Nearly all last year, every time I would go on Facebook and scroll through my newsfeed, I would see dozens of articles I wanted to read. But as someone who is generally too busy for their own good, I often had to save articles for later. So, 99 percent of the time, I saved the link to the article in a note-taking app on my computer. Finally, this year, I sat down and started clicking through my list of nearly 150 articles that I had been meaning to read. And as overwhelming as it seemed in the moment, I quickly realized something as I browsed through the articles.

After reading the first twenty or so articles, I began to recognize the title or the cover art as that of an article I’d already read. And even when the title and cover art didn’t remind me I’d read the article, the contents would sound vaguely familiar, and I would eventually remember I’d already read that one. I put two and two together and realized what had happened: as the year went by, many of the same articles I had seen and saved popped up again, months later, when I’d forgotten about them, and, intrigued all over again (after all, it was the same article), I would save them to read a second time.

Last semester at the Daily, the staff talked extensively about how to increase our publication’s presence online, and one particular topic of discussion has stuck with me: Facebook and how its newest algorithm made it so users would only see posts that — based on their likes and interactions with accounts, articles and pages — Facebook believed they would be interested in.

While it can be nice not to have to sift through every single post from each one of my Facebook friends — especially when it had been so long that I only recognized some of my Facebook friends by name — seeing first-hand how Facebook filtered not only profile pictures, statuses and cover photos, but also articles, was especially alarming to me. Equally alarming is that the majority of Americans now get their news from social media, which makes the algorithms that determine what we see increasingly important.

While my Twitter presence is dismal at best, I have seen how Twitter can have the same effect. In March 2016, Twitter launched a new algorithm. Like Facebook’s, instead of showing posts from all the people you follow, it would now show only certain tweets based on what it thought were your preferences. There was significant backlash against the switch to an algorithmic timeline, prompting Twitter to give users the option to opt out in favor of the old timeline, where you saw every post in chronological order. Yet only about two percent of users took the time to actively change the settings back.

As technology has become more innovative and complex, so have programs that analyze what we “like” on Facebook, what we read, what we listen to, the places we shop online and the searches we put into Google. Though tailoring content can be useful at times, it can also be problematic.

Because of these algorithms, we are seeing more of the same type of articles and the same type of posts, and we aren’t exposing ourselves to new ideas or opinions. As I wrote in my most recent column, opinions won’t always change, but if we never hear the other side, we will never have the opportunity to challenge our own. We won’t have that chance to think and expand our worldview. Algorithms that show us what they think we want to see are helping to vastly shrink our world, not encouraging the exploration and openness that I argue are crucial in any society.

Politics aside, though, tailored content based on algorithms and computer programs is also problematic because it can, even subtly, reinforce stereotypes.

Recently, MLB.TV offered a half-off discount for Father’s Day. But women account for nearly one-third of those who watch sports such as baseball, basketball and football, to name only a few — at least when it comes down to the playoffs and postseason — so why not a Mother’s Day discount?

Furthermore, even though networks almost always have monetary gain as a main concern, by assuming certain audiences won’t be as interested, they could be missing out on a large market (I know the Father’s Day discount enticed me to purchase the rest of the MLB season even though I never would have at the regular price).

And though it’s a less pressing issue, with tailored content, companies and analysts assume people won’t find something interesting simply because they haven’t thought much about it before. Yeah, I like what I like, but that doesn’t mean I won’t be open to new content, and it doesn’t mean my past preferences are a surefire indicator of what I will like in the future.

Netflix, for example, has a new feature that tells users, based on previously watched movies and TV shows, the percent “match” a movie or show is for them. I recently saw Lee Daniels’ “The Butler,” which Netflix told me was a 70 percent match, but it was one of my favorite movies I’ve seen this year. In fact, two of my favorite movies could not be more different from each other. But had I listened to an algorithm or a matching program, I may never have seen them.

In fact, everyone from movie producers to car companies, journalists to sports teams, loses out when they are always trying to predict which audiences will be most likely to be interested in their content. I’m not saying placing Heineken ads on Disney Channel is a good idea or that putting an article about the stock market on an eleven-year-old’s Facebook page makes sense, but it wouldn’t hurt for everyone to expand, experiment and think outside the box that says this or that group won’t find the article, product or movie interesting.

In the last year, The New York Times morning briefings have started to include a section on partisan articles, an attempt to bridge the gap between people on different sides of the aisle. Companies and newspapers need to do what the Times is doing. They need to rethink how and where they disseminate information, what articles might help readers see and understand the other side of the aisle, how to promote a movie that Netflix’s matching system would say a viewer wouldn’t like, or how to court a potential market that analysts said a channel’s audience wouldn’t be interested in.

As for people like me, who aren’t making the advertising decisions and can’t control the algorithms, we should take the initiative: read articles that aren’t just the first ones to pop up on Facebook or Twitter, seek out information on topics we wouldn’t think we’d be interested in, watch movies whether or not they were recommended to us, branch out and try new products. It can be fun, not to mention important, to try new things.
