Here’s Why Facebook Should Collect Data on Our Political Leanings

Why you should care

Because maybe we need to be pushed to see the other side's arguments.

Facebook is aiming to play a defining role in influencing mass political thought in the run-up to 2020. We see this reflected in the launch of its news service, which curates stories for you. And with Twitter banning political ads, it makes even more business sense for Facebook to double down on its policy of not fact-checking political ads and eat into Twitter’s market share.

Combined, these efforts will ensure that Facebook is the only platform (along with maybe Instagram and WhatsApp in different capacities) where people get news and can talk about it publicly.

The concern with Facebook since 2016 has been filter bubbles, and the notion that people can interact on the social network in ways that perpetuate lies and conspiracies. But there is a simple solution: Facebook should collect more data about our political leanings to help us break out of filter bubbles.

For those who don’t follow the space, filter bubbles are the result of platforms showing you content by relevance. Platforms such as Facebook and Twitter arrange content based on what their algorithms think you might like to see next. So if you like, love or just spend more time on, say, a dog video, your feed will try to show you more dog videos. That’s how these platforms get people to spend more time on them. As Anupam Manur, a professor at the Takshashila Institution, explains, filter bubbles are a result of what is termed a “data network effect”: a platform becomes smarter through the data it collects from users, and the smarter the platform is, the better it serves its users, making sure they come back often and contribute more data, and so on.
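
To make the mechanism concrete, here is a toy sketch, purely illustrative and not Facebook’s actual ranking code, of how an engagement-driven feed turns into a feedback loop: whatever a user engages with first gets ranked higher, shown more, engaged with more and ranked higher still.

```python
# Illustrative sketch of a "relevance"-ranked feed and its feedback loop.
# The topics and scoring are hypothetical; this is not any platform's real code.
from collections import defaultdict


class ToyFeed:
    def __init__(self, topics):
        # Start with no preference data: every topic has the same baseline score.
        self.affinity = defaultdict(lambda: 1.0, {t: 1.0 for t in topics})

    def rank(self):
        # "Relevance" here is just the learned affinity score for each topic.
        return sorted(self.affinity, key=self.affinity.get, reverse=True)

    def record_engagement(self, topic, weight=1.0):
        # Likes, loves and time spent all feed back into the score.
        self.affinity[topic] += weight


feed = ToyFeed(["dog videos", "politics: left", "politics: right", "cooking"])
for _ in range(20):
    shown = feed.rank()[0]          # the top-ranked topic gets shown first...
    feed.record_engagement(shown)   # ...and the engagement it earns ranks it even higher

print(feed.rank())  # the topic the user engaged with first now dominates the feed
```

Swap “dog videos” for a political topic and the same loop produces a political filter bubble: the feed keeps serving the view the user already holds.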

The problem here is that this model of bubbles and data network effects also applies to political content. So Trump supporters are unlikely to see the plus sides of Bernie Sanders’ campaign and vice versa.

With Facebook News, it’s likely going to be a similar story, but worse. Facebook will curate the stories for you and personalize them according to your bubble. And because these stories will come from news sources that are more trusted than friends’ posts, people are likely to take them more seriously, further entrenching their beliefs. In a world where we already allow social media to form and shape our political opinions informally, formally getting our news from a social media platform is going to make things worse.

At best, filter bubbles are a cost the platforms themselves bear; at worst, they are an unpriced negative externality, a cost society carries that no one pays for. But with news, Facebook might have an opportunity to fix that. Unfortunately, the fix may begin with collecting more data … this time, about our political leanings.

Facebook already has that data to a great extent, and it uses it to arrange the content on your news feed. With Facebook News, it will likely ask for your data again by letting you choose how to personalize your content curation. Only this time, it would be a good chance to explicitly collect our political leanings. Of course, there are multiple reasons why this is a bad idea. For one, we need to trust Facebook not to sell that data and not to let it leak to malicious third parties. For another, we need to trust the platform not to use that data to incite conflict.

So why is it a great idea? Because it’s a chance to broaden everyone’s horizons. Using that political data, Facebook can and should still curate stories that reinforce your political views, but also simultaneously build a section of content that actively challenges your worldview.
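
What might that look like in practice? A hedged, minimal sketch is below; the leaning labels, story fields and section names are hypothetical, not anything Facebook has announced. The same leaning data that powers the reinforcing feed is reused to assemble a second section drawn from across the aisle.

```python
# Hypothetical sketch: split curated stories into a reinforcing section and a
# counterpoint section based on a user's (inferred or self-declared) leaning.
from typing import Dict, List

OPPOSING = {"left": "right", "right": "left"}


def build_feed(user_leaning: str, stories: List[Dict]) -> Dict[str, List[Dict]]:
    """Return a main (reinforcing) section and an opposing-views section."""
    main = [s for s in stories if s["leaning"] == user_leaning]
    counterpoint = [s for s in stories if s["leaning"] == OPPOSING.get(user_leaning)]
    return {"for_you": main, "the_other_side": counterpoint}


stories = [
    {"title": "Medicare for All, explained", "leaning": "left"},
    {"title": "The case for tax cuts", "leaning": "right"},
]
print(build_feed("left", stories)["the_other_side"])  # surfaces the tax-cuts story
```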

The value in doing so is that we get more exposure to opinions that differ from our own, which in turn helps us accept that there may be legitimate arguments on both sides of the aisle. Flipboard, as a news source, already allows us to do that by letting us easily toggle between a liberal view and a conservative view. That gives people a chance to be more open to debate and to build a fruitful political discourse, instead of talking past one another amid rising frustration.

Flipboard’s algorithm, however, is optimized only to differentiate between political views in the U.S. Facebook’s scale, data and resources would enable it to differentiate between content globally (and in far more languages) much more effectively than it does today.

As a global community, we should have a more visible and informed choice in what content we want to consume. Up until now, that choice has been either taken away or limited. By collecting more data on us, and using it for good, Facebook has an opportunity to help us take back control from its algorithms that reinforce and shape our views for us.
