Does Facebook cause Polarization?
A flurry of recent articles assert that Facebook increases polarization and perhaps radicalization among its users. Facebook is circling the wagons to counter this narrative, even asking its employees to help push back. After all, if it claims to be the company that brings people together, it would be unfortunate to be widely seen as having the opposite effect.
The first step to understanding how Facebook may or may not polarize us is to look at how recommendation engines work. The following is more or less true of Twitter, YouTube (Google), and basically any ad-driven network that makes more money by keeping us actively engaged.
If a site wants to better monopolize our time and attention, it will measure what we click and how well that content engages us. Algorithms then try to guess what we might want next based on our recorded patterns, which form a map of our likes and dislikes. When other people, even total strangers, match our patterns and they like something new to us, that item is more likely to be recommended to us, on the chance we'll like it too. The better the algorithm performs, the more heavily the site relies on it, optimizing for one key quantity: our attention.
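The pattern-matching step described above is essentially collaborative filtering. Here is a minimal, illustrative sketch of that idea in Python; the data, names, and scoring rule are all hypothetical, and real systems are vastly more sophisticated.

```python
# A toy version of "people who match your patterns liked this":
# user-based collaborative filtering over sets of liked items.
# All users and interests here are invented for illustration.

def similarity(a, b):
    """How many liked items two users share."""
    return len(a & b)

def recommend(target, all_users, top_k=3):
    """Suggest items liked by users whose tastes overlap with ours."""
    scores = {}
    for user, likes in all_users.items():
        overlap = similarity(target, likes)
        if overlap == 0:
            continue  # total strangers with no shared taste are ignored
        # Items this similar user liked that are new to us
        for item in likes - target:
            scores[item] = scores.get(item, 0) + overlap
    # Rank candidates by how strongly matching users endorse them
    return sorted(scores, key=scores.get, reverse=True)[:top_k]

users = {
    "alice": {"cats", "hiking", "jazz"},
    "bob":   {"cats", "hiking", "chess"},
    "carol": {"politics", "knitting"},
}

me = {"cats", "hiking"}
print(recommend(me, users))  # items liked by users most similar to us
```

Note what this toy model optimizes for: nothing about the *quality* of a recommendation, only its predicted appeal, which is exactly the dynamic the article goes on to examine.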
This is not the only way to build a recommendation engine, but it’s typical. I hope it’s easy to see why it would, on the whole, be effective.
Then there’s the emotional component to consider. Content that pushes our emotional buttons is much more likely to motivate us to explicitly recommend (e.g., forward…