From Eli Pariser’s op-ed in the New York Times, “When the Internet Thinks It Knows You”:
Democracy depends on the citizen’s ability to engage with multiple viewpoints; the Internet limits such engagement when it offers up only information that reflects your already established point of view. While it’s sometimes convenient to see only what you want to see, it’s critical at other times that you see things that you don’t.
The Times had run an earlier story on Pariser’s book, The Filter Bubble: What the Internet Is Hiding From You. That story takes the easiest possible reading of his idea, applying it to media choices and political disagreement:
If you want to test your own views on personalization, you could try a party trick Mr. Pariser demonstrated earlier this year during a talk at the TED conference: ask some friends to simultaneously search Google for a controversial term like gun control or abortion. Then compare results…
With television, people can limit their exposure to dissenting opinions simply by flipping the channel, to, say, Fox from MSNBC. And, of course, viewers are aware they’re actively choosing shows. The concern with personalization algorithms is that many consumers don’t understand, or may not even be aware of, the filtering methodology.
Reading Pariser’s op-ed, though, I got the sense that he’s not nearly as concerned about narrowing of opinions on the web as he is about the narrowing of interests.
“[I]f algorithms are taking over the editing function and determining what we see,” he writes, “we need to make sure they weigh variables beyond a narrow ‘relevance.’ They need to show us Afghanistan and Libya as well as Apple and Kanye.”
If you spend much time on the Internet, you know there’s clearly no shortage of disagreement. But you most likely spend your time and energy disagreeing with people who care deeply about the same things you do.
You’ll argue about whether LeBron James or Derrick Rose should have won the MVP, whether or not Mitt Romney has a shot in the Iowa caucuses, or why Apple decided to pre-release information about the WWDC keynote.
We dive deeply into a range of pre-defined topics, tied to our professions, hobbies, needs, and histories, and sharpen our swords with opponents who do the same.
And on the margins, maybe that’s okay. Mass culture throws a whole lot of stuff at its audience that I, like you, have no intrinsic interest in. The time, energy, and cognitive surplus we once devoted to those things we used to consume only because “they were on” are all much better put to use tackling subjects we actually care about.
But it does mean that we’re often unaware of what’s happening in the next room, where there is frequently plenty of useful stuff that we could port into our own special areas of interest. We need to make sure we’re taking advantage of the web’s built-in ability to move laterally.
More to the point: those of us who produce and share content that other people read — and at this point, that’s almost all of us — need to trust that our readers are lateral movers too, and encourage them to do so.
I’m reminded of this blog post from last year, predicting the death of the niche blog and the rise of the lens blog. The lens blog can tackle any subject, but always through a particular set of enthusiasms or perspectives, finding clever ways to see the same in the different, and vice versa.
Hyper-specialization, like information overload, is an old, old problem. But exactly for that reason, we shouldn’t be surprised to see it pop up as a potential problem with our new tools and new media, too.
In short, if you’re really worried about search engines or social media overfiltering what you see, worry less about your reading being one-sided and more about it being one-dimensional.