It’s not the echo, it’s the chamber

Eli Pariser’s op-ed in the New York Times, When the Internet Thinks It Knows You:

Democracy depends on the citizen’s ability to engage with multiple viewpoints; the Internet limits such engagement when it offers up only information that reflects your already established point of view. While it’s sometimes convenient to see only what you want to see, it’s critical at other times that you see things that you don’t.

The Times had run an earlier story on Pariser’s The Filter Bubble: What the Internet Is Hiding From You. It takes the easiest possible reading of this idea, applying it to media choices and political disagreement:

If you want to test your own views on personalization, you could try a party trick Mr. Pariser demonstrated earlier this year during a talk at the TED conference: ask some friends to simultaneously search Google for a controversial term like gun control or abortion. Then compare results…

With television, people can limit their exposure to dissenting opinions simply by flipping the channel, to, say, Fox from MSNBC. And, of course, viewers are aware they’re actively choosing shows. The concern with personalization algorithms is that many consumers don’t understand, or may not even be aware of, the filtering methodology.
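To make "compare results" concrete, here's a minimal sketch of one way to quantify how much two friends' results for the same query actually overlap. Everything here is hypothetical: the URLs are invented, no real search API is queried, and Jaccard overlap is just one convenient measure, not anything Pariser prescribes.

```python
def jaccard_overlap(results_a, results_b):
    """Share of results the two lists have in common (1.0 = identical, 0.0 = disjoint)."""
    a, b = set(results_a), set(results_b)
    return len(a & b) / len(a | b)

# Hypothetical top results two friends might see for the same controversial query.
friend_1 = ["nytimes.com/gun-control", "wikipedia.org/Gun_control", "nra.org"]
friend_2 = ["wikipedia.org/Gun_control", "bradycampaign.org", "slate.com/gun-control"]

print(jaccard_overlap(friend_1, friend_2))  # 0.2 -- heavily personalized results overlap very little
```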

Reading Pariser’s op-ed, though, I got the sense that he’s not nearly as concerned about narrowing of opinions on the web as he is about the narrowing of interests.

“[I]f algorithms are taking over the editing function and determining what we see,” he writes, “we need to make sure they weigh variables beyond a narrow ‘relevance.’ They need to show us Afghanistan and Libya as well as Apple and Kanye.”
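Pariser's suggestion that ranking algorithms should weigh variables beyond a narrow "relevance" can be made concrete with a toy sketch. The re-ranker below is purely illustrative, not how Google or Facebook actually rank anything; the item data, profile, and weights are all invented. It blends a relevance score against the reader's existing interests with a small bonus for topics the feed hasn't surfaced yet.

```python
from collections import Counter

def rerank(items, interest_profile, diversity_weight=0.3):
    """Toy re-ranker: each pick blends relevance to the reader's existing
    interests with a bonus for topics the feed hasn't shown much yet.
    Illustrative only; real feed-ranking systems are far more involved."""
    ranked, topics_shown, remaining = [], Counter(), list(items)
    while remaining:
        def score(item):
            relevance = interest_profile.get(item["topic"], 0.0)
            novelty = 1.0 / (1 + topics_shown[item["topic"]])  # decays as a topic repeats
            return (1 - diversity_weight) * relevance + diversity_weight * novelty
        best = max(remaining, key=score)
        ranked.append(best)
        topics_shown[best["topic"]] += 1
        remaining.remove(best)
    return ranked

# Hypothetical feed items and reader profile.
items = [
    {"title": "New iPhone rumor", "topic": "apple"},
    {"title": "Kanye track leaks", "topic": "kanye"},
    {"title": "Libya update", "topic": "libya"},
    {"title": "Afghanistan report", "topic": "afghanistan"},
    {"title": "Another Apple rumor", "topic": "apple"},
]
profile = {"apple": 0.9, "kanye": 0.8, "libya": 0.1, "afghanistan": 0.1}

# With diversity_weight=0 the feed orders purely by existing interests
# (Apple and Kanye on top); raising it pulls Libya and Afghanistan up the list.
for item in rerank(items, profile, diversity_weight=0.7):
    print(item["title"])
```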

If you spend much time on the Internet, you know that there’s clearly no shortage of disagreement. But it’s more likely that you spend most of your time and energy disagreeing with people who care deeply about the same things about which you already care deeply.

You’ll argue about whether LeBron James or Derrick Rose should have won the MVP, whether or not Mitt Romney has a shot in the Iowa caucuses, or why Apple decided to pre-release information about the WWDC keynote.

We dive deeply into a range of pre-defined topics, tied to our professions, hobbies, needs, and histories, and sharpen our swords with opponents who do the same.

And on the margins, maybe that’s okay. Mass culture throws a whole lot of stuff at its audience that I, like you, have no intrinsic interest in. The time, energy, and cognitive surplus we once devoted to those things we used to consume only because “they were on” are all much better put to use tackling subjects we actually care about.

But it does mean that we’re often unaware of what’s happening in the next room, where there is frequently plenty of useful stuff that we could port into our own special areas of interest. We need to make sure we’re taking advantage of the web’s built-in ability to move laterally.

More to the point: those of us who produce and share content that other people read — and at this point, that’s almost all of us — need to trust that our readers are lateral movers too, and encourage them to do so.

I’m reminded of this blog post from last year, predicting the death of the niche blog and the rise of the lens blog. The lens blog can tackle any subject, but always through a particular subset of enthusiasms or perspectives, finding clever ways to see the same in the different, and vice versa.

Hyper-specialization, like information overload, is an old, old problem. But exactly for that reason, we shouldn’t be surprised to see it pop up as a potential problem with our new tools and new media, too.

In short, if you’re really worried about search engines or social media overfiltering what you see, worry less about your reading being one-sided and more about it being one-dimensional.

(For more smart takes on Pariser’s argument, see also Mathew Ingram at GigaOm and Cory Doctorow at Boing Boing.)

4 comments

Two things related to small parts of this come to mind:

One, being social media friends with coworkers, family, and friends brings me into constant contact with opinions (stated or linked) that are contrary to mine and with new things I did not already know or know about. I keep some of them on my friend rolls just for that purpose.

Two, those very people represent a community.

I would say that the necessary friction is between, on one side, filtering enough so that the information flow is manageable, appropriate, and has a feeling of community, and, on the other, leaving the filters loose enough so that the information flow is surprising, accretive (as opposed to duplicative), and provocative.


What’s that phrase? Something like, “two anecdotes don’t make a fact”? I think it needs to be updated to, “telling a story at TED doesn’t make it science.”


Love this discussion, but whenever anyone lives too much in one camp or the other, I always find myself asking “Who are we talking about?” and “What are we talking about?” All of the talk of filter failure and filter bubbles, good or bad, is without context.

As a former news reporter, I was a knowledge glutton who pigged out on everything I could find on a given topic, from a variety of perspectives, in an effort to draw a more holistic picture (ignore my own personal filters or biases for a minute). In that sense, maybe filters are bad. As they would be for any wayward voter who drinks only from the Fox News firehose in an effort to get a greater understanding of a political issue.

But in a professional environment, I want my lawyer, my doctor, my pharmacist, even my favorite company that builds the things I absolutely must buy, to be laser focused on the knowledge streams, aka filters, that can keep me out of jail, heal me when I’m sick, medicate me when I’m not, and/or make me something that makes my life easier to manage. I want the people I depend on in this way to live in these “bubbles” so they don’t get distracted by fads or false hopes.

Filters for businesses are vital for staying on top of your areas of expertise, your competitors, and the needs of your customers and your prospects. Without them, we’re digital cavemen always in hunter/gatherer mode. Check out today’s post “The current Content Filter Debate Lacks Context” by @charliedavidson here: http://bit.ly/kddDWx.
Nice work… keep the snark snarky.
Mark


It took a few days for this to occur to me, but it seems notably obvious and yet importantly undersaid.

“…ask some friends to simultaneously search WIKIPEDIA for a controversial term like gun control or abortion…”

There. That’s the non-biased resource he meant to reference. As they say on the Internet, ‘fixed that for him.’
