Snarkmarket commenter-in-chief since 2003, editor since 2008. Technology journalist and media theorist; reporter, writer, and recovering academic. Born in Detroit, living in Brooklyn, Tim loves hip-hop and poetry, and books have been and remain his drug of choice. Everything changes; don't be afraid.

How a sale is made

It’s always nice when three blogs in your “must read” folder happily converge. First, Jason Kottke pulls a couple of super-tight paragraphs from a Chronicle of Higher Ed article by Clancy Martin, philosophy professor and onetime salesman of luxury jewelry, about how he plied his former trade:

The jewelry business — like many other businesses, especially those that depend on selling — lends itself to lies. It’s hard to make money selling used Rolexes as what they are, but if you clean one up and make it look new, suddenly there’s a little profit in the deal. Grading diamonds is a subjective business, and the better a diamond looks to you when you’re grading it, the more money it’s worth — as long as you can convince your customer that it’s the grade you’re selling it as. Here’s an easy, effective way to do that: First lie to yourself about what grade the diamond is; then you can sincerely tell your customer “the truth” about what it’s worth.

As I would tell my salespeople: If you want to be an expert deceiver, master the art of self-deception. People will believe you when they see that you yourself are deeply convinced. It sounds difficult to do, but in fact it’s easy — we are already experts at lying to ourselves. We believe just what we want to believe. And the customer will help in this process, because she or he wants the diamond — where else can I get such a good deal on such a high-quality stone? — to be of a certain size and quality. At the same time, he or she does not want to pay the price that the actual diamond, were it what you claimed it to be, would cost. The transaction is a collaboration of lies and self-deceptions.

This structure is so neat that it has to be generalizable, right? Look no further than politics, says Jamelle Bouie (filling in for Ta-Nehisi Coates). In “Why Is Stanley Kurtz Calling Obama a Socialist?”, he writes that whether or not calling Obama a socialist started out as a scare tactic, conservative commentators like Kurtz actually believe it now. He pulls a quote from Slacktivist’s Fred Clark on the problem of bearing false witness:

What may start out as a well-intentioned choice to “fight dirty” for a righteous cause gradually forces the bearers of false witness to behave as though their false testimony were true. This is treacherous — behaving in accord with unreality is never effective, wise or safe. Ultimately, the bearers of false witness come to believe their own lies. They come to be trapped in their own fantasy world, no longer willing or able to separate reality from unreality. Once the bearers of false witness are that far gone it may be too late to set them free from their self-constructed prisons.

What’s nice about pairing these two observations is that Martin’s take on self-deception in selling jewelry is binary, a pas de deux with two agents, both deceiving themselves and letting themselves be deceived. Bouie and Clark don’t really go there, but the implication is clear: in politics, the audience is ready to be convinced/deceived because it is already convincing/deceiving itself.

There’s no more dangerous position to be in, truth-wise, than to think you’re getting it figured out, that you see things other people don’t, that you’re getting over on someone. That’s how confidence games work, because that’s how confidence works. And almost nobody’s immune, as Jonah Lehrer points out, quoting Richard Feynman on selective reporting in science. He refers to a famous 1909 experiment which sought to measure the charge of the electron:

Millikan measured the charge on an electron by an experiment with falling oil drops, and got an answer which we now know not to be quite right. It’s a little bit off, because he had the incorrect value for the viscosity of air. It’s interesting to look at the history of measurements of the charge of the electron, after Millikan. If you plot them as a function of time, you find that one is a little bigger than Millikan’s, and the next one’s a little bit bigger than that, and the next one’s a little bit bigger than that, until finally they settle down to a number which is higher.

Why didn’t they discover that the new number was higher right away? It’s a thing that scientists are ashamed of—this history—because it’s apparent that people did things like this: When they got a number that was too high above Millikan’s, they thought something must be wrong—and they would look for and find a reason why something might be wrong. When they got a number closer to Millikan’s value they didn’t look so hard. And so they eliminated the numbers that were too far off, and did other things like that.

It’s all little lies and adjustments, all the way down. Where else can I get such a good deal on such a high-quality stone?


The Western 101, via Netflix Watch Instantly

I love Westerns. My allegiance to the genre has long been known on the Snarkmatrix. (I refer you to the comment threads on Exhibit A or Exhibit B.) So I am excited that people are excited by Joel and Ethan Coen’s new Western, True Grit.

And jeez, I hope I get a few hours by myself in the next week or so to see this movie. Parenting is a serious drag on your ability to partake of the cinema, which is one reason I’ve become such a devotee of Netflix Watch Instantly. I didn’t even get to catch the restored Metropolis when it came to town, and I had only A) waited months for it and B) written a chapter of my dissertation about its director. So I don’t know if True Grit is as good as everyone says it is. What I do know, what I know the hell out of, are Westerns, and Netflix. If you don’t know Westerns, that’s fine. So long as you’ve got a Netflix subscription and streaming internet, I’ve got your back.

You probably know that True Grit (2010) is an adaptation of the same Charles Portis novel (True Grit) that was previously adapted into a movie [True Grit (1969)] that won John Wayne a Best Actor Oscar for his portrayal of the eyepatched marshal Rooster Cogburn. It’s not a remake, you’ve heard intoned, it’s a more-faithful adaptation of the novel.

Fine. Who cares? At a certain point, remakes and adaptations stop being remakes and adaptations. Does anyone care that His Girl Friday was a gender-swapping adaptation of The Front Page, a terrific Ben Hecht and Charles MacArthur play which had already been made into a movie in 1931, and which was made into a movie again in 1974 with Billy Wilder directing and Walter Matthau and Jack Lemmon playing the Cary Grant and Rosalind Russell roles?

Okay, I do. But besides me, not really. Because His Girl Friday obliterated The Front Page in our movie-watching consciousness, even though the latter is the prototype of every fast-talking newspaper comedy from, shit, His Girl Friday to the Coen Brothers’ The Hudsucker Proxy. It’s been over forty years since True Grit (1969). It’s a good movie, but if you haven’t seen it, don’t sweat it too much.

You should, however, be sweating the Western. Not least among Joel and Ethan Coen’s virtues is that they care, and care deeply, about genre. Virtually all of their movies are a loving pastiche of one genre form or another, whether playful (like Hudsucker’s newspaper comedy or The Big Lebowski’s skewed take on the hardboiled detective), not so playful (No Country For Old Men), or both somehow at once (Miller’s Crossing, Fargo). And the Western is a fickle genre. You’ve got to contend with books, movies, radio, and TV, all with their own assumptions, all alternating between giddy hats-and-badges-and-guns-and-horses entertainment and stone-serious edge-of-civilization Greek-tragedy-meets-American-origin-stories primal rites.

I’ll save you some time, though, by giving you just twelve links, briefly annotated.
Read more…


Rooting for the home team

It’s a classic paradox of American democracy: citizens love America, hate Congress, but generally like their own district’s Congressman. (Until they don’t, and then they vote for someone else, whom they usually like.)

Josh Huder (via Ezra Klein) takes on the apparent paradox, armed with some good data and historical analysis.

Huder points out something even more paradoxical: Congressional approval takes a hit not just when there’s a scandal, or when there’s partisan gridlock in the face of a crisis, but even when Congress works together to pass major legislation:

By simply doing its job Congress can alienate large parts of its constituency. So while people like their legislators, they dislike when they get together with fellow members and legislate.

From this, Huder concludes that “disapproval is built into the institution’s DNA.” But let me come at this from a different angle: professional and/or college sports.

There’s almost an exact isomorphism here. Fans/constituents like/love their home teams (unless their performance suffers for an extended period of time, when they switch to “throw the bums out” mode), and LOVE the game itself. But nobody really likes the league. Who would say, “I love the MLB” or “I love the NCAA” — meaning the actual organizations themselves?

Never! The decisions of the league are always suspect. They’re aggregate, bureaucratic, necessary, and not the least bit fun. Even when leagues make the right decision, we discount it; they’re just “doing their job.” The only time they can really capture our attention is when they do something awful. And most of the time, they’re just tedious drags on our attention, easily taken for granted.

If it’s a structure, it doesn’t seem to be limited to politics. It’s a weird blend of local/pastime attachment, combined with contempt/misunderstanding for the actual structures that make it all work. Because we don’t *want* to notice them at work at all, really.


Two observations on Lanier on Wikileaks

Robin set the table up (and h/t to Alexis for getting Lanier’s essay in circulation).

Here are two disjoint thoughts, slightly too long for tweets/comments:

  1. Part of Lanier’s critique of Wikileaks works astonishingly well as a critique of Google’s Ngrams, too. (I’m working up a longer post on this.) In particular, I’m thinking of this observation:

    A sufficiently copious flood of data creates an illusion of omniscience, and that illusion can make you stupid. Another way to put this is that a lot of information made available over the internet encourages players to think as if they had a God’s eye view, looking down on the whole system.

  2. I feel like we need a corollary to the Ad Hitlerum/Godwin’s Law fallacy. I’m going to call it “the Gandhi principle.” Just as trotting out the Hitler analogy for everything you disagree with shuts down a conversation by overkill, so do comparisons with Mahatma Gandhi, Martin Luther King, Nelson Mandela, Jesus, and other secular and not-so-secular activist saints.

We’ve canonized these guys, to the point where 1) we think they did everything themselves, 2) they never used different strategies, 3) they never made mistakes, and 4) disagreeing with them then or now violates a deep moral law.

More importantly, in comparison, every other kind of activism is destined to fall short. Lanier’s essay, like Malcolm Gladwell’s earlier essay on digital activism, violates the Gandhi principle. (Hmm, maybe this should be the No-Gandhi Principle. Or it doesn’t violate the Gandhi Principle, but invokes it. Which is usually a bad thing. Still sorting this part out.) The point is, both Ad Hitlerum and the Gandhi Principle opt for terminal purity over differential diagnosis. If you’re not bringing it MLK-style, you’re not really doing anything.

The irony is, Lanier’s essay is actually pretty strong at avoiding the terminal purity problem in other places — i.e., if you agree with someone’s politics, you should agree with (or ignore) their tactics, or vice versa. At its best, it brings the nuance, rather than washing it out.

Google’s Ngrams is also subject to terminal purity arguments — either it’s exposing our fundamental cultural DNA, or it’s dicking around with badly-OCRed data, and it couldn’t possibly be anything in between. To which I say — oy.


Film History 101 (via Netflix Watch Instantly)

Robin is absolutely right: I like lists, I remember everything I’ve ever seen or read, and I’ve been making course syllabi for over a decade, so I’m often finding myself saying “If you really want to understand [topic], these are the [number of objects] you need to check out.” Half the fun is the constraint of it, especially since we all now know (or should know) that constraints = creativity.

So when Frank Chimero asked:

Looking to do some sort of survey on film history. Any sort of open curriculum out there like this that runs in tandem with Netflix Instant?

I quickly said, “I got this,” and got to work.

See, trying to choose over the set of every film ever made is ridiculously hard. Choosing over a well-defined subset is both easier and more useful.

Also, I knew I didn’t want to pick the best movies ever made, or my favorites, or even the most important. Again, that pressure, it’ll cripple you. I wanted to pick a smattering of films that if you watched any given, sufficiently large subset of them, you’d know a lot more about movies than when you started.

This is actually a lot like trying to design a good class. You’re not always picking the very best examples of whatever it is you’re talking about, or even the things that you most want your students to know, although obviously both of those factor into it. It’s much more pragmatic. You’re trying to pick the elements that the class is most likely to learn something from, that will catalyze the most chemistry. It’s a difficult thing to sort, but after you’ve done it for a while, it’s like driving a car, playing a video game, or playing a sport — you just start to see the possibilities opening up.

Then I decided to add my own constraints. First, I decided that I wasn’t going to include any movies after the early 1970s. You can quibble about the dates, but basically, once you get to the Spielberg-Scorsese-Coppola-Woody Allen generation of filmmakers — guys who are older but still active and supremely influential today — movies are basically recognizable to us. Jaws or Goodfellas or Paris, Texas are fantastic, classic, crucial movies, but you don’t really have to put on your historical glasses to figure them out and enjoy them, even if they came out before you were of movie-going age. The special effects are crummier, but really, movie-making just hasn’t changed that much.

Also, I wasn’t going to spend more than a half-hour putting it together. I knew film history and Netflix’s catalog well enough to do it fast, fast, fast.

And so, this was the list I came up with. As it happened, it came to a nice round 33.

I made exactly one change between making up the list and posting it here, swapping out David Lynch’s Eraserhead for Jean-Luc Godard’s Breathless. I cheated a little with Eraserhead — it’s a late movie that was shot over a really, really long period of time in the 70s and came out towards the end of that decade. And Breathless isn’t Godard’s best movie, but it’s probably the most iconic, so it was an easy choice.

There are huge limitations to this list, mostly driven by the limitations of the catalog. Netflix’s selection of Asian and African movies, beyond a handful of auteurs like Akira Kurosawa, isn’t very good. There’s no classic-period Hitchcock. There’s no Citizen Kane. There aren’t any documentaries or animated films. And you could argue until you’re blue in the face about picking film X over film Y with certain directors or movements or national cinemas.

But you know what? You wouldn’t just learn something from watching these movies, or just picking five you haven’t seen before — you would actually have fun. Except maybe Birth of a Nation. Besides its famous pro-Ku Klux Klan POV, that sucker is a haul. Happy watching.


Now that's what I call local

Sorry; this snippet from Matt’s second-day liveblog/Twitter curation of the conversation at PubCamp blew my mind a little bit:

Matt Thompson: One of the most frequent issues NPR.org users have is not being able to find something on our website. The vast majority of the time, that’s because they heard something on their local programming and are searching for it in the national site. If we had shared authentication across the system, we would be able to recognize which other stations users authenticate with and show them local content.

So simple, but so powerful.

You’ve got to fine-tune just how local you get to match user expectations, though:

Matt Thompson: Discussion turns to users’ qualms over things like the Open Graph, turning on WaPo.com, for example, and suddenly seeing your friends’ names all over the page. How does the Washington Post know who my friends are?

But we quickly come back to the simple-but-powerful stuff again:

Matt Thompson: I asked for my pony: a registration system that would just keep track of what I’d read on the site, then let me know when those stories were updated/corrected.

I think we almost need to bring it back to the user end and offer something like a hybrid between the “Private Browsing/Incognito” mode that’s started to get incorporated into web browsers and the browser extension FlashBlock, which disables Flash ads and videos except when you whitelist them.

Call it “SocialBlock” (which sounds way more fun than it actually is). I browse with my identity intact, carrying it with me, but can select which sites/services I offer it to. And it’s just a quick click to turn it on or off.
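The core of the idea is just a master switch plus a per-site whitelist. Here’s a minimal sketch of that logic in plain JavaScript — all names here are hypothetical for illustration, not a real extension API:

```javascript
// "SocialBlock" sketch: your identity travels with the browser,
// but is only released to sites you've explicitly whitelisted.
class SocialBlock {
  constructor() {
    this.enabled = true;         // the "quick click" master switch
    this.whitelist = new Set();  // sites allowed to see your identity
  }

  allow(site) { this.whitelist.add(site); }
  revoke(site) { this.whitelist.delete(site); }
  toggle() { this.enabled = !this.enabled; }

  // Should this site receive identity data (social graph, login, etc.)?
  shouldShareIdentity(site) {
    return this.enabled && this.whitelist.has(site);
  }
}

// Usage: browse with identity intact, but opt in site by site.
const sb = new SocialBlock();
sb.allow("npr.org");
console.log(sb.shouldShareIdentity("npr.org"));            // true
console.log(sb.shouldShareIdentity("washingtonpost.com")); // false
sb.toggle();                                               // one click: all off
console.log(sb.shouldShareIdentity("npr.org"));            // false
```

A real extension would hook this check into requests for social widgets and authentication cookies; the point of the sketch is just that the default is “closed,” and sharing is an explicit, reversible choice.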


Blogger, Reporter, Author

I want to distinguish blogging from reporting, and bloggers from reporters. But more than that, I want to distinguish the first question from the second.

Blogging is pretty easy to define as an activity. It’s writing online in a serial form, collected together in a single database. It doesn’t matter whether you’re doing it as an amateur or professional, as an individual or in a group, under your own byline or a pseudonym, long-form or on Twitter.

Reporting is a little trickier, but it’s not too tough. You search for information, whether from people or records or other reports, you try to figure out what’s true, and you relay it to somebody else. Anyone can report. They assign reports to elementary school students. Or you can be Woodward-and-Bernsteining it up, using every trick you can think of to track down data from as many sources as possible.

Now, both of these are different from what it means to be a blogger or a reporter. The latter are a matter of identity, not activity. I’ll offer an analogy. If someone says, “I’m a writer,” we don’t assume that they mean that they’re literate and capable of writing letters, words, or sentences. We might not assume that they’re a professional writer, but we do assume that they identify with the act of writing as either a profession, vocation, or distinguished skill. They own their action; it helps define who they are.

Likewise, if someone calls themselves (or if someone else calls them) a reporter or blogger, they might be referring to their job or career, but they’re definitely referring to at least a partial aspect of their identity. And just like we have preconceptions about what it means to be a “writer” — a kind of Romantic origin myth, of genius and personality expressed through language — we have preconceptions about what it means to be a blogger or a reporter.

They’re not just preconceptions, though, but practices codified in institutions, ranging from the law to business and labor practices to the collective assumptions and mores of a group.

There are lots of ways you could trace and track this, but let me follow one thread that I think is particularly important: the idea of the author-function.

Traditionally (by which I mean according to the vagaries of recent collective memory), reporters who are not columnists have bylines, but are not seen as authors. Their authority instead accrues to their institution.

If we read a story written by a typical reporter, we might say “did you see ____ in the New York Times?” If other newspapers or journalistic outlets pick up the story, if they attribute it at all, they’ll say, “According to a report in the New York Times…” This is similar to medical and scientific research, where journalists will usually say, “scientists at MIT have discovered…”

Some people within this field are different. If Paul Krugman writes something interesting, I probably won’t say “the New York Times”; I’ll say “Paul Krugman.”

In fact, there’s a whole apparatus newspapers use in order to distinguish writers I’m supposed to care about and writers I’m not. A columnist’s byline will be bigger. Their picture might appear next to their column.* They might write at regular intervals and appear in special sections of the paper. This is true in print or online.

(*This was actually one of the principal ways authorship was established in the early modern period: including an illustration of the author. Think about the famous portraits of Shakespeare. Sometimes to be thrifty, printers would reuse and relabel woodcuts: engravings of René Descartes were particularly popular, so a lot of 17th-century authors’ pictures are actually Descartes.)

Blogs do basically the same thing. Quick: name me three bloggers besides Josh Marshall who write for Talking Points Memo. If you could do it, 1) you’re good, and 2) you probably know these people personally, or at least through the internet.

These guys and girls are bloggers, they’re reporters, they’re opinionated, they have strong voices, and some of them are better than others. But I don’t know what they look like; if they followed me on Twitter tomorrow, I probably wouldn’t recognize their names. Josh Marshall, the impresario, is an author of the blog in a way that his charges are not. Or to take another example, Jason Kottke — whose writing is nearly as ego-less as it can probably get in terms of style, but who still is the absolute author of his blog.

The Atlantic, for better or worse (I think better), took an approach to blogging that foregrounded authorship: names, photos, and columns. There are “channels” through which lots of different people write, and sometimes you pick their names and voices out of the stream, but they’re not Andrew Sullivan, Ta-Nehisi Coates, James Fallows, Megan McArdle, Jeffrey Goldberg, Alexis Madrigal, et al., or Ross Douthat, Matt Yglesias and co. before them.

Now all of these writers tackle different topics and work in different styles, but they’re all authors. Their blogs are written and held together through the force of their names and personalities. Sullivan has a team of researchers/assistants, Coates has a giant community of commenters, Alexis has a crew of rotating contributors. It doesn’t matter; it’s always their blog.

The one person who never quite fit into this scheme was Marc Ambinder. Early on, when the first group of bloggers came in, it made more sense. For one thing, almost all of them wrote about politics and culture. They each had a slightly different angle — different ages, different political positions, different training. Ambinder’s schtick was that he was a reporter. It seemed to make as much sense as anything else.

As time went on, the blogs became less and less about politics in a recognizable sense. Ta-Nehisi Coates started writing about the NFL, Jim Fallows increasingly about China and flying planes. And then the Atlantic started putting author pictures up, by the posts and on the front page.

I remember sometime not long ago seeing Ambinder’s most recent photo on TheAtlantic.com and saying to myself, “I know what Marc Ambinder looks like, and that’s not Marc Ambinder.” He wasn’t wearing his glasses. He’d lost a ton of weight — later I’d find out he’d had bariatric surgery. He found himself embroiled in long online arguments where he was called out by name about his politics, his sexuality, his relationships.

Here’s somebody who by dint of professional training and personal preference simply did not want to be on stage. He didn’t want people looking at him. He didn’t want to talk about himself. He couldn’t be a personality like Andrew Sullivan or Ta-Nehisi Coates, or even a classically-handsome TV anchor talking head WITH personality like Brian Williams or Anderson Cooper. He wanted to do his job, represent his profession and institution, and go home.

I’m sympathetic, because I find it just as hard to act the opposite way. By training and disposition, I’m a writer, not a reporter. I’ve had to learn repeatedly what it means to represent an institution rather than just my own ideas and sensibilities — that not every word that appears under my byline is going to be the word I chose. The vast majority of people I meet and interact with don’t care who I am or what I think, just the institution I write for.

That’s humbling, but it’s powerful, too. Sometimes, it’s appealing. One of the things I love about cities is the anonymity you can enjoy: I could be anybody and anybody could be me. If you identify with it and take it to its limit, adopting those values as yours, it’s almost impossible to turn around and do the other thing.

So far, we have lived in a world where most of the bloggers who have been successful have done so by being authors — by being taken seriously as distinct voices and personalities with particular obsessions and expertise about the world. And that colors — I won’t say distorts, but I almost mean that — our perception of what blogging is.

There are plenty of professional bloggers who don’t have that. (I read tech blogs every day, and couldn’t name you a single person who writes for Engadget right now.) They might conform to a different stereotype about bloggers. But that’s okay. I really did write snarky things about obscure gadgets in my basement while wearing pajama pants this morning. But I don’t act, write, think, or dress like that every day.


A family resemblance of obsessions

At HiLobrow, Matthew Battles interviews Tim Maly about his 50 Cyborgs project, for which Robin and I both wrote posts. Tim (Tim M, the other Tim) has a lot of nice things to say about Snarkmarket, and the whole interview is in part a response to Robin’s call for a postmortem on the project, but the interview’s mostly interesting for the smart things Tim says in response to Matt’s smart proddings.

***

A fair amount of the discussion circles around the nature of language. Here’s a representative chunk, where Matthew asks Tim about whether or not nonfiction criticism needs (or already has) a “fanfic impulse”:

I’m thinking about how Bruce Sterling in particular has identified or refined a series of concepts—spime, atemporality, favela chic, design fiction, to name a few — which people who aren’t students of his, but fans of his critique, sort of take up and extend. Maybe “hilobrow” has pretensions to this kind of conceptual life; “bookfuturism,” too, has fans, now, and a life of its own. Of course we’re always doing this sort of thing in public discourse; it’s just a notion I have now that “fandom” becomes another mode or style of relating, alongside classroom, chiefdoms/tribes, and mentorship, among other models. Call it “fancrit”? Or not…

Tim is game, and runs with the “fancrit” idea:

The interesting thing about this, I think, is that where fanfic is necessarily ghettoized (you are playing with someone else’s copyrighted characters and worlds) fancrit is fed by a long academic tradition of fighting for mindshare via vocabulary. Sterling coins spime and that’s a meaningful event only to the extent that he can lose control of it. He wins when people start using the word without bothering to attribute it to him. Clynes & Kline coin cyborg and they end up winning to the point where Clynes becomes irritated with the way the meaning shifts and is twisted.

If you don’t get that etymological/genealogical twisting of cyborg from Clynes and Kline’s original, limited meaning, you don’t get 50 posts about it; the term itself isn’t generative or potent enough to move beyond its first-generation instance. It’s a concept that can’t conceive, in the sexual/reproductive sense.

***

That’s the power of language, which can be a dangerous power — it’s always exceeding our ability to, Humpty-Dumpty-like, determine once and for all what words mean.

But it also means that words can be put into motion without permission, without determination — that they can circulate without anyone needing to hold them fast, or play Pope to decide what’s in and what’s out. They have a life of their own.

This is what I also like in Bruce Sterling’s comment on TM and MB’s conversation:

Some remarkable stuff in this discussion about positioning for niche intelligentsia eyeballs in the modern post-blogosphere. I think people used to call that activity “publishing,” but nowadays it’s a creolized effort badly in need of a neologism.

We don’t have a word for this! Let’s make one up! We have an old word, but it doesn’t work any more; it doesn’t mean what it should, or it means too much. Let’s let it go! Let it mean something else — and we can all talk about this in a different way.

***

I have something that I’m fond of saying, and it’s totally drawn from my training in philosophy: sometimes the most important thing you can do in an argument is to point out that we don’t have to talk about it the way we’ve always talked about it.

If you asked me to boil down the “real meaning” of the Bookfuturist manifesto I wrote, I’d say it’s that. We almost always talk about the relationship between culture and technology in very predictable ways that don’t solve problems. So let’s not talk about them that way anymore.

If you want a better example, look at this post on education, pointed to me by Rob Greco:

The “problems” we face with schools right now are less about the schools themselves and more about a lack of vision and a fear of change. Put simply, the age-grouped, subject-delineated, 8 am-2 pm, September-June, one-size-fits-all system that we have makes the process of education easy. The realities of personal, self-directed, real problem-solving learning in a connected world are anything but.

Still, the hardest reality right now is that there is no groundswell to do school differently, not just “better.” Seems it’s easy to see a path to “better.” “Different” is just too scary.

***

If you want to do philosophy, or to show someone what it means to do philosophy (even your grandma, or a seven-year-old), get a group of people into a room and ask them: “What is a sport?”

Quickly, you’ll get strong opinions. Some people don’t think golf is a sport; other people don’t think figure skating should be one. Is dodgeball a sport? What about “tag”? (Some people are really good at tag.) Table tennis? Video games? Cheerleading? If not, why not? Eventually, people will try to come up with definitions. The definitions will resolve some problems but inevitably, they’ll exclude something that everyone in the room agrees is at least a borderline case.

What’s great about it is that you’re not arguing about the fundamental nature of the universe, drawing on complex symbolic logic, or questioning people’s ethical or religious beliefs (you know, depending on how strongly they feel about baseball).

You haven’t assigned any reading. There’s no mathematical equation to be solved, reference work to consult, or tool to be used to solve the problem. But everyone agrees that you’re talking about a real thing, something that actually exists and is relatively important, and at least for most of us, worth having an opinion about.

All you’re doing is asking everyone in the room to ask themselves: when I use such-and-such a word, what do I mean? What am I assuming? What am I committing myself to? If there’s a dispute between two people about how to use a word or what it means, how do we resolve it? How do we decide with language how we use language? And how do we do this, for the most part, completely organically and without great complication?

It’s a wonder. And it deserves to be wondered at.

***

Tim Maly has a great phrase for the group he gathered to work on 50 Cyborgs:

I’m lucky to have this great community (clique?) that’s emerged around a bunch of people whose work I love who have a family resemblance of obsessions.

“Family-resemblance,” if you don’t know, is an important phrase in philosophy. It’s the phrase Ludwig Wittgenstein uses to describe just the process I described above — how words like “game,” “sport,” “cyborg,” “community,” “book,” or “publishing” don’t have a single fixed meaning, a picture of a thing that you can match to each word, like God’s own dictionary.

Instead we’ve got this sloppy, fleshy language that generates and regenerates itself over time and across space and forms new clusters and meanings, and we can’t even collect the entire extension of the concept; all we can say is that this word is used in such-and-such a way, and, within the broad unspoken assumptions of the lifeworld of a particular community, we know what we mean and we know how to resolve misunderstandings.

Blogs — the best blogs — are public diaries of preoccupations. They’re preoccupations because language needs someone continually pushing on it in order to regenerate itself. They’re public so that those generations and regenerations and degenerations can find their kin, across space, across fame, across the likelihood of a connection, and even across time itself, to be rejoined and reclustered together.

Because that is how language and language-users are reborn; that is how the system, both artificial and natural, loops backward upon and maintains itself; because that is how a public and republic are made, how a man can be a media cyborg, and also become a city. That’s how this place where we gather becomes home.


All the pieces matter: Monopoly and The Wire


The Poke is a UK satirical site, a little bit like The Onion. On Thursday, they published a fake news article about a version of Monopoly — complete with a fully-imagined and -illustrated fake gameboard — themed around the beloved HBO series The Wire:

“The Wire is all about corners,” says Hasbro spokesperson Jane McDougall, “and the Monopoly board is all about corners. It was a natural fit.” Based around the journey a young gangster might take through the fictionalised Baltimore of the show, players move from corner to stoop, past institutions featured in successive series like the school system and the stevedores union, acquiring real estate, money and power before ending up at the waterfront developments and City Hall itself.

There’s a classic scene in the first season of The Wire where D’Angelo (nephew of the drug boss Avon Barksdale and one of the series’s many unlikely protagonists) tries to teach two young dealers who work for him (Bodie and Wallace) how to play chess. Chess quickly turns into an elaborate metaphor to describe the violent realities and unreal ideals of the drug world they all live in:

But of course, it turns out not just to describe the drug world, but any world seen through the lens of The Wire. The two sides of the chess board could be one drug gang warring against another — Avon vs. Marlo Stanfield. It could be the police detail trying to catch and trap the leader of one of the gangs. In the world of the police, too, pawns are expendable, and the people at the top fall under a completely different logic. (Every so often, a pawn will be transformed — like Prez, the hapless street cop who becomes first an invaluable decoder and data-miner and eventually, a middle school math teacher.)

But the single-plane, A vs. B world of chess is really only an adequate metaphor for the narrow world of The Wire’s first season, the immediate objectives that eventually get unravelled. As Stringer Bell tries to tell his partner Avon, “there are games beyond the game.”

That’s the world Stringer tries to navigate. You begin with drugs, fighting for corners. Then you step back, build institutions – other people work for you. Eventually, you transcend the street level and become a power broker, directing traffic but never touching the street. Then you take your ill-gotten capital — your Monopoly money — and turn it into real capital, by investing in (get this) real estate, political connections, legitimate businesses. Stringer Bell’s dream is Michael Corleone’s dream (which was Joe Kennedy’s dream). Power into wealth and back into power again. But it’s all just business.

That’s where Monopoly comes in. Like chess, Monopoly is about controlling territory. Unlike chess, it’s not neofeudal combat, with handed-down traditions and ideologies of strategy and honor — the illusion that everything is perfectly under the player’s control, that all the pieces in the game are visible.

Monopoly is transparently about money and greed. It lays bare the multiple, adjacent worlds and the interlocking systems that tie them together. (In The Wire, the worlds adjacent to drugs and cops include the ports, politics, the schools, and the media.) You gain territory and choose how you build on it, but you also roll dice and overturn hidden cards that can send you in a completely different direction. It’s actually absurdly easy for players to cheat — especially if you let them control the bank. And every time you pass Go, the game — at least in part — starts over again.

The Wire is about a lot of things — the decline of the American city, the futility of the war on drugs, the corruption of our institutions. It’s also about the gap between our ideologies of how things ought to be as opposed to the way they actually are. “You want it to be one way,” drug kingpin Marlo tells a worn-out security guard who tries to stop him from shoplifting. “But it’s the other way.”

Overwhelmingly, that gap plays out in the field of work. The second season, about the blue-collar port workers, is transparently about work — but really, every season is about workers, bosses, money, promotions, recognition. The innovation of The Wire with respect to its representation of drug gangs and cops is to present them as the mundane, kind of screwed-up workplaces that they are.

And capitalism has always been screwed-up about work. On the one hand, we’ve got Weber: the Protestant idea that work has an ethical value, that everybody has a calling and that we prove ourselves through our success. On the other, we’ve got Marx: the only way the system works is by extracting value from its workers, and the more value it can extract for less investment, the better the people at the top make out. “Do more with less,” as the newspaper editor, mayor, and police bosses say over and over again.

I think this is how I finally came to terms with The Wire’s last season, which added journalism to the mix. It’s about that disillusionment — the idea that the work of journalism has an intrinsic value, and the corruption of that through cost-cutting and self-serving behavior. And maybe that disillusionment is extra bitter for Simon, who couldn’t stand what capitalism did to his newspaper, his city, its employers, its politics. The gall is too thick.

Simon’s collaborator Ed Burns had a more reconciled view of it; he’d worked as a cop, as a teacher, then a screenwriter/producer, and seemed to find satisfaction in different parts of each of them. It’s Burns’s wisdom we get when Lester Freamon tells Jimmy McNulty — who (like Simon) unleashes his anger on anyone who tries to get between him and his work — “the job will not save you.”

A Wire-themed Monopoly board might have begun as a joke, but let me tell you, Hasbro: you should definitely think about it. I posted the link on Twitter, and it was picked up by Kottke and then by Slate, both of whom credited me. You wouldn’t believe the reaction people had to this. Just like the series itself, it struck a chord. Also, just think of all the quotes from the series you could use to talk trash while you play.


The Comedy Closer

Bill Murray is 60 years old today, which is a little bit unbelievable. The Beatles, Dylan, and The Stones can be in their 60s, and Woody Allen sometimes seems like he was ALWAYS in his 70s, but Bill Murray? 60? My parents aren’t even 60 yet, but Bill Murray is?

Maybe between movies, he gets in a spaceship that approaches the speed of light, so 60 earth years have passed, but he’s still really (let’s say) 48. He understands aging, all too well, because he’s seen it happen to the people around him at lightning speed, but he himself is only slowly, gently moving through middle age.

HiLobrow has a short but very fine appreciation, which makes me miss their daily HiLo Heroes birthday posts all the more. The site now runs only occasional pieces, averaging about one per week; I’m guessing the editorial load was too large to bear.

I know the editors, but I haven’t actually asked them why; I know that from my own occasional entry-writer’s perspective, it seemed like way too much work. But golly-gosh, these are still some of my favorite things to read on the web.

Here are some of my favorite Bill Murray clips. Watching them, you see that Murray’s real genius may be in his ability to react to those around him with sanity AND lunacy; like Woody Allen at his peak, he’s George and Gracie rolled into one. He’s such a generous comedic actor that he makes even ciphers like Andie MacDowell in Groundhog Day or Scarlett Johansson in Lost In Translation look great. And because your attention’s still on him, you don’t even notice he’s doing it.

On Twitter, I compared him to baseball closer Mariano Rivera. Murray — maybe especially as he’s gotten older — is the relief pitcher who finishes every game/scene. He makes everybody look better; you’re always talking about him, but somebody else usually gets the win. The starters set the table, and he just kills you a half-dozen different ways. Fastball = punchline, change-up = muted expression, curveball = unexpected character transformation, and a devastating fluttery cut fastball that’s a mixture of all three.
