
De inventione punctus

All signs suggest punctuation is in flux. In particular, our signs that mark grammatical (and sometimes semantic) distinctions are waning, while those denoting tone and voice are waxing. Furthermore, signs with a slim graphical profile (the apostrophe and comma, especially) are having a rough go of it. Compared to the smiley face or even the question mark, they’re too visually quiet for most casual writers to notice or remember, even (or especially) on our high-def screens.

But we’re also working within the finite possibilities and inherited structures of our keyboards. It’s the age of secondary literacy: writing and reading transformed by electronic communication, from television to the telephone.

See 1. Jan Swafford’s unfortunately titled “Why e-books will never replace real books,” which takes seriously Marshall McLuhan’s argument that print (and computers, too) change the ways we think and see:

I’ve taught college writing classes for a long time, and after computers came in, I began to see peculiar stuff on papers that I hadn’t seen before: obvious missing commas and apostrophes, when I was sure most of those students knew better. It dawned on me that they were doing all their work on-screen, where it’s hard to see punctuation. I began to lecture them about proofing on paper, although, at first, I didn’t make much headway. They were unused to dealing with paper until the final draft, and they’d been taught never to make hand corrections on the printout. They edited on-screen and handed in the hard copy without a glance.

Handwriting is OK! I proclaimed. I love to see hand corrections! Then I noticed glitches in student writing that also resulted from editing on-screen: glaring word and phrase redundancies, forgetting to delete revised phrases, strangely awkward passages. I commenced an ongoing sermon: You see differently and in some ways better on paper than on computer. Your best editing is on paper. Try it and see if I’m right. You’ll get a better grade. The last got their attention. The students were puzzled and skeptical at first, but the ones who tried it often ended up agreeing with me.

And especially, see 2. Anne Trubek’s “The Very Long History of Emoticons”:

A punctuation purist would claim that emoticons are debased ways to signal tone and voice, something a good writer should be able to indicate with words. But the contrary is true: The history of punctuation is precisely the history of using symbols to denote tone and voice. Seen in this way, emoticons are simply the latest comma or quotation mark… The earliest marks indicated how a speaker’s voice should adjust to reflect the tone of the words. Punctus interrogativus is a precursor to today’s question mark, and it indicates that the reader should raise his voice to indicate inquisitiveness. Tone and voice were literal in those days: Punctuation told the speaker how to express the words he was reading out loud to his audience, or to himself. A question mark, a comma, a space between two words: These are symbols that denote written tone and voice for a primarily literate—as opposed to oral—culture. There is no significant difference between them and a modern emoticon.

I <3 @atrubek. And I’m feeling all zen about this observation of hers, too: “A space is a punctuation mark.” There’s a whole philosophy in that idea, I know it.

I’m also feeling all zen about this idea that computer screens (keyboards, too) are sites of multiple, overlapping, and conflicting cultures, and that it’s up to us (in part) to help decide what the assumptions of those cultures are.

Here, see 1. The Slow Media Manifesto, and 2. Nick Carr’s much-debated (whaaa? Nick Carr in a debate?) post about “delinkification”, which is actually a pretty solid meditation on the rhetoric of the hyperlink (you could say, the way we punctuate links). In short, if you think the superimposed montage of word and link in in-text hyperlinks poses some cognition/decision problems, and might not be appropriate to all kinds of reading, then it might make sense to try a different strategy (like footnoting) instead. And being the relatively sophisticated mammals we are, we can sort these strategies out in different contexts (even if we don’t fully understand that, or how, we’re doing it).

11 comments

Champion of the Shallows

This link is being generated by a Snarkmarket bot programmed to identify and automatically post any content that contains “Nieman Lab” and “Matthew Battles” and “Gutenberg” and “alternate universe” and—okay, you get the point. This basically just short-circuited the Snarkmatrix.

6 comments

Western threads


I saw the new video game Red Dead Redemption for the first time this weekend, courtesy of my pal Wilson, who described it (and I paraphrase) as “every awesome Western ever, combined.”

It is indeed totally stunning, and it’s got me thinking about Westerns. Among other things:

What clicks in your mind when you think about Westerns? Any recent movies I ought to see? Any other fun stuff out there?

Update: Yes, this post was Tim-bait, and whoah yes, he delivers. I’m considering just pasting his comment into the body of the post and moving what I wrote to the comments…

16 comments

Only crash

Sometimes you run across an idea so counter-intuitive and brain-bending that you immediately want to splice it into every domain you can think of. Sort of like trying a novel chemical compound against a bunch of cancers: does it work here? How about here? Or here?

That’s how I feel about crash-only software (link goes to a PDF in Google’s viewer). Don’t pay too much attention to the technical details; just check out the high-level description:

Crash-only programs crash safely and recover quickly. There is only one way to stop such software—by crashing it—and only one way to bring it up—by initiating recovery.

Wow. The only way to stop it is by crashing it. The normal shutdown process is the crash.

Let’s go a little deeper. You can imagine that commands and events follow “code paths” through software. For instance, when you summoned up this text, your browser followed a particular code path. And people who use browsers do this a lot, right? So you can bet your browser’s “load and render text” code path is fast, stable and bug-free.

But what about a much rarer code path? One that goes: “load and render text, but uh-oh, it looks like the data for the font outlines got corrupted halfway through the rendering process”? That basically never happens; it’s possible that that code path has never been followed. So it’s more likely that there’s a bug lurking there. That part of the browser hasn’t been tested much. It’s soft and uncertain.

One strategy to avoid these soft spots is to follow your worst-case code paths as often as your best-case code paths (without waiting for, you know, the worst case)—or even to make both code paths the same. And crash-only software is sort of the most extreme extension of that idea.
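
To make that concrete, here’s a minimal sketch of my own — not from the paper, with the class and file names invented for illustration — of a toy Python key-value store where the only startup path is recovery, and the only shutdown is a crash:

    # A sketch of the crash-only idea, using nothing beyond the Python
    # standard library. Everything here is hypothetical illustration.
    import json
    import os

    LOG_PATH = "store.log"  # append-only log; the only durable state

    class CrashOnlyStore:
        def __init__(self):
            self.data = {}
            self._recover()  # no "fresh start" path: booting IS recovering
            self.log = open(LOG_PATH, "a")

        def _recover(self):
            # Replay the log to rebuild state. Because this runs on every
            # startup, the recovery path gets exercised constantly,
            # not just after disasters.
            if os.path.exists(LOG_PATH):
                with open(LOG_PATH) as f:
                    for line in f:
                        try:
                            key, value = json.loads(line)
                        except ValueError:
                            break  # a torn final write from the last crash
                        self.data[key] = value

        def put(self, key, value):
            # Log first, then apply: a crash between the two steps is
            # harmless, because recovery replays the log.
            self.log.write(json.dumps([key, value]) + "\n")
            self.log.flush()
            os.fsync(self.log.fileno())
            self.data[key] = value

        def get(self, key):
            return self.data.get(key)

        # Conspicuously absent: close() or shutdown(). To stop the store,
        # you kill the process. The crash is the shutdown.

The toy itself isn’t the point; the point is that the worst-case path (recover after a crash) and the best-case path (start up normally) are literally the same code, so the scary path gets tested on every single run.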

Maybe there are biological systems that already follow this practice, at least loosely. I’m thinking of seeds that are activated by the heat of a forest fire. It’s like: “Oh no! Worst-case scenario! Fiery apocalypse! … Exactly what we were designed for.” And I’m thinking of bears hibernating—a sort of controlled system crash every winter.

What else could we apply crash-only thinking to? Imagine a crash-only government, where the transition between administrations is always a small revolution. In a system like that, you’d optimize for revolution—build buffers around it—and as a result, when a “real” revolution finally came, it’d be no big deal.

Or imagine a crash-only business that goes bankrupt every four years as part of its business plan. Every part of the enterprise is designed to scatter and re-form, so the business can withstand even an existential crisis. It’s a ferocious competitor because it fears nothing.

Those are both fanciful examples, I know, but I’m having fun just turning the idea around in my head. What does crash-only thinking connect to in your brain?

21 comments

Like a school of fish

I love little observations of the everyday like this one in Nick Paumgarten’s essay on elevators:

Passengers seem to know instinctively how to arrange themselves in an elevator. Two strangers will gravitate to the back corners, a third will stand by the door, at an isosceles remove, until a fourth comes in, at which point passengers three and four will spread toward the front corners, making room, in the center, for a fifth, and so on, like the dots on a die. With each additional passenger, the bodies shift, slotting into the open spaces. The goal, of course, is to maintain (but not too conspicuously) maximum distance and to counteract unwanted intimacies—a code familiar (to half the population) from the urinal bank and (to them and all the rest) from the subway. One should face front. Look up, down, or, if you must, straight ahead. Mirrors compound the unease.

This reminds me of what is quite possibly the best poetic description of riding the elevator, part III of T.S. Eliot’s “Burnt Norton” (from Four Quartets). In particular, it’s about the long elevator ride at the tube stop at Russell Square:

Here is a place of disaffection
Time before and time after
In a dim light: neither daylight
Investing form with lucid stillness
Turning shadow into transient beauty
With slow rotation suggesting permanence
Nor darkness to purify the soul
Emptying the sensual with deprivation
Cleansing affection from the temporal.
Neither plenitude nor vacancy. Only a flicker
Over the strained time-ridden faces
Distracted from distraction by distraction
Filled with fancies and empty of meaning
Tumid apathy with no concentration
Men and bits of paper, whirled by the cold wind
That blows before and after time,
Wind in and out of unwholesome lungs
Time before and time after.
Eructation of unhealthy souls
Into the faded air, the torpid
Driven on the wind that sweeps the gloomy hills of London,
Hampstead and Clerkenwell, Campden and Putney,
Highgate, Primrose and Ludgate. Not here
Not here the darkness, in this twittering world.

Descend lower, descend only
Into the world of perpetual solitude,
World not world, but that which is not world,
Internal darkness, deprivation
And destitution of all property,
Desiccation of the world of sense,
Evacuation of the world of fancy,
Inoperancy of the world of spirit;
This is the one way, and the other
Is the same, not in movement
But abstention from movement; while the world moves
In appetency, on its metalled ways
Of time past and time future.

(Why isn’t “Not here the darkness, in this twittering world” quoted more often?)

Another great bit from Paumgarten, which relates to my earlier “potatoes, paper, petroleum” observation about the 19th century:

The elevator, underrated and overlooked, is to the city what paper is to reading and gunpowder is to war. Without the elevator, there would be no verticality, no density, and, without these, none of the urban advantages of energy efficiency, economic productivity, and cultural ferment. The population of the earth would ooze out over its surface, like an oil slick, and we would spend even more time stuck in traffic or on trains, traversing a vast carapace of concrete.

A meta/editorial/critical note: Paumgarten’s essay has a regrettable B-story about a magazine employee who was trapped in an elevator. He dribbles it out graf by graf to create the illusion of dramatic tension. Speaking just for myself, I didn’t care; it also bothers me that this is becoming one of the default templates for magazine writing. Either find a reason to do it and do it well, or just… try something else.

2 comments

This is what sports liveblogging is for

Every sport, I believe, has its own optimal medium. For baseball, I like the intimacy of radio, and the timing and traditions of the medium lend themselves well to a sport driven by discrete, well-defined actions. Pro and college football actually work better on television than in person — unless you’re intoxicated, when all bets are off. Soccer, as this year’s World Cup proves, lends itself to Twitter’s ability to celebrate goals, talk trash, and complain about calls (or diving for calls) in truncated bursts. Basketball, hockey, and (usually) tennis have a combination of speed, intimacy, and crowd effect that makes the stadium experience hardest to beat or replicate.

But what about a tennis match like the one between John Isner and Nicolas Mahut at Wimbledon, which, thanks to a tiebreak-less final set and evening suspensions, spills over into more than ten hours and a third day? In such a case, stadium attendance and television alike become gruesome; you’re watching something that resembles a tennis match, but feels more like an all-night dance-a-thon. It’s horrible and fascinating at the same time. You can’t bear to watch, but you need periodic updates, because at any moment, something — anything — may happen.

Here, then, is the perfect sports experience for the liveblog. And here, too, The Guardian’s Xan Brooks is the master, riveting to read even in retrospect. Consider:

4.05pm: The Isner-Mahut battle is a bizarre mix of the gripping and the deadly dull. It’s tennis’s equivalent of Waiting For Godot, in which two lowly journeymen comedians are forced to remain on an outside court until hell freezes over and the sun falls from the sky. Isner and Mahut are dying a thousand deaths out there on Court 18 and yet nobody cares, because they’re watching the football. So the players stand out on their baseline and belt aces past each-other in a fifth set that has already crawled past two hours. They are now tied at 18-games apiece.

On and on they go. Soon they will sprout beards and their hair will grow down their backs, and their tennis whites will yellow and then rot off their bodies. And still they will stand out there on Court 18, belting aces and listening as the umpire calls the score. Finally, I suppose, one of them will die.

Ooh, I can see the football out of the corner of my eye. England still 1-0 up!

And, four and a half hours later:

8.40pm: It’s 56 games all and darkness is falling. This, needless to say, is not a good development, because everybody knows that zombies like the dark. So far in this match they’ve been comparatively puny and manageable, only eating a few of the spectators in between bashing their serves.

But come night-fall the world is their oyster. They will play on, play on, right through until dawn. Perhaps they will even leave the court during the change-overs to munch on other people. Has Roger Federer left the grounds? Perhaps they will munch on him, hounding him down as he runs for his car, disembowelling him in the parking lot and leaving Wimbledon without its reigning champion. Maybe they will even eat the trophy too.

Growing darker, darker all the while.

They are still tied at 59 all in the fifth and final set. This set alone is longer than any other match in tennis history. Play will resume tomorrow.

One comment

McChrystal’s secret strategy

There’s been a lot of noise about Gen. Stanley McChrystal’s Obama-badmouthing candor with Rolling Stone, but besides perhaps Colson Whitehead (“I didn’t know they had truffle fries in Afghanistan”), Andrew Fitzgerald at Current has distilled it to its essence better than anyone on the net: first substance (“Focusing on the few controversial remarks misses the point of this RS McChrystal piece. Really tough look at Afg.”), then snark (“Let’s say McChrystal is fired… How long before he shows up as a commentator on FNC? Is it months? Weeks? Hours?”).

When I saw this last tweet, I had an epiphany. All the commentators and journalists were wondering how McChrystal could have let this bonehead, 99%-sure-to-cost-your-job move happen. Did he think he was talking off the record? Was he blowing off steam? Did he think no one would find out? And if he wanted to trash the administration publicly, why in the world did he give this info to Rolling Stone? I mean, did he even see Almost Famous? (Is Obama Billy Crudup? I kind of think he is.)

But let’s just suppose that this was McChrystal’s intention all along. I pretty much buy the New York magazine profile of Sarah Palin, which lays out why she resigned her office: being governor of Alaska is a crummy, poorly paying job, her family was going broke fighting legal bills, and she was getting offers she couldn’t refuse. It’s like being an Ivy League liberal arts major who gets offered a job at Goldman Sachs right out of college; it’s not what you came there to do, but how are you going to let that go? (Besides, it isn’t like you have to know a ton about what you’re doing; you’re there for who you already are.) Also, Palin could do the new math of GOP politics in her head — public office is less important than being a public figure with a big platform. Or as Andrew says, “FNC commentator is the new Presidential candidate.”

Well, let’s try this equation: if it’s tough to be the governor of Alaska, how much harder does it have to be to be in charge of Afghanistan? What are the chances that you’re going to come out of this thing smelling like roses anyway? How can you remove yourself from that position while still coming off as an honorable, somewhat reluctant, but still passionate critic of the administration? And make a splash big enough doing it that it gets beyond policy circles and editorial pages?

I have no idea whether it’s true, but it’s worth entertaining the possibility that the good general threaded the needle here.

6 comments

Machines making mistakes

Why Jonah Lehrer can’t quit his janky GPS:

The moral is that it doesn’t take much before we start attributing feelings and intentions to a machine. (Sometimes, all it takes is a voice giving us instructions in English.) We are consummate agency detectors, which is why little kids talk to stuffed animals and why I haven’t thrown my GPS unit away. Furthermore, these mistaken perceptions of agency can dramatically change our response to the machine. When we see the device as having a few human attributes, we start treating it like a human, and not like a tool. In the case of my GPS unit, this means that I tolerate failings that I normally wouldn’t. So here’s my advice for designers of mediocre gadgets: Give them voices. Give us an excuse to endow them with agency. Because once we see them as humanesque, and not just as another thing, we’re more likely to develop a fondness for their failings.

This connects loosely with the first Snarkmarket post I ever commented on, more than six (!) years ago.

2 comments

Universal acid

The philosopher Dan Dennett, in his terrific book Darwin’s Dangerous Idea, coined a phrase that’s echoed in my head ever since I first read it years ago. The phrase is universal acid, and Dennett used it to characterize natural selection—an idea so potent that it eats right through established ideas and (maybe more importantly) institutions—things like, in Darwin’s case, religion. It also resists containment; try to say “well yes, but, that’s just over there” and natural selection burns right through your “yes, but.”

If that’s confusing, the top quarter of this page goes a bit deeper on Dennett’s meaning. It also blockquotes this passage from the book, which gets into the sloshiness of universal acid:

Darwin’s idea had been born as an answer to questions in biology, but it threatened to leak out, offering answers—welcome or not—to questions in cosmology (going in one direction) and psychology (going in the other direction). If [the cause of design in biology] could be a mindless, algorithmic process of evolution, why couldn’t that whole process itself be the product of evolution, and so forth all the way down? And if mindless evolution could account for the breathtakingly clever artifacts of the biosphere, how could the products of our own “real” minds be exempt from an evolutionary explanation? Darwin’s idea thus also threatened to spread all the way up, dissolving the illusion of our own authorship, our own divine spark of creativity and understanding.

Whoah!

(P.S. I think one of the reasons I like the phrase so much is that it seems to pair with Marx’s great line “…all that is solid melts into air.” Except it’s even better, right? Marx just talks about melting. This is more active: this is burning. This is an idea so corrosive it bores a channel to the very center of the earth.)

So I find myself wondering what else might qualify as a universal acid.

I think capitalism must. Joyce Appleby charts the course it took in her wonderful new book The Relentless Revolution. “Relentless” is right—that’s exactly what you’d expect from a universal acid. I think the sloshiness is also there; capitalism transformed not just production and trade but also politics, culture, gender roles, family structure, and on and on.

I suspect, much more hazily, that computation might turn out to be another kind of universal acid—especially this new generation of diffuse, always-available computation that seems to fuse into the world around us, thanks to giant data-centers and wireless connections and iPads and things yet to come.

But what else? Any other contemporary candidates for universal acid?

56 comments

Team Snuck

I’ve long agonized over snuck vs. sneaked. But the sly force and grinning vitality of this defense of the former—from the Paris Review!—puts me over the top. I’m sold. Snuck it is.

5 comments