Dappled

I am a sucker for a sun-dappled sidewalk, and it occurs to me that dappling is actually a pretty specific effect. You’ve seen images like this before: here’s a good look (with bonus Impressionist rendition). Overlapping tree-branches become cameras; how weird and how cool.


In the darkness

Darkness at night is such an obvious and easily-neglected thing, probably because it’s no longer a problem. Our cities, even our houses, are made safe and accessible by electric light (and before that, gas lamps, candles, etc.).

But remember your experience of night as a child, the confounding absoluteness of darkness, and you begin to understand a fraction of what night was like prior to modern conveniences. The conquering of night might be the greatest event in human history that nobody thinks of as an event, certainly of the past 200 years — right up there with the massive declines in infant and maternal mortality in childbirth or the emergence of professional sports.

Geoff Manaugh at BLDGBLOG lays it down with a tidy piece of paleoblogging by proxy:

Writing about the human experience of night before electricity, A. Roger Ekirch points out that almost all internal architectural environments took on a murky, otherworldly lack of detail after the sun had gone down. It was not uncommon to find oneself in a room that was both spatially unfamiliar and even possibly dangerous; to avoid damage to physical property as well as personal injury to oneself, several easy techniques of architectural self-location would be required.

Citing Jean-Jacques Rousseau’s book Émile, Ekirch suggests that echolocation was one of the best methods: a portable, sonic tool for finding your way through unfamiliar towns or buildings. And it could all be as simple as clapping. From Émile: “You will perceive by the resonance of the place whether the area is large or small, whether you are in the middle or in a corner.” You could then move about that space with a knowledge, however vague, of your surroundings, avoiding the painful edge where space gives way to object. And if you get lost, you can simply clap again.

Manaugh also thrills at Ekirch’s other discovery: “Entire, community-wide children’s games were also devised so that everyone growing up in a village could become intimately familiar with the local landscape.” Not only would you know your house in the dark, you would learn to know the architecture of your entire town. Manaugh asks:

But this idea, so incredibly basic, that children’s games could actually function as pedagogic tools—immersive geographic lessons—so that kids might learn how to prepare for the coming night, is an amazing one, and I have to wonder what games today might serve a similar function. Earthquake-preparedness drills?

Having spent most of the morning singing songs like “clean it up, clean it up, pick up the trash now” and “It’s more fun to share, it’s more fun to share,” I don’t find the idea of kids’ games as pedagogic tools to be such a leap, although the collectivity of the game and the bleakness of the intent give me a chill. “If the French come to try to burn this village at night, the children must know exactly where they are before they begin to run.” Cold-blooded! They probably all learned songs about how kitchen knives and pitchforks could be used against an enemy, too. “Every tool can kill, every tool can kill…”

It’s probably also a good idea, if you’ve got kids, to teach them a thing or two about their neighborhoods. Not to get all grumpy and old, but in the absence of the random-packs-of-children-roaming-the-town-alone parenting style I grew up with, kids are probably not picking up the landmarks by osmosis. What will they do when the zombies attack? Use GPS? Call a cab?


The inky swamp

Jonathan Harris, in one of his thoughtful photos-of-the-day:

I would like it if somebody worth emulating would give me a list of the 100 books that I need to read, in order to push and poke at my stiff sense of self until I am larger and more dynamic, expanded like a rubber balloon in 100 directions by 100 well-expressed world views.

With such a list, I would have no problem with a computerless cabin-bound existence, and I would never venture back to the swampland of the Smith Family bookstore, nor any other wetland like it, trudging through printed sprawl to look for pearls.

Two things. One: the photo-of-the-day, with a good caption, is really ideally internet-sized, isn’t it? Two: I admire the elegance of his articulation, but I disagree with Jonathan Harris’s destination. We’ve been stuck in cabins with too-short lists for too long! The printed sprawl is where the action is. Dive in, I say.


Brought to you by the Committee to Find and Rescue Alma

Apparently this short film by Pixar animator Rodrigo Blaas is only available for a limited time. Which is good, because otherwise an unlimited number of children would have their nightmares haunted forever.

(MetaFilterrific.)


Undercapitalized

Here’s an idea Malcolm Gladwell throws out in a long back-and-forth with ESPN’s Bill Simmons that’s mostly about the NBA. Simmons argues that career longevity, European imports, and overachieving young players have helped make the current NBA particularly well-stocked not just with talent, but with well-used talent. Gladwell makes one of his trademark Gladwellian connections:

What we’re talking about is what are called capitalization rates, which refers to how efficiently any group makes use of its talent. So, for example, sub-Saharan Africa is radically undercapitalized when it comes to, say, physics: There are a large number of people who live there who have the ability to be physicists but never get the chance to develop that talent. Canada, by contrast, is highly capitalized when it comes to hockey players: If you can play hockey in Canada, trust me, we will find you. One of my favorite psychologists, James Flynn, has looked at capitalization rates in the U.S. for various occupations: For example, what percentage of American men who are intellectually capable of holding the top tier of managerial/professional jobs actually end up getting a job like that. The number is surprisingly low, like 60 percent or so. That suggests we have a lot of room for improvement.

What you’re saying with the NBA is that over the past decade, it has become more and more highly capitalized: There isn’t more talent than before, but there is — for a variety of reasons — a more efficient use of talent. But I suspect that in sports, as in the rest of society, there’s still an awful lot of room for improvement.
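(A quick aside on the arithmetic: the capitalization rate Gladwell describes is just a ratio, the realized share of a talent pool. The 60 percent figure is his; the 6 million and 10 million below are made-up round numbers purely for illustration.)

$$
\text{capitalization rate} \;=\; \frac{\text{people actually in the occupation}}{\text{people capable of it}}\,,
\qquad \text{e.g.}\ \frac{6\text{ million}}{10\text{ million}} = 60\%.
$$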

Noam Scheiber, writing in The New Republic this week, says basically the same thing (about management, not basketball):

A lot of people talk about reviving the domestic manufacturing sector, which has shed almost one-third of its manpower over the last eight years. But some of the people I spoke to asked a slightly different question: Even if you could reclaim a chunk of those blue-collar jobs, would you have the managers you need to supervise them?

It’s not obvious that you would. Since 1965, the percentage of graduates of highly-ranked business schools who go into consulting and financial services has doubled, from about one-third to about two-thirds. And while some of these consultants and financiers end up in the manufacturing sector, in some respects that’s the problem. Harvard business professor Rakesh Khurana, with whom I discussed these questions at length, observes that most of GM’s top executives in recent decades hailed from a finance rather than an operations background. (Outgoing GM CEO Fritz Henderson and his failed predecessor, Rick Wagoner, both worked their way up from the company’s vaunted Treasurer’s office.) But these executives were frequently numb to the sorts of innovations that enable high-quality production at low cost. As Khurana quips, “That’s how you end up with GM rather than Toyota.”

In effect, what we’ve been doing in American industry is overpaying flashy ball hogs who put up great statistics but don’t know how to build teams or win games. In a similar vein, Umair Haque says that the whole model of a “leader” needs to be rethought, and what we really need are builders:

Leadership was built for 20th century economics. It’s a myth that leadership is a set of timeless skills. Is it? Abraham Zaleznik famously defined leadership as “using power to influence the thoughts and actions of other people.” Influence is the key word. The textbook skills of the “leader” — persuasion, delegation, coalition — aren’t universally applicable. Rather, they fit a very specific context best: the giant, evil, industrial-era organization.

Leaders don’t lead. How did this particular skillset emerge? Influence counts because the vast, Kafkaesque bureaucracies that managed 20th century prosperity, created, in turn, the need for “leaders”: people who could navigate the endlessly twisting politics at the heart of such organizations, and so ensure their survival. But leaders don’t create great organizations — the organization creates the leader. 20th century economics created a canonical model of organization — and “leadership” was built to fit it.

Haque actually doesn’t do a great job of articulating what a “builder” does differently, other than throwing out a few examples. (Yes, Obama isn’t as accomplished a builder as Gandhi — but saying that Gandhi “built” nonviolent resistance only scratches the surface.)

But if you put Haque’s new-economy critique together with Scheiber’s old-economy critique of current practices, you get something very powerful. The pre-managerial, heroic-age-of-capitalism industrialists of the 19th and early 20th centuries didn’t always build things that were good, from our perspective — but coalsmoke aside, they BUILT things, creating real capital and value along the way. It’s this fifty-year blip of late, uncreative capitalism, milking old property for its dregs and reshuffling money to create something from nothing, that has really screwed us up culturally.


A strong and slow boring of hard boards

All the arguing I’ve been doing over the health care proposal on the table, including with some of my closest friends, reminds me of this great Max Weber essay, “Politics as a Vocation”:

We must be clear about the fact that all ethically oriented conduct may be guided by one of two fundamentally differing and irreconcilably opposed maxims: conduct can be oriented to an ‘ethic of ultimate ends’ or to an ‘ethic of responsibility.’ This is not to say that an ethic of ultimate ends is identical with irresponsibility, or that an ethic of responsibility is identical with unprincipled opportunism. Naturally nobody says that. However, there is an abysmal contrast between conduct that follows the maxim of an ethic of ultimate ends–that is, in religious terms, ‘The Christian does rightly and leaves the results with the Lord’–and conduct that follows the maxim of an ethic of responsibility, in which case one has to give an account of the foreseeable results of one’s action.

You may demonstrate to a convinced syndicalist, believing in an ethic of ultimate ends, that his action will result in increasing the opportunities of reaction, in increasing the oppression of his class, and obstructing its ascent–and you will not make the slightest impression upon him. If an action of good intent leads to bad results, then, in the actor’s eyes, not he but the world, or the stupidity of other men, or God’s will who made them thus, is responsible for the evil. However a man who believes in an ethic of responsibility takes account of precisely the average deficiencies of people; as Fichte has correctly said, he does not even have the right to presuppose their goodness and perfection. He does not feel in a position to burden others with the results of his own actions so far as he was able to foresee them; he will say: these results are ascribed to my action. The believer in an ethic of ultimate ends feels ‘responsible’ only for seeing to it that the flame of pure intentions is not quelched: for example, the flame of protesting against the injustice of the social order. To rekindle the flame ever anew is the purpose of his quite irrational deeds, judged in view of their possible success. They are acts that can and shall have only exemplary value.

But even herewith the problem is not yet exhausted. No ethics in the world can dodge the fact that in numerous instances the attainment of ‘good’ ends is bound to the fact that one must be willing to pay the price of using morally dubious means or at least dangerous ones–and facing the possibility or even the probability of evil ramifications. From no ethics in the world can it be concluded when and to what extent the ethically good purpose ‘justifies’ the ethically dangerous means and ramifications.

There are all sorts of ways that you can pervert this idea of “responsibility,” from the assumption that responsible politicians are always hawkish (and non-hawks are correspondingly irresponsible) to the proposition that the stance of responsibility implies an excess of caution (an accusation that has dogged both Obama and John Kerry before him).

The proper sense, though, I think, is to recognize that politics, especially national politics, has a unique relationship to the life and death of citizens (both of one’s own state and of others) — and that an ethic of responsibility demands that one account for the consequences of policy on these terms. To meet that demand, a policy-maker will frequently have to compromise themselves ethically and politically (in the narrow sense of electoral politics). Here’s Weber again:

Politics is a strong and slow boring of hard boards. It takes both passion and perspective. Certainly all historical experience confirms the truth–that man would not have attained the possible unless time and again he had reached out for the impossible. But to do that a man must be a leader, and not only a leader but a hero as well, in a very sober sense of the word. And even those who are neither leaders nor heroes must arm themselves with that steadfastness of heart which can brave even the crumbling of all hopes. This is necessary right now, or else men will not be able to attain even that which is possible today. Only he has the calling for politics who is sure that he shall not crumble when the world from his point of view is too stupid or too base for what he wants to offer. Only he who in the face of all this can say ‘In spite of all!’ has the calling for politics.


The third vision

Now here you go! Take the best bits of that Sports Illustrated interactive magazine demo and Pictory, mash them up, add attractive depth-of-field and you get BERG’s vision for the future of the magazine:

I actually feel like it’s hard to judge, because there are two very significant confounding variables in the mix:

  • the warm, cinematic production, and
  • the device! I mean, look at that e-reader. I don’t care what kind of magazine you put on that thing—I’ll take it.

However, I’ve done the regression, and even when those elements are factored out, it’s still excellent. In particular, I love the concept of “heating” content. When content is cold, it sits on the page, crystalline and beautiful. When it’s warm, it bubbles and steams and you can pull it apart and push it around. Wonderful!

The strength of the video is really that it speaks—well I mean, specifically that Jack Schulze speaks. Compare it to this, the Microsoft equivalent, which is all mute gloss. What are the animating ideas? What can I extract from it, lacking wall-size screens and paper-thin LCDs here and now in 2010? Not much.

I do disagree with one premise of BERG’s, which seems to be that magazine-style content is generally Quite Good and just needs to be presented in a useful, modern way. I do think there’s demand for depth and design, of course. But increasingly, when I shift from screen-reading to magazine-reading it’s more than just the interface that stops me cold. It’s the voice. There’s a tone and distance to non-fiction magazine writing—even very good non-fiction magazine writing—that seems increasingly old-fashioned in 2010. If I were advising a magazine on strategy, I’d tell them to crack open the black box of content, of writing, and redesign that, too. (More to say about that at some point, but for now, scope out the BERG video.)

But really: this is all a side-show, because the star of the video is that table, isn’t it? I want one.


The feeds, the blogs, the tweets, the years

You have no doubt seen the Bygone Bureau’s excellent round-up of the best new blogs of 2009 with contributions from everyone Snarkmarket admires. Three notes:

First: All hail Joanne McNeil and her 749 feeds. This is, by the way, the correct use of Google Reader. It’s not an email inbox… it’s baleen.

Second: I like Andy’s mention of Offworld, because it’s a video game blog that did what the very best journalism does: not just track an existing conversation but start a whole new one. Offworld’s focus on indie games and crazy ludology is as much a sustained argument as it is a chronicle.

Third: My favorite new blog of 2009 is… Twitter. It’s remarkable how much it’s changed this year—for me, at least. It went from being a pleasantly random feed of banter to, really, the locus of my internet life. When I open a browser, I fire up Twitter in the first tab—even before Gmail. Wow!


No light, but rather darkness visible

The Morgan Library has a really excellent digital archive exhibition on its web site: a digital facsimile of the sole surviving manuscript of John Milton’s Paradise Lost, a fair copy of the first book.

One of the curious accidental consequences of the ubiquity of typing and copying technology today is that many people assume that “manuscript” refers to an early draft written in the author’s own hand. There are actually special terms for this: an autograph manuscript (a text an author writes him/herself), or sometimes holograph manuscript, which refers to a text written by the person who signed it. Autograph is the more common term in literary contexts, holograph in non-literary ones.

Manuscript by itself refers to any text written by a human hand. The most familiar examples of manuscripts not handwritten by their authors come from the pre-print era, when every text had to be copied out by hand. Print put an end to that most laborious (and glorious) form of manuscript reproduction.

But even in the early modern period, most manuscript copies of a text would not have been written by the author but recopied by a scribe or clerk. This arrangement persisted until the late nineteenth century, when the high-volume demands of modern businesses and the technological emergence of shorthand, the typewriter, and carbon copying put an end to the traditional secretary/copyist: usually a gentleman with a liberal education, at home in legal and diplomatic contexts, who wrote in a fine hand.

In Milton’s case, there was never an autograph copy of Paradise Lost:

Milton composed the ten books of Paradise Lost between 1658 and 1663. He had first planned the work as early as 1640, intending to write a tragedy titled Adam Unparadised. By 1652 he had become completely blind, probably due to glaucoma. Blindness forced him to compose orally, rendering him entirely reliant upon amanuenses (casual copyists among his friends and family circle) to whom he gave dictation. He composed the poem mostly at night or in the early morning, committing his composition to memory until someone was available to write down his words. He revised as his text was read back to him, so that a day’s work amounted to twenty lines of verse. According to contemporary accounts, when dictating, the poet “sat leaning backward obliquely in an easy chair, with his leg flung over the elbow of it” or “composed lying in bed in the morning.”

The only surviving manuscript of Paradise Lost is this 33-page fair copy, written in secretary script by a professional scribe, who probably transcribed patchwork pages of text Milton had dictated to several different amanuenses. This fair copy was corrected by at least five different hands under Milton’s personal direction and became the printer’s copy, used to set the type for the first edition of the book.

That’s one of the other fascinating things about early modern manuscript culture: there were multiple scripts that trained copyists used, almost like manual font sets, which dictated the shape and overall look of individual letters. These scripts varied from region to region and sometimes from one profession to another. The Secretary Script of the Paradise Lost manuscript, for example, was a form of Blackletter that flourished in England between the 14th and 18th centuries, until it was gradually displaced by the humanist scripts that had originated in Italy.

(This always delights me; not only did Renaissance humanists edit ancient texts, set up printing presses, transform education, and create great literature — they actually CHANGED THE WAY PEOPLE WROTE.)

But back to Milton! The survival of this partial manuscript also inadvertently reveals some of the difficulties writers faced in getting their books published in 17th-century England:

The Licensing Act, which was suspended during Cromwell’s term as lord protector, was renewed in 1662. Printers and publishers therefore required a license in order to legally print and distribute any book. Printing was authorized only when an imprimatur (Latin for “let it be printed”) was granted by the Stationers’ Company. The imprimatur for Paradise Lost appears on the inside cover (the first page of the manuscript in the digital facsimile). Soiled with ink smudges and compositor’s marks, printer’s copy manuscripts were customarily discarded or recycled after printing. In this case the presence of the imprimatur may account for the survival of Book 1—no manuscripts of the nine other books of Paradise Lost survive.

Milton sold Paradise Lost to the printer Samuel Simmons for £5. The contract is dated 27 April 1667; the book was published in late October or early November 1667. Although Milton had completed Paradise Lost by 1665, publication was delayed by a paper shortage caused by the Second Anglo-Dutch War, the Great Plague (during which over eighty London printers died), and the Great Fire of London, 1666, which destroyed many of the city’s presses. The absence of Simmons’s name on the earliest title pages indicates that he may have been unable to print the book himself. The title pages that do bear Simmons’s name do not give an address, suggesting that the printing of the first edition was assigned to Peter Parker.

Approximately thirteen hundred copies of the first edition were printed, with no fewer than six different title pages. Marketed at three shillings a copy, the first printing was sold out within eighteen months.

War, fire, plague, paper shortages, and Milton blind; this could make anyone wish “to justify the ways of God to men,” and imagine

A Dungeon horrible, on all sides round
As one great Furnace flam’d, yet from those flames
No light, but rather darkness visible
Serv’d onely to discover sights of woe,
Regions of sorrow, doleful shades, where peace
And rest can never dwell, hope never comes
That comes to all…

Most digital manuscript projects for the web get at least one thing wrong, but they’ve been getting better. In this case, I highly recommend using the full-screen viewer to examine individual pages… which is all you can do, because you can’t switch pages without jumping out of full-screen. D’oh!


Moving the goalposts

As much as I’m disappointed by the outcome, I have to love Nate Silver’s analogy/analysis of the failed push for a public option. Essentially, Silver’s take is that the core group of conservative Democrats were never going to accept a public option, said so early, repeated it whenever they were asked, and so a bill containing a public option was never going to beat a filibuster.

Suppose the following scenario plays out when you’re trying to buy a used car:

Dealer: The price of the car is $2,000.
You: For that beat-up Honda Accord? I’ll give you $1,200.
Dealer: Nope, it’s $2,000.
You: How about $1,500?
Dealer: I’m going to stick with $2,000.
You: Will $1,700 get it done?
Dealer: My best and final offer is $2,000.
You: Give a guy a break! $1,875?
Dealer: $2,000.
You: $1,995 and a free Slurpee coupon?
Dealer: Now we’re talking — step into my office.

Is that a negotiation in bad faith? Is the dealer moving the goalposts? No. He’s being very stubborn and very firm — but he’s also being very explicit about what he wants. It’s possible that you were an incompetent negotiator and that maybe if your first offer had come in a little lower, or a little higher, you could have gotten a better price. But more likely the dealer simply had more of the leverage and ultimately $2,000 is an acceptable price to you, even if it’s more than you were hoping to pay.

Progressives did just about everything in their power to try to get a decent public option into the bill. They threatened. They bargained. They complained. They organized. They persuaded. They begged. There was the opt-in, the opt-out, the trigger, the Medicare buy-in. There was no lack of initiative or creativity. And they actually had quite a bit of success: from 43 votes in August, they got up to perhaps as many as 48-52 for a strong-ish public option, and 57-59 for a weak-ish one. People like Kay Hagan, Tom Carper and Kent Conrad, to varying degrees, came on board.

But just because you perceive yourself as being in a negotiation with another party doesn’t entitle you to win that negotiation, or even to split things halfway. Sometimes your adversary doesn’t think there’s anything to negotiate at all. Sometimes they would in theory be willing to negotiate if you could find the right leverage point, but there’s nothing that fits the bill, for all your best efforts. Sometimes their first offer is pretty much as good as it’s going to get, and not merely a negotiating ploy.

What’s that? Oh, yes. Silver explicitly excludes Sen. Joe Lieberman (Prick — CT) from this analysis. Lieberman’s about-face on the Medicare buy-in proposal, motivated seemingly only by a desire to get payback from progressives who didn’t support him against Ned LaMont, would be comic if it didn’t play with people’s lives.
