This is from the introduction* to Steven Johnson’s Interface Culture, a book from 1997 that I hadn’t previously read:
A few final observations, and warnings, about the pages that follow. The first should be a comfort to readers who have tired of the recent bombast emanating from both the digital elite and their neo-Luddite critics. I have tried to keep this book as free of dogma and polemic as possible, emphasizing both the tremendous intellectual liberation of the modern interface and the darker, more sinister implications of that same technology.
From its outset this book has been conceived as a kind of secular response to the twin religions of techno-boosterism and techno-phobia. On the most elemental level, I see it as a book of connections, a book of links — one in which desktop metaphors cohabit with Gothic cathedrals, and hypertext links rub shoulders with Victorian novels. Like the illuminations of McLuhan’s electric speed, the commingling of traditional culture and its digital descendants should be seen as a cause for celebration, and not outrage.
This is likely to trouble extremists on both sides of the spectrum.** The neo-Luddites want you to imagine the computer as a betrayal of the book’s slower, more concentrated intelligence; the techno-utopians want you to renounce your ties to the fixed limits of traditional media. Both sides are selling a revolution — it’s just that they can’t agree on whether it’s a good thing. This book is about the continuities more than the radical breaks, the legacies more than the disavowals.
For that reason, the most controversial thing about this book may be the case it makes for its own existence. This book is both an argument for a new type of criticism and a working example of that criticism going about its business.***
* I added some extra paragraph breaks to the excerpt to make it read more like a blog post.
** Compare my “Bookfuturist Manifesto,” from The Atlantic.com, August 2010.
*** I pretty much want to be Steven Johnson right now.
There’s a semi-viral video that’s been kicking around for a couple of weeks titled “The Future of Publishing.” The schtick is that the same column of text, about the preferences of younger readers, gets read two ways: read it top to bottom and you get a sharply pessimistic, anti-book message, but roll the text back and read it on the ascent (get it?) and it turns out that the kids love traditional books after all.
It’s the sort of thing I’d usually link to here, but I was embarrassed for two reasons:
- It’s the sort of thing most of you (being who you are) had probably already seen elsewhere;
- I thought it was pretty silly. The contrast between the two POVs (let’s call them the young devil and the young angel) is so overbearing that it’s like a fight between two straw robots. Not every young person wishes books would die, and some young people still like books. Okay — and? It’s trivially true, without being truthful.
I’m finally linking to it, because Bob Stein has helped me name what it is: “a dream piece constructed to reassure middle-aged intellectuals that the seismic shifts which are upending life as we know it are not really happening.” PaidContent.org calls it “The viral video the publishing industry wants to believe.” It’s a feel-good fantasy cooked up for a sales conference, and you can’t even say that “the truth of the future of book reading is somewhere in between,” because there’s almost nothing that resembles the future of reading in the first take, and nothing that resembles the future of books in the second.
They’re two different versions of the bookservative fantasy: a dystopia and utopia that need each other to know what the other one looks like. And say what you will about the dystopian point-of-view (and I could and have said a lot), at least it is a version of the future. The utopia winds up being no future at all. It’s just a nowhere.
Some blogs written for university presses have gotten really good, featuring excerpts worth reading even if (especially if) you have no particular interest in plunking down beaucoup bucks for a hardcover scholarly book. For instance, here’s a choice bit from classics/philosophy prof Paul Woodruff’s The Necessity of Theater, featured at the website for Oxford University Press, which looks closely at both drama and sports (those two forms of theater, both American and Athenian):
Why does theater need a measured space? In order to practice the art of theater successfully, some people must be watching the actions of others. Whether your job tonight is to watch or be watched, you need to know which job is yours; the watcher-watched distinction is essential to theater. We shall see that even this can break down at the end of a theater piece, with marvelous consequences. But one of those consequences is that the event is no longer theatrical. When no one is watching, it’s not theater; it has grown into something else. Marking off space in theater is a device for meeting the need to distinguish the watcher from the watched. In most traditions there is a circle or a stage or sanctuary or a playing field…
“Sacred” is a word we have almost lost in modern times, like “reverence,” to which it is related in meaning. Sacred things and places call us to reverence, as do sacred times like the Sabbath; perhaps in our own century we are too alert to the dangers of idolatry to recognize that we are, still, surrounded by what we wordlessly take to be sacred. And Christians have come more and more to neglect the Sabbath. Like reverence, the sacred is best known in religious contexts, but, if we are to recognize it now, we must look for it also in the secular world, such as the football field. I will say that a place or an object or person is sacred if it is held to be untouchable except by people who are marked off, usually by ritual, so as to be allowed to touch it.
What makes theater sacred? Ritual, or a tradition based on ritual, defines the space and calls for penalties against those who violate it. All theater, football games and Antigone included, is the heir of a long line of spaces made sacred for religious ritual. Sometimes the space is permanently sacred, like the adyton, the un-enterable room in an old Greek temple. Sometimes it is sacred for the time of the event, and the boundaries of time and place work together. So it is with the stage, after a performance of Hamlet, if you are invited as a sponsor to a reception with the cast on the set. Nothing wrong now with setting foot on this space (although, if the performance was good, I dare you to step on the stage afterward without a shiver). So it is also with a trial at law. For the time of the trial the courtroom theater is sacred and may be entered only by designated people and used only according to certain rules.
Which leads me to question another kind of reverence at play here: why do these wry observations need to be in a book-length work, a monograph, for them to be taken seriously?
Let me back up. Before I read Woodruff’s excerpt, I also read Rohan Amanda Maitzen’s look at academic publishing over at The Valve, which includes 1) laments that nobody buys academic monographs, and 2) wonderment that blogs don’t seem to have really affected either the purchasing or accreditation habits of academics much.
Not everything in Maitzen’s post is in her voice, but it’s a good round-up — for instance, here she quotes Cathy Davidson:
If we believe in what we do (and I happen to be a believer), we should be writing for readers, first of all, and, second, we should be reading one another’s work and, third, we should be teaching it. Right now, a sale of 300 or 400 copies of a monograph is a lot. That’s appalling. The result, materially, is that we do not pay our own way and certainly not that of junior members of our profession. Intellectually, our students never learn the value of the genre of the monograph because we teach excerpts in our courses, even our graduate courses. We do not teach the kind of extended, nuanced thinking that goes into the genre that our very graduate students will have to produce for tenure. We say the scholarly monograph represents the epitome of our profession and a hurdle to “lifetime employment” at a research university. So we do not practice what we preach, adding to the crisis in scholarly publishing and the crisis in the profession of English in particular.
Now, note that slippage: the need for “extended, nuanced thinking” actually turns out to matter primarily because it’s required for tenure. Monographs remain absolutely essential to the legitimation rituals of academia (especially the PhD and tenure), even as they’ve diminished in importance for readers both in and out of the scholarly spheres. They’re only important for designating who gets to go inside the temple. They don’t do anything to maintain the relationship with the audience.
This is something I wrestle with frequently: when is a “book” necessary? Particularly as a “work” is now more frequently coming to mean an ongoing project composed of many, many individual pieces of writing, which are extended and nuanced and interlinked but frequently not a single thing with a clearly defined architecture.
In short, the book is not always necessary. In fact, it sometimes isn’t even a book.
But when it is, it should be one deliberately — not merely to invoke a ritual of time or space or authorship, but to genuinely fulfill all of those demands. As Mallarmé would say, the book should attempt the impossible and abolish chance. How can we do that? Where do we begin?
Today’s a day for thinking about brains, plasticity, and renewal. At least in the pages of the New York Times.
First up is Barbara Strauch, who writes on new neuroscientific research into middle-aged brains:
Over the past several years, scientists have looked deeper into how brains age and confirmed that they continue to develop through and beyond middle age.
Many long-held views, including the one that 40 percent of brain cells are lost, have been overturned. What is stuffed into your head may not have vanished but has simply been squirreled away in the folds of your neurons.
One explanation for how this occurs comes from Deborah M. Burke, a professor of psychology at Pomona College in California. Dr. Burke has done research on “tots,” those tip-of-the-tongue times when you know something but can’t quite call it to mind. Dr. Burke’s research shows that such incidents increase in part because neural connections, which receive, process and transmit information, can weaken with disuse or age.
But she also finds that if you are primed with sounds that are close to those you’re trying to remember — say someone talks about cherry pits as you try to recall Brad Pitt’s name — suddenly the lost name will pop into mind. The similarity in sounds can jump-start a limp brain connection. (It also sometimes works to silently run through the alphabet until landing on the first letter of the wayward word.)
That’s a wonderful technique, all the more so because it sounds like something Cicero might have invented.
Next comes a second piece, this one on children’s brains and the plasticity of early development:
We are born with a highly structured brain. But those brains are also transformed by our experiences, especially our early experiences. More than any other animal, we humans constantly reshape our environment. We also have an exceptionally long childhood and especially plastic young brains. Each new generation of children grows up in the new environment its parents have created, and each generation of brains becomes wired in a different way. The human mind can change radically in just a few generations.
These changes are especially vivid for 21st-century readers. At this very moment, if you are under 30, you are much more likely to be moving your eyes across a screen than a page. And you may be simultaneously clicking a hyperlink to the last “Colbert Report,” I.M.-ing with friends and Skyping with your sweetheart.
We are seeing a new generation of plastic baby brains reshaped by the new digital environment. Boomer hippies listened to Pink Floyd as they struggled to create interactive computer graphics. Their Generation Y children grew up with those graphics as second nature, as much a part of their early experience as language or print. There is every reason to think that their brains will be as strikingly different as the reading brain is from the illiterate one.
Should this inspire grief, or hope? Socrates feared that reading would undermine interactive dialogue. And, of course, he was right: reading is different from talking. The ancient media of speech and song and theater were radically reshaped by writing, though they were never entirely supplanted, a comfort perhaps to those of us who still thrill to the smell of a library.
But the dance through time between old brains and new ones, parents and children, tradition and innovation, is itself a deep part of human nature, perhaps the deepest part. It has its tragic side. Orpheus watched the beloved dead slide irretrievably into the past. We parents have to watch our children glide irretrievably into a future we can never reach ourselves. But, surely, in the end, the story of the reading, learning, hyperlinking, endlessly rewiring brain is more hopeful than sad.
Put these two together, and you get a picture that’s even more hopeful. Our brains aren’t just plastic over the span of human evolution or historical epochs, but over individual lives. It might be easier and feel more natural for children, whose brains seem to us to be nothing but plasticity. But we don’t just have a long childhood — to a certain extent, our childhood never ends.
Human beings are among the only species that evolved to thrive in any kind of climate and terrain on the planet. (Seriously; underwater is the only real exception.) Compared to that, summoning the plasticity required to engage with any new kind of media is a piece of cake.
I do not like dreamy fashion spreads in magazines even a little bit, but I liked this thing—what to call it?—a lot. It has a soundtrack and fun, motion-graphics-y transitions between photos. Both elements are deployed thoughtfully; if the music was wrong, or the transitions too slow, the whole thing would collapse. As it is, I think it’s moody and really, uh, clickable.
I want to view content like this on my unicorn!
I liked this piece by Sam Anderson about the shift in novels in the 2000s. Partly it’s because of our shared affinity for The Brief Wondrous Life of Oscar Wao. But mostly it’s because he sounds, throughout, rather like a bookfuturist:
TV, in comparison, looks like a fairly simple adversary: Its flickering images lure readers away from books altogether. The Internet, on the other hand, invades literature on its home turf. It has created, in the last ten years, all kinds of new and potent rival genres of reading—the blog, the chat, the tweet, the comment thread—genres that seem not only to siphon our attention but to change the way our brains process text.
Also, I have to say, this rings 100% true for me:
Books formed under the attentional pressure of the Internet tend to devote disproportionate energy to style; if you can’t assume that your readership is going to stick with you beyond a paragraph or two, it’s probably smart to load that paragraph with maximum pizzazz.
I’ve started calling this “the paranoia of the screen.” It doesn’t necessarily serve you well on the printed page; I think readers of Annabel Scheme might find themselves wondering, like, what’s the rush? But it does flow naturally from writing blog posts and reading Jakob Nielsen. (Eep!)
(Also, love the meta-awareness here: Anderson’s bit about paragraph-level pizzazz is itself a gem of a graf.)
Following Anjali’s suggestion, I steered over to this post, “Bookshops are not dead. Long may it remain so.” Like me, James Higgs reacted negatively to Basheera Khan’s “No more bookshops? Good riddance.” There are some really good points in Higgs’s criticism, and I particularly like this one:
The binding and physical form of the book is an intrinsic part of its content, rather like the frame in a Howard Hodgkin painting. (Another example: James Joyce once made a fuss over the size of a full-stop in Ulysses.) You very much should judge a book by its cover.
Saying that a book can be reduced to a screen is the same thing as saying that a JPEG of Les Demoiselles d’Avignon is as good as the original. Thank heavens for the day when we won’t be made to traipse around a physical space, but can have masterworks beamed into our houses, eh?
This, I think, is one of the major tensions the e-book will have to resolve, or at least develop alternative solutions to, in the years to come. Do we want a perfectly fungible object that the reader is free to resize and redesign according to their own tastes and needs, or do we want a through-designed, screen-independent object that preserves the aesthetic and visual choices of the author and designer? It’s not that digital can’t do the latter; look at iTunes’s recent attempt to bring back album art with iTunes LP, which at least beats the restricted visuals of the CD. But e-books to date have largely not provided for that possibility, have not sought to create those kinds of objects. Which gives printed books the aesthetic high ground.
My bigger worry, though, with criticisms like Higgs’s, is the following:
- “the experience of reading a book is fundamentally different from reading a text on a reading device. Many – and I’d contend that these are mainly people who are not compulsive readers – will not care about this distinction, but this is the market that successful booksellers are targeting.”
- “Borders and Books etc are in trouble because they are not good bookshops. There is little to distinguish one shop from the next and, on the whole, their staff are not knowledgeable about the books they sell. They clearly don’t read reviews, or subscribe to major literary periodicals.”
- “Most people don’t read seriously, and for them, these arguments will make no sense. But for the millions of people who do read compulsively, eReaders are not going to be universally welcomed.”
Now, I don’t care about the elitism in Higgs’s arguments. I’m an elitist reader, too, and I probably like the same books that he likes and would like the same bookshops and be frustrated by the same things in other readers. I do, however, object to the assumptions that
- the kinds of texts you like are inherently connected to the kinds of technology you like;
- that people who prefer either texts or technologies different from those you prefer are not “real” readers, not committed or compulsive or serious readers;
- that the class of readers you belong to is uniquely positioned to determine what the future of reading will look like, or at least ought to.
It’s this argument from authenticity that bothers me most. What’s more, it’s the same trick Khan tries to pull in her post; Khan thinks that “serious readers” would rather carry a thousand books than a handful, that they prefer the library to the bookshop — in short, that they look and act like she does.
I will say it again: reading includes many, many, many things, in every context. If we’re serious about charting the future of reading, rather than advocating for our particular preferences, we have to try to understand and account for all of them, and to do so with as few assumptions and as much good faith and openness to possibility as we can.
After all, the fact that a JPEG of Les Demoiselles d’Avignon is fundamentally different from and not as good as the original is only an argument for preserving paintings; it isn’t an argument for abolishing JPEGs of them, or for not caring about their quality.
Basheera Khan’s post for the Telegraph, “No more bookshops? Good riddance” is as clear an articulation of the technofuturist position on the future of reading as I could imagine. Here’s the key section:
I’m happy to see the back of bookshops, and not just because the paper publishing industry is inherently wasteful of natural resources.
When you buy a book, you’re not just paying for a few hundred bound pages. You’re paying for the bookshelves you will need to store your books. The time it takes to dust said shelves. The effort and cost of lugging books and shelves if you happen to move house. The psychological debt that builds every time you survey all the books you bought over the years, on a whim, because they were cheap, but which remain unread — because with all the will in the world, there’s just no way to read every book you may want to.
I’ve been slowly divesting myself of the staggering piles of books I accrued when I still bought into the notion that to read a book you had to own it. I look forward with immense relief to the day when all my books are ebooks – light as a feather.
Actually, it’s not quite pure technofuturism. There’s a sop for bookservatives, too:
To people who bemoan the loss of bookshops as a loss to society, I say this: there’s already a place where you can go to find books you simply have to read in physical form. It’s a place where you can browse to your heart’s content, meet friends, take your kids, and do everything you did at your bookshop. It’s called the library. When did you last visit yours?
This is actually a weird binary, almost as weird as the one between bookservatives and technofuturists. Almost every full-throated embrace of technical/social change needs a still point, something that remains unchanged and which can still serve the function of whatever’s being swept aside.
In Khan’s post, it’s the library. It might be self-contradictory — shifting the locus of physical books to the library seems to solve the “hard to move house” and “annoying to dust” problems, but not necessarily the “inherently wasteful of natural resources” or the “I can’t read all of the books!” ones — but that doesn’t matter. She needs libraries. They’re a safety valve. And by praising libraries (and damning bookstore aficionados for not using them) she out-bookserves the bookservatives.
In the same way, folks who want to get rid of all of the physical books in a library would say, “if you still want a physical book, there’s always the bookstore.”
P.S.: If this “technofuturist”/“bookservative” language gets obnoxious or reductive, please tell me. Again, I want to advance “bookfuturist” as an alternative to both of these positions, so if I seem stuck on the words, that’s why.
Wow! How much do I love the fact that James Bridle actually cut a little window out of a book and made—what would you call it? A video sketch? A suggestive object? It’s not a prototype, per se; it just hints at something. Makes you go hmm. I like it.
(His inspiration was that iPhone-powered storybook that we mentioned earlier.)
On Nov 13th, Jason Kottke asked
Why doesn’t anyone talk about bacterial marketing? Or hookworm infestational media?
I replied:
@jkottke viruses make a better metaphor; they need a host’s cellular architecture to replicate their own DNA. Also AIDS put viruses on map.
Just a couple of days later, I became very, very sick.
It turned out I had an infected abscess, around a hematoma in my lower back. I’d been trying since October — and if you remember, I was seeing a lot of doctors in October — to get a physician to take this swelling seriously, to say something other than “Wow, would you look at that?” or “Let’s just wait a few weeks to see if it goes down on its own.” Now it had almost killed me. It’s like my accident had finally found a way to get at my insides.
On Thursday, I was admitted to the hospital (again), to get the infection cleared up. This ultimately required not just dose after dose of antibiotics, but also surgery. Actually, two surgeries so far, and a third tomorrow. It’s not closed yet; I’ve got a little vacuum pump sucking my incision dry. But no more little hunchback. And no more fevers or explosive bouts of illness. And a good chance I’ll be discharged in time for Thanksgiving.
I’ve had it with hospitals. After this year, I hope I don’t see the inside of one for another ten. I think I’m due a break.
Anyways, I wanted to explain my long Snark-absence. This is my first night with the computer, which also feels pretty good.
Because something has been growing inside me besides just bacteria. (Eww. Where’s this going?)
AN IDEA. I have an idea!
It comes from Joanne McNeil’s name for her Twitter list of word nerds who like to think about books and new media: “bookfuturism.”
More to the point — bookfuturists.
I love it because the first word modifies the second as much as the other way around. A futurist (in the original sense) wants to burn down libraries. A bookfuturist wants to put video games in them. (And he wants one of those video games to be Lego Hamlet.)
A bookfuturist, in other words, isn’t someone who purely embraces the new and consigns the old to the rubbish heap. She’s always looking for things that blend her appreciation of the two. (The bookfuturist might be really into steampunk.)
The bookfuturist is deeply different from the two people he might otherwise easily be mistaken for — the technofuturist and the bookservative. Technofuturists and bookservatives HATE each other. Bookfuturists have some affection for each of them, even if they both also drive him nuts.
What do I mean by “technofuturists” and “bookservatives”? Well, I can show you.
Bookservatives talk like this:
Accompanying this plague [of bookstore closings] is a feel-good propaganda campaign that enjoys the collusion of the major media outlets, including such true hi-tech believers as the NY Times and NPR—print and broadcast venues that are themselves cheerily being rendered obsolete by the hi-tech rampage—and that in subtle ways positions the destruction of book culture like so: “books” in and of themselves are nothing, only another technology, like the Walkman or the laptop. What is sacred are the texts and those are being transferred to the Internet where they will attain a new kind of high-tech-assured immortality. Like dead souls leaving their earthly bodies the books are, in effect, going to a better place: the Kindle, the e-book, the web; hi-tech’s version of Paradise…
The book is fast becoming the despised Jew of our culture. Der Jude is now Der Book. Hi-tech propagandists tell us that the book is a tree-murdering, space-devouring, inferior form of technology; that society would simply be better off altogether if we euthanized it even as we begin to carry around, like good little Aryans, whole libraries in our pockets, downloaded on the Uber-Kindle.
Further, we are told that to assign to books a particular value above and beyond their clearly inferior utility as a medium for language is to mark oneself as an irrelevant social throwback. And then, goes the narrative, think of the extraordinary sleekness, efficiency and amplitude of a Kindle, where thousands of texts lie at your fingertips. Which teen or twenty something in their right mind is going to opt for paper over electronic texts? No one of course. That’s just the way of evolution, goes the narrative. Publishers and readers, writers and agents, are well-advised to get with this truth or perish. As to the bookstore, it is like the synagogue under Hitler: the house of a doomed religion. And the paper book is its Torah and gravestone: a thing to burn, or use to pave the road to internet heaven…
The advent of electronic media to first position in the modern chain of Being—a place once occupied by God—and later, after the Enlightenment, by humans—is no mere 9/11 upon our cultural assumptions. It is a catastrophe of holocaustal proportions. And its endgame is the disappearance of not just books but of all things human.
Technofuturists can get nearly as apoplectic, but they’re winning most of the fights these days, so most of them sound like this:
I am utterly perplexed by intelligent and innovative thinkers who believe a connected world is a negative one. How can we lambast new technology, transition and innovation? It’s completely beyond my comprehension.
It is not our fear of information overload that stalls our egos, it’s the fear that we might be missing something. Seeing the spread of social applications online over the past few years I can definitively point to one clear post-internet generational divide.
The new generation, born connected, does not feel the need to consume all the information available at their fingertips. They consume what they want and then affect or change it, they add to it or negate it, they share it and then swiftly move along the path. They rely on their community, their swarm, to filter and share information and in turn they do the same; it’s a communism of content. True ideology at its best…
Frank Schirrmacher asks the question “what is important, what is not important, what is important to know?” The answer is clear and for the first time in our existence the internet and technology will allow it: importance is individualism. What is important to me is not important to you, and vice versa. And individualism is the epitome of free will. Free will is not a prediction engine, it’s not an algorithm on Google or Amazon, it’s the ability to share your thoughts and your stories with whomever wants to consume them, and in turn for you to consume theirs. What is important is our ability to discuss and present our views and listen to the thoughts of others…
As someone born on the cusp of the digital transition, I can see both sides of the argument but I can definitively assure you that tomorrow is much better than yesterday. I am always on, always connected, always augmenting every single moment of my analog life and yet I am still capable of thinking or contemplating any number of existential questions. My brain works a little differently and the next generation’s brains will work a little differently still. We shouldn’t assume this is a bad thing. I for one hold a tremendous amount of excitement and optimism about how we will create and consume in the future. It’s just the natural evolution of storytelling and information.
I mean = it’s not THAT either, is it?
And yet = there are clear outlets — clear markets — for both of these sentiments and styles. They both LIKE arguing against the other. A more sophisticated point of view — which is also not just that of the disinterested critic, or the market watcher, or the tech insider — where is the space for that, really? Where is the community?
There are a lot of us — Joanne’s list is a decent place to start — mostly writing on blogs, on Twitter, trying to figure this out.
Stay tuned, Snarkkinder. I’ve got something cooking on this. Let’s keep thinking about this together.