September 5, 2009
Jamais Cascio on devices that pay attention:
Imagine a desktop with a camera that knows to shut down the screen and eventually go to sleep when you walk away (but stays awake when you're sitting there reading something or thinking), and will wake up when you sit down in front of it (no mouse-jiggling required).
Or a system with a microphone that listens for the combination of a phone ringing (sudden loud noise) followed by a nearby voice saying "hello" (or similar greeting), and will mute the system automatically.
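The ring-then-greeting heuristic above boils down to simple timing logic. Here's a hypothetical sketch (the loud-noise and greeting detectors themselves are assumed to exist; `AutoMuter` and everything in it is made up for illustration):

```python
import time

# Hypothetical sketch of "phone rings, someone says hello, mute the system."
# Detection of the events is assumed; this only wires up the timing rule:
# mute when a greeting follows a sudden loud noise within a short window.
MUTE_WINDOW_SECONDS = 5.0

class AutoMuter:
    def __init__(self, mute_fn, window=MUTE_WINDOW_SECONDS):
        self.mute_fn = mute_fn          # callback that actually mutes audio
        self.window = window
        self.last_loud_noise = None     # timestamp of last "ring"-like event

    def on_loud_noise(self, now=None):
        self.last_loud_noise = now if now is not None else time.time()

    def on_greeting(self, now=None):
        now = now if now is not None else time.time()
        if self.last_loud_noise is not None and now - self.last_loud_noise <= self.window:
            self.mute_fn()

muted = []
m = AutoMuter(mute_fn=lambda: muted.append(True))
m.on_loud_noise(now=100.0)   # phone rings
m.on_greeting(now=102.5)     # "hello" 2.5 seconds later -> mute fires
print(bool(muted))  # True
```

The hard part, of course, is the detectors, not the rule; but the rule is the kind of thing existing hardware could run today.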
When you go down this road, extrapolating from existing abilities (accelerometers, face and voice recognition, light detection) to more complex algorithms, the possibilities get correspondingly more complicated:
What prompted this line of thought for me was the story about the Outbreaks Near Me application for the iPhone. It struck me that a system that provided near-real-time weather, pollution, pollen, and flu (etc.) information based on watching where you are -- and learning where you typically go, to give you early warnings -- was well within our capabilities.
Or a system that listened for coughing -- how many different voices, how often, how intense, where -- to add to health maps used by epidemiologists (and other mobile apps).
It seems to be almost an axiom that the applications of digital technology that are potentially the most beneficial for the aggregate likewise require the most information from the individual user - and therefore creep us out to the point where we're reluctant to put them into practice. There's got to be a name for this paradox - a digital analogue to The Fable of the Bees.
September 4, 2009
Institutions Of Reading
What is happening here?
This year, after having amassed a collection of more than 20,000 books, officials at the pristine campus [of Cushing Academy] about 90 minutes west of Boston have decided the 144-year-old school no longer needs a traditional library. The academy’s administrators have decided to discard all their books and have given away half of what stocked their sprawling stacks - the classics, novels, poetry, biographies, tomes on every subject from the humanities to the sciences. The future, they believe, is digital.
“When I look at books, I see an outdated technology, like scrolls before books,’’ said James Tracy, headmaster of Cushing and chief promoter of the bookless campus. “This isn’t ‘Fahrenheit 451’ [the 1953 Ray Bradbury novel in which books are banned]. We’re not discouraging students from reading. We see this as a natural way to shape emerging trends and optimize technology.’’
Instead of a library, the academy is spending nearly $500,000 to create a “learning center,’’ though that is only one of the names in contention for the new space. In place of the stacks, they are spending $42,000 on three large flat-screen TVs that will project data from the Internet and $20,000 on special laptop-friendly study carrels. Where the reference desk was, they are building a $50,000 coffee shop that will include a $12,000 cappuccino machine.
And to replace those old pulpy devices that have transmitted information since Johannes Gutenberg invented the printing press in the 1400s, they have spent $10,000 to buy 18 electronic readers made by Amazon.com and Sony. Administrators plan to distribute the readers, which they’re stocking with digital material, to students looking to spend more time with literature.
Earlier today I wrote: "This... is not about technology or pedagogy, but remodeling - and only accidentally the other things."
I should note that I mean exactly the opposite of "this is no big deal." To clarify - eliminating the stacks in favor of a) a coffee shop and b) spaces for laptops and digital readers is a reorganization of space and expenditure of money, for which the technological and pedagogical commitments and consequences, while absolutely real, have not been fully thought through.
Let's consider some of them. Well, let's start with just one.
At present, the Kindle, like all other electronic readers, is conceived, designed, and marketed as a consumer object. It is designed for individual readers to purchase individual libraries of digital books, which they can then carry with them anywhere. Its closest analogue in the technological world is probably the digital music and media player. Its closest analogue in the history of reading is the consumer-owned paperback book.
For the most part, we've internalized and naturalized this mode of reading and all of its rituals. Readers are people who own books. But those aren't the only kind of readers, and (maybe more importantly) that's not the only kind of reading.
We read books that aren't ours. We read stray pieces of paper that are shoved in front of our face and then thrown or tucked away. We read maps and charts posted on walls, newspapers left on chairs, business cards handled and filed, forms that we fill out and return, post-it notes that we wrote as reminders to ourselves weeks and months ago. And a hundred and one other things in a thousand and one different ways.
The Kindle models the reading behavior and rituals of the mainstream owner of books, who is also, not accidentally, the mainstream customer of Amazon.com. While there is considerable demographic and behavioral overlap between this person and the library patron, the rituals of use are actually quite different. Here are a few things to look at:
- Most books (and nonbooks) in libraries are intended and frequently designed to be read by many different people over a long range of time. To use the language of kitchen appliances, they're commercial-grade items. To use the language of IT, books in libraries are terminals or workstations, not PCs. But there's no such thing (yet) as a multi-user, workstation Kindle.
- We usually privilege the big library ritual of picking out a book, checking it out, and taking it home, but most library materials are designed to be read in-place. Rare materials, noncirculating reference, the old card catalog, and of course, books you look up and thumb through, maybe even make some notes or photocopy a few pages from, and return to be shelved. Some of this reading, e.g., searching a library's entire catalog, a computer terminal performs admirably. But a Kindle doesn't actually do this very well. Its chief asset, portability, actually works against it; and when you tether a Kindle to a particular building, you've eliminated much of its function altogether.
- Libraries are collections, typically quite specialized ones, optimized in terms of audience (public, research, youth) if nothing else. And then there are collections within collections - subject wings and reading rooms. Kindles are omnibus devices, offering no particular specializations. In fact, you CAN'T make the Kindle a specialized device, either in its hardware or its software, because no particular reading specialization has a dominant foothold. (This is my impression, anyways; I'd love to hear otherwise).
In short, when it comes to electronic reading machines, there is no equivalent to the library stacks or the computer workstation. There is also no real equivalent to the newsstand or bulletin board, the teacher's chalkboard, or the family message board. Everything is geared towards the individual reader-owner.
One of my favorite talks, which I return to again and again whenever I'm trying to figure out consumer electronic media, is the joint interview Bill Gates and Steve Jobs gave with Walt Mossberg and Kara Swisher a few years ago. I'm paraphrasing, but one of the things Gates talks about is the different spaces of experiencing digital technology. The "four-foot experience" - whether you're watching TV or gaming or watching a movie - is fundamentally different from that of the office PC or the laptop or the handheld. They're reciprocally different. They require different technologies, different interfaces, to match their different possibilities and inherited rituals.
We haven't figured this out for digital readers yet - how to vary the hardware and software to match the different possibilities and rituals of reading in different contexts. We don't have the ordinary library experience, or classroom experience, let alone the Library of Congress experience. In that vacuum, the only thing you can recreate when you pull out the stacks is a coffeeshop or cybercafe. There is nothing else to offer.
By the way, please vote - today! - for our SxSW panel on Kindle 2020. This is one of the things we'll be talking about.
(Below the fold is the point in the thread where I can become a prematurely old man.)
File under: Books, Writing & Such, Technosnark
August 30, 2009
Your Future Portaphone
He hasn't posted a ton lately, and really, going after mobile phones is low-hanging fruit, but I was still delighted with today's look at portable phones (from a 1976 book titled Future Facts). It includes this quote:
For a while at least, the portaphone will remain a business tool or luxury item. In time, however, portaphones will get smaller and cheaper, just as transistor radios have.
First: "portaphones!" When did we stop applying multisyllabic prefixes to words? Probably around the same time "port-a" became uniquely associated with outdoor toilets.
Second: today, we would almost certainly have to reverse that analogy: "Over time, transistor radios became smaller and cheaper, just as cellular phones have today." I consider this a sign of the analogy's intrinsic merit.
Last: it's easy to look at old predictions of the future with awe at what they get right and glee at what they get wrong. But these predictions should be taken seriously as symptoms. They show how the past dreamed itself, and indeed, how it dreamed the present, in all of its possibilities and constraints, into being.
File under: Gleeful Miscellany, Object Culture, Technosnark
Scholars To Google: Your Metadata Sucks
Geoff Nunberg at Language Log on one of the biggest problems for scholarly use of Google Books:
It's well and good to use the corpus just for finding information on a topic — entering some key words and barrelling in sideways. (That's what "googling" means, isn't it?) But for scholars looking for a particular edition of Leaves of Grass, say, it doesn't do a lot of good just to enter "I contain multitudes" in the search box and hope for the best. Ditto for someone who wants to look at early-19th century French editions of Le Contrat Social, or to linguists, historians or literary scholars trying to trace the development of words or constructions: Can we observe the way happiness replaced felicity in the seventeenth century, as Keith Thomas suggests? When did "the United States are" start to lose ground to "the United States is"? How did the use of propaganda rise and fall by decade over the course of the twentieth century? And so on for all the questions that have made Google Books such an exciting prospect for all of us wordinistas and wordastri. But to answer those questions you need good metadata. And Google's are a train wreck: a mish-mash wrapped in a muddle wrapped in a mess.
The devil here is in the details - Nunberg goes on to list dates and categories that aren't accidentally, but systematically misapplied, in wild, impossible fashion. There's a great discussion after the post, too - not to be missed.
It's actually surprising that this is such a problem, considering that the bulk of Google Books's collection is gathered from major research libraries, who DO spend a lot of time cataloguing this stuff for themselves. What happened?
In discussion after my presentation, Dan Clancy, the Chief Engineer for the Google Books project, said that the erroneous dates were all supplied by the libraries. He was woolgathering, I think. It's true that there are a few collections in the corpus that are systematically misdated, like a large group of Portuguese-language works all dated 1899. But a very large proportion of the errors are clearly Google's doing. Of the first ten full-view misdated books turned up by a search for books published before 1812 that mention "Charles Dickens", all ten are correctly dated in the catalogues of the Harvard, Michigan, and Berkeley libraries they were drawn from. Most of the misdatings are pretty obviously the result of an effort to automate the extraction of pub dates from the OCR'd text. For example the 1604 date from a 1901 auction catalogue is drawn from a bookmark reproduced in the early pages, and the 1574 dating (as of this writing) on a 1901 book about English bookplates from the Harvard Library collections is clearly taken from the frontispiece, which displays an armorial bookplate dated 1574...
[It's like that joke from Star Trek VI: "not every species keeps their genitals" (by which I mean, metadata) "in the same place."]
After some early back-and-forth, Google decided it did want to acquire the library records for scanned books along with the scans themselves, and now it evidently has them, but I understand the company hasn't licensed them for display or use -- hence, presumably, the odd automated stabs at recovering dates from the OCR that are already present in the library records associated with the file.
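To make that failure mode concrete, here's a hypothetical sketch of what a naive automated date-extraction pass might look like - grab the first plausible four-digit year from the OCR'd front matter. (This is my guess at the general shape of such a heuristic, not Google's actual code.) It's exactly how a reproduced 1574 bookplate ends up as a publication date:

```python
import re

def naive_pub_year(ocr_text, min_year=1450, max_year=2009):
    """Hypothetical sketch of naive date extraction: return the first
    plausible four-digit year found in OCR'd front matter."""
    for match in re.finditer(r"\b(1[4-9]\d\d|20\d\d)\b", ocr_text):
        year = int(match.group())
        if min_year <= year <= max_year:
            return year
    return None

# A 1901 book reproducing a bookplate dated 1574 in its early pages:
front_matter = "Armorial bookplate, 1574. English book-plates, London, 1901."
print(naive_pub_year(front_matter))  # 1574 -- not the true 1901
```

Any year printed anywhere in the opening pages - a bookplate, an auction lot, a quoted title page - wins, which is why the errors are wild rather than merely off by a little.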
Ugh. I mean, the books in these libraries are incredibly valuable. But when you think about all of the time and labor spent documenting and preserving the cataloguing info over centuries, it's kind of astonishing that we're losing that in favor of clumsy OCR. Out of any company, Google should know that a well-optimized search technology is at least as important as the data it helps to sort.
Maybe they're just excessively cocky about their own tools. After all, the metadata problem isn't limited to browsing through Google Books. If you've ever tried to use an application like Zotero or EndNote to extract book and article metadata from Google Scholar, you find incomplete and mistaken information all over the place. You spend almost as much time checking your work and cleaning up as you would if you'd just entered the info manually in the first place.
And in the end, manual entry is what we want to avoid. I'd say half the value of digital text archives for scholars is that they can put their eyeballs on a document - the other half is that they can send little robots to look at thousands and thousands of them, in the form of code that depends not least on good metadata.
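For instance, one of the "little robots" Nunberg imagines - tracing how a word's use rises and falls by decade - is only as trustworthy as the publication dates in the metadata. A hypothetical sketch (the data here is invented):

```python
from collections import Counter

def hits_by_decade(matches):
    """Bucket corpus search hits by decade of publication.
    matches: iterable of (pub_year, hit_count) pairs."""
    decades = Counter()
    for year, hits in matches:
        decades[(year // 10) * 10] += hits
    return dict(sorted(decades.items()))

# Invented example data: (publication year, number of hits for a word).
corpus_hits = [(1914, 3), (1918, 7), (1942, 12), (1943, 9), (1899, 5)]
print(hits_by_decade(corpus_hits))
# {1890: 5, 1910: 10, 1940: 21} -- if that 1899 book is actually one of
# the misdated ones, the whole curve silently shifts
```

The code is trivial; the point is that every aggregate result inherits every metadata error underneath it.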
File under: Books, Writing & Such, Technosnark
August 24, 2009
Technologies Don't Transform, Societies Do, Pt. 2
As a follow up to my first linkpost on this topic, I'm adding an exhibit: Apple's celebrated "Knowledge Navigator" late-80s concept video. Watch it, then come back.
Here's the thing that's always struck me about this video. Technologically, it's wonderfully optimistic. (I love it when the professor flubs the name of the researcher he's looking for, and the computer figures out the right name, like a Google-search correcting spelling.)
But socially, it's incredibly conservative. Basically, it treats the computer interface as a synthesis of secretary, research assistant, and wife to the prototypically WASPy-dude professor. He doesn't even have to learn how to type! Imagine how short his Acknowledgements page will be! And his mom still nags him about his dad's birthday party! Oh, will life's problems never go away?
The assumptions are that 1) a breakthrough communication technology and 2) probably quite a bit of time passing won't produce any social changes at all. It won't create any new problems, either. It will simply make life easier.
We're actually usually pretty good at forecasting technological change. But we're astonishingly bad at predicting social responses to it. This is why most past attempts to predict the future strike us as unintentionally funny in retrospect: it's the mismatch between their creators' social imagination and our own -- or rather, between the constitutive blindnesses of their creators' social imagination and our own. We see and say things that they can't, and (often enough) vice versa.
Sideloading Now Seems So Simple
Nilay Patel, on the whole Apple/Google/AT&T/App Store-avaganza:
I don't think there's any good reason the most interesting things about the App Store right now should be procedural details and the number of submissions each reviewer handles a day -- somewhere around 80, if you can believe it. I'd rather be talking about new and exciting ways to integrate the iPhone and other mobile devices into my daily life -- I'd rather be talking about apps. And the more I think about it, the only way Apple can get back to that is by doing what it should have done in the first place: allowing developers and users to bypass the App Store and sideload apps onto the iPhone themselves.
Every single App Store submission story we've covered boils down to the fact that Apple is the single point of control for the iPhone ecosystem, and it's simply not fast or flexible enough to keep up with the rapid pace of innovation we're seeing on the platform. Like it or not, what's happening on the iPhone is leading the entire tech industry, and Apple should be doing everything in its power to enhance that, rather than miring itself in scandal and regulatory investigation. If that means releasing some control over the platform, then so be it -- especially since allowing sideloading would make almost all of these problems simply disappear.
See also #8.
Away We Go
See Winged Chariot press -- I think it's UK only for the moment.
Technologies Don't Transform. Societies Do.
Quick-hitting today, but here's an important axiom from Dan Visel at if:book --
the social use of digital media is more transformative than the move to the digital itself
Visel's responding to Eric Harvey's "The Social History of the MP3":
The first widespread music delivery technology to emanate from outside industry control, mp3s, flowing through peer-to-peer networks and other pathways hidden in plain sight, have performed the radical task of separating music from the music industry for the first time in a century. They have facilitated the rise of an enormous pirate infrastructure; ideologically separate from the established one, but feeding off its products, multiplying and distributing them freely, without following the century-old rules of capitalist exchange. Capitalism hasn't gone away, of course, but mp3s have severely threatened its habits and rituals within music culture. There is nothing inherent or natural about paying for music, and the circulation of mp3s through unsanctioned networks reaffirms music as a social process driven by passion, not market logic or copyright. Yet at the same time the Internet largely freed music from its packaged-good status and opened a realm of free-exchange, it also rendered those exciting new rituals very trackable. In the same way that Facebook visually represents "having friends," the mp3s coursing through file-sharing networks quantify the online social life of music by charting its path.
P.S.: This observation from Harvey's essay is a great coda to my "How the iPod Changed the Way We Read" --
This might be the most profound social shift of the mp3 era: hoarding and sharing music changed from an activity for eccentrics to the default mode of musical enjoyment for millions.
The Twitter Valve
Henry Jenkins has a solid post on the value and limitations of Twitter. It has two parts, descriptive and normative. Here's the descriptive part:
Someone recently asked me, "If McLuhan is right and the medium is the message, what is the message of Twitter?" My response: "Here It Is and Here I Am."
And the normative:
My first impressions were correct that Twitter is no substitute for Blogs or Live Journal. And in so far as people are using it to take on functions once played on blogs, there is a serious loss to digital culture.
I think you can find a lot to talk about in the descriptive part of Jenkins's account even if you quibble with the normative part. But there are also descriptive claims contained in the normative account. I want to look at just one slice of this:
Three years ago, when I started this blog, if people wanted to direct attention to one of my blog posts, they would write about it in their blog and often feel compelled to spell out more fully why they found it a valuable resource. I got a deeper insight into their thinking and often the posts would spark larger debate. As the function of link sharing has moved into Twitter, much of this additional commentary has dropped off. Most often, the retweets simply condense and pass along my original Tweet. At best, I get a few additional words on the level of "Awesome" or "Inspiring" or "Interesting." So, in so far as Twitter replaces blogs, we are impoverishing the discourse which occurs on line.
In other words, Twitter acts as a kind of valve, where the energy that would go into 1) writing extended comments and 2) starting a blog of your own gets siphoned off into minimalist links.
I'll hold off on explaining what I think about this -- I'm still formulating it -- but I want to note that you could apply this logic to a lot of other kinds of contemporary web discourse, from Facebook "Likes" to Diggs -- maybe even things like Instapaper.
There is clearly demand for a minimalist approach to reading and commenting. We like the option of doing "less" and doing it later. Why is this? And what does it change about the way we communicate ideas online?
August 20, 2009
How The iPod Changed The Way We Read
Since I slid this claim in at the end of a long post with a lot of literary theory, you might have missed it:
When the media landscape changes, we actually begin to SEE things differently, even (or ESPECIALLY) things that haven't changed at all.
This is the reason why the iPod didn't just change the way we listen to music - and later, look at pictures or movies or play video games. It changed the way we read.
And (because I couldn't help my ever-qualifying self):
(As did movies, television, video games, and many, many other things.)
(The big one I left out of this list was mobile phones, but since the iPod and the smartphone wound up being convergent/complementary technologies, I think they're arguably part of the same story.)
Let me try to spell out point by point how I think the iPod - or more precisely, the evolution of the iPod - changed reading.
- Design Matters. The iPod elevated the level of aesthetic pleasure people expected from handheld devices, as well as the premium they were willing to pay for well-made things. Looking back at the first-generation Kindle, it's actually astonishing how much of the early commentary focused on the perceived ugliness of the device. In particular, the first Kindle didn't just look ugly - it looked out of date. This was something we used to care about with home theater equipment and kitchen appliances - the iPod taught us to care about it on our handhelds, even when we were walking around with cheap plastic phones. If the e-reader breakthrough had happened in 1999 or 2002, even if the device had been similarly awkward-looking relative to the technology around it, I don't think this would have been as much of a problem as it became.
- Software Matters. I almost titled this "Design Goes All The Way Down." It's a truism now that Apple was able to swoop in on the digital music market because they wrote better software than the Sonys and Samsungs they were competing with on the high end. But it's true. You're not just creating a piece of hardware; you're creating an interface for an experience. And in particular, if you get the experience of buying, sorting, finding, and selecting media wrong, you've got real problems. You have to make the software intuitive, powerful, and fun. The goal is to reduce the friction between a user's intent and its fulfillment - whether it's buying music, listening to it, or flipping through album art. If there's friction anywhere in the experience, it had better be deeply pleasurable friction. (That's right, I said it.)
The Kindle actually seems to understand this really, really well.
- This is more specific: People Like Full Color. Was anyone complaining about the monochrome taupe-and-dark-taupe display of the first iPod? No. Was I when I bought my first iPod, in 2004? Not at all. Did I cry inside when they launched the first color-display, video-capable iPod about a month afterwards? Not exactly. I cried on the outside, too. Color is resource-intensive, and hard to get right on a small screen. But god - it's beautiful. It's also one of the things that easily gets lost in the transition from print to digital; there's nothing like a book with full-color prints, and the only thing sadder than an image-heavy book that's all in black-and-white is a digital version of the same book that doesn't have images at all.
- Images Make Reading Easier. I mean, this is one of the big lessons of the graphical interface on the desktop, right? Column after column of text is hard to look at, and it's hard to distinguish one version from the next. Seriously - sorting through an early iPod, like my third-gen one, is one of the most intense reading experiences you're likely to have, and I think it (along with text messages) totally softened people up for reading strings of text on small screens. But text with icons - even generic icons that just look like little pieces of paper next to the text that identifies them - reinforces the idea that you're dealing with distinct objects. Add covers - like book or album covers, or preview images of pictures - and you've got a hieroglyphic hybrid mode of reading that is frankly more powerful and intuitive than text or images alone. Create a software interface where you can manipulate those objects, and you've got something that's genuinely game-changing.
- Media Devices Should Do More Than One Thing. It's great that I can take my music with me, but I'd really like to listen to radio programs, too. (Podcasts.) I carry around all of these pictures in my wallet - maybe you could...? (Done.) What about TV? I like TV. And my kids like to watch movies in the car. (We can do that.)
Was it obvious that there was a hidden affinity between pictures and music and movies? No. But once you've got a screen with a big hard drive, a great syncing tool, and a solid store that can deal with media companies... You follow the logic of what you can meaningfully offer and what your customers can use the device to do.
The only thing more appealing for multiple media than a tiny screen with a big hard drive is a great big screen with a big hard drive. I can't believe that future reading devices won't take advantage of it.
- Make It Easy For Me To Get My Own Stuff On The Screen. Can you imagine if Apple had ONLY let you put stuff on your iPod that you'd bought or ripped through iTunes? The iPod moment benefited tremendously from the Napster moment, which in turn was driven by the CD-ripping and cheap fast internet moment. You had all of this digital material sitting on people's hard drives and floating around networks, and we just needed someplace to put it. There's no stuff we want more than our own stuff. Apple smartly opened itself up to it. Well, likewise, now, we've got decades of office documents sitting on people's hard drives and hypertext pages floating around networks, and nowhere but our computers to put them.
I'll say it again: There's No Stuff We Want More Than Our Own Stuff. If Amazon, or Google, or anybody, could find a way for me to get MY print library on a portable screen, I would both love and pay them dearly for the chance to do so.
- Devices Should Talk To Each Other. My DVD player is an idiot. It has nothing to say to anyone except maybe my TV and some speakers. Now, I just leave it in a drawer. My TV is a little better, because it listens really well, but not by much. From the beginning, the iPod could both talk and listen to your computer. Now, because of its wireless connection, the iPhone can talk to almost anything.
The Kindle's networking ability, still limited as it is, stands on the shoulders of those devices. (And your computer, too, does a much better job of talking to small, post-PC devices than it used to, from video game consoles to mobile phones.)
- This last point is from Gavin Craig, and it applies to the iPod and the Kindle, but it's also more general: "It should be possible to make the device useful in ways that the designer may not have intended." I call this half-jokingly "Media Existentialism." (Existence precedes essence; we come to terms with our determined place in the universe, and only afterwards do we define who we are and what we're for.)
The point is that users, not designers, ultimately determine what an object is for; and any attempt to pre-engineer that process in a closed-ended way restricts value rather than creating it.
This is a short list of the expectations we have for reading machines now that we largely didn't have a decade ago. None of them came from devices that were designed (except largely accidentally) to read anything.
But this list only barely begins to speak to the expectations we'll have for an electronic reader decades from now.
What might those expectations be? Where will they come from? How might they change everything else?
File under: Books, Writing & Such, Design, Object Culture, Technosnark
August 17, 2009
Snark By Snarkwest: Kindle 2020
The iPod wasn't the first digital music player, but that doesn't really matter; when it was introduced in 2001, it was the first digital music player that made ordinary tech-inclined (but not necessarily tech-savvy) consumers pay attention.
I graduated from college that year, so I remember that time very well. Let's review: Napster had been shut down. I didn't own a DVD player. In fact, I didn't even have my own computer. (I bought both that December.) I didn't have a cellular phone, but some of my friends did. (In fact, I didn't get one until 2005.) I had never used wireless internet, ever. I had bought an APS camera two years before, on study abroad. (Digital cameras cost about a kajillion dollars.) Instead of writing a blog, I kept email lists of everyone I knew and periodically quasi-spammed them with prose poems, Nietzsche quotes, outlines for essays on Bulworth ("The key to understanding Bulworth is that it's not very good"), and news about my life. Oh, and I used telnet for email.
The time hardly seemed propitious to launch a device that would effectively break wide open handheld digital media. But that's what happened.
It's worth remembering this, because we've now had eight years of the iPod, iTunes, and the Apple Store, during which we've had to clear all of these technical and commercial and psychological and social hurdles to get to the devices that most of us carry around (in one version or another) every day.
What does this year's model of the iPhone (already almost three years removed from the announcement of the first version) have in common with the first iPod? It fits in your pocket; and maybe - maybe - you still put stuff on it from your computer - to update the firmware, if nothing else.
That's eight years of the iPod. I'm glad I saw it, because 21-year-old me wouldn't have believed it. All the more so because none of what happened is in retrospect at all ridiculous.
Now let's imagine twelve years of the Kindle.
The Kindle in 2020 might not even be the Kindle anymore. Maybe Sony or Apple or Google or Microsoft or someone we don't even expect might shoulder Amazon aside and take center stage, or readers will be more like the smartphone market right now, with a handful of solid competitors egging each other on.
But the Kindle now, like the iPod eight years ago, is the first electronic reader that most of us tech-inclined but not tech-savvy users have paid much attention to. It's already gotten better, it's already spurred competition, and the chances are good that we're going to see some significant advances in these devices before the end of the year.
In twelve years, we know electronic readers will do more, store more, work faster, look cooler, and offer more things to look at than they do now.
But what don't we know about the Kindle 2020 yet?
Robin, Matt, and I - yes, all three of us - have proposed a presentation for South by Southwest Interactive where we -- and some other supremely smart people -- are going to try to figure out just that.
Here are some basic questions:
- What kind of devices will we use to read?
- What formats will be used to deliver documents?
- What kinds of documents will be "read" - text, image, video, audio, hybrids?
- How will documents be written and produced?
- How will documents be bought, sold, and otherwise supported?
- How will contributors be compensated?
- How will reading work in different industries?
And here, I think, are - for me, at least - some more interesting ones:
- What could turn an electronic reader into a totally NECESSARY device - like a mobile phone, or iPod?
- What features will the reader of 2020 have that nobody's even talking about yet?
- What are we going to use it to do that nobody uses anything to do now?
- What's going to be your favorite thing to read on it?
- Forget your favorite thing - what are you going to use it to do over and over again, whether you like it or not?
- How are you going to write with it?
- Who's going to have one? How are they going to pay for it?
- How do we share what we read?
- What will we still want but not get?
- Here's the big one: how might it change the entire FIELD of media consumption, handheld devices, computing, reading, etc.? Will everything restructure itself around the reader? Or will it be a fun, handy curiosity, plotting its own logic while everything else goes along unchanged?
Beginning today, you can vote to help get this panel accepted to South By Southwest. I am way excited. First, I am a nerd for all things related to the written word. Second, Robin and Matt are the most talented futuronomists I know.
Finally, in addition to being awesome, Austin is (oddly enough) geographically centered for the three of us. If you look at our locations (Philadelphia, Minneapolis, San Francisco), Austin is the country's fourth column, which (I think) bestows it with Penumbra-like magical powers.
Between books, papers, and screens, I think we might just have this covered. But see, this is where we start to worry about our own blind spots or idiosyncratic enthusiasms, not because we want to lose them, but because we need to put them in context.
So we don't just need your vote. We need to know what you know. And we're willing to use the patented Snarkmarket figure-four leg-lock -- by which I mean, your comments in the thread below -- to get the conversation started.
They say that hindsight is 20/20, but that's not really true; some people remember the past better than others. The future, however, really is 20/20 (especially in 2020). Right now, we all know just as much about the future of reading as everyone else.
The only difference is that we -- you and I -- are focused.
What do you see?
File under: Books, Writing & Such, Collaborations, Object Culture, Technosnark
August 14, 2009
The Real Google Documents
Here's an idea for a great Google web application - an online archive where you can tag, sort, and store all of your used-to-be-paper documents, i.e., PDFs - and share the same documents with other people, or even everybody.
I use many, many applications that perform a similar service with the PDFs on my hard drive: Yep!, Papers, Zotero, Scrivener, Evernote. And I use Dropbox to back up and sync my PDFs between machines. I also use Scribd to read PDFs and share them with the world. But Google could easily offer a service that does everything these applications do and more. They're already offering a web-reader for PDFs. What they need is something that actually lets you USE them.
Here's how I imagine this goes. Let's say someone emails you a PDF to your Gmail account, or appends a PDF to a feed you read in Google Reader. Instead of downloading it onto your computer (or, egads, a public machine), you have the opportunity to load it into Docs. Just like that, it's in your archive. You can also have Google Desktop scan for and index your PDFs and auto-load them into your archive, too.
Once you import it, you don't have to do anything else. It'll either pull the text -- or if there's no text layer, it'll OCR the document FOR you. You can auto-tag it or add your own tags to help you sort your docs together. It can also pull metadata, like Zotero. And you can create smart collections that link PDFs with text documents, emails, and stuff from Google Books, Scholar, even Maps or Groups.
You can also customize levels of privacy and security. Some files you might want to have public, like on Scribd. Maybe you'll even create RSS channels so folks can receive your new images/PDFs/ebooks/XML documents automatically. Others you want to share with specified users, like Dropbox or Groups. Still others (tax and employment info, etc.), you'll encrypt with extra passwords.
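To make this concrete, here's a toy, stdlib-only Python model of the archive I'm imagining - tagging, full-text search, and per-document privacy levels. All the names are mine, and this is obviously a cartoon of what Google would actually build:

```python
from dataclasses import dataclass, field

@dataclass
class Doc:
    name: str
    text: str                       # extracted (or OCR'd) text layer
    tags: set = field(default_factory=set)
    privacy: str = "private"        # "public", "shared", or "private"

class PdfArchive:
    """Toy model of the archive: tag, search, and share documents."""

    def __init__(self):
        self.docs = {}

    def add(self, name, text, tags=(), privacy="private"):
        self.docs[name] = Doc(name, text, set(tags), privacy)

    def tag(self, name, *tags):
        self.docs[name].tags.update(tags)

    def search(self, query):
        # Full-text substring match plus exact tag match, over everything you own.
        q = query.lower()
        return sorted(n for n, d in self.docs.items()
                      if q in d.text.lower() or q in {t.lower() for t in d.tags})

    def public_feed(self):
        # Only "public" docs are exposed, like a Scribd page or an RSS channel.
        return sorted(n for n, d in self.docs.items() if d.privacy == "public")
```

The real version would do OCR, metadata extraction, and encryption on top of this skeleton, but the core is just a tagged, searchable document store with a privacy bit per document.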
In fact, this is awfully close to the vision two enterprising chaps passed off years ago of the Google Grid.
Seriously: Google says it wants to index the world's information. Well, let me tell you - I'm chock full of information that I don't know what to do with. Why can't it start by taking some of mine - and giving me some tools so that I can do things with it as payment?
File under: Books, Writing & Such, Technosnark
August 6, 2009
Rhonda, Rhonda, Rhonda!
Wow. Tonight I got a chance to try Rhonda, a crazy drawing application that's somehow both 2D and 3D at the same time. It's like SketchUp for actual, uh, sketching.
So, this is just a video of me using Rhonda, played at 4X speed, which is to say, it might be really boring to watch, so feel free to skip it. All I know is, if some blogger I subscribe to tried out Rhonda, I would want him to post a video:
File under: Design, Gleeful Miscellany, Technosnark
August 5, 2009
A Fine Vintage In the Kitchen
I'm a sucker for this kind of stuff. Regina Schrambling praises vintage stoves:
So many other essentials in life are clearly improved in their latest incarnation: Phones are smaller and portable; stereos are downsized to ear buds; cars are safer and run on less fuel. But stoves are a basic that should stick to the basics: The fewer bells and whistles, the less need for bell-and-whistle repairmen. Motherboard is not a word that should ever be associated with the kitchen—put computer technology in a stove, and you're asking for a crash. Google "I hate my Viking" these days, and you get a sense of how many things can go wrong with techno-overload. Some of these ranges combine electric and gas elements, which is a recipe for trouble, as is microwave or convection capability. This kind of overdesign is what killed combination tuner/turntables—one goes, and the other dies from neglect.
I get kind of excited about things like self-updating blenders and coffee makers that I can control from my Blackberry, but there's also, sometimes, something to be said for saying, "You know, I think we've kind of figured this out. Maybe we'll work the kinks out on what's next in another few decades, but until then, let me have my dumb appliance."
This sort of dovetails with Michael Pollan's essay about Julia Child and food TV -- there's something about the convergence of cooking with electronics that transformed it into entertainment, that elevated it into something harder than most people could or would do at home, that left us with celebrity chefs and high-powered gadgets and a vastly reduced proportion of us actually cooking anything on them.
Which in turn makes it harder for technology to help us - we'd have to actually KNOW what we were doing to actually make a better (as opposed to shinier, or more convenient) device.
July 31, 2009
Constellations of Intelligence
Matthew Battles takes on academics and corporate types horning in on social media. This paragraph reminded me a bit of the sensibility underpinning New Liberal Arts:
In a thriving networked culture, it should be possible not merely to complement but to replace institutions and corporations with commons-native constellations of intelligence. The mainstream media quakes before the ever-multiplying range of news-gathering alternatives. In the intellectual world, the Infinite Summer—a massively distributed endeavor to collectively read and discuss the late novelist David Foster Wallace’s magnum opus Infinite Jest—is proving the power of social media to build loosely-structured networks of brains to replace the medieval legacy of colleges, faculties, and curricula.
I think some blending of the academy and the social web is inevitable, but it's a genuine dilemma which one will ultimately remake the other after its own matrix. I would bet on the web, and here's why.
For one thing, it's not a head-to-head but a three-way competition. The base of the university is still probably wash after wash of traditional intellectual culture - medievalism, humanism, the Enlightenment. But that's been increasingly uprooted, first by state and then by corporate bureaucracies. The ethos of digital culture is actually more sympathetic to traditional humanism than to the corporate office suite. But the technological and economic possibilities of digital culture can also peel away the more futurist-minded on the capitalist side.
The real clincher, though, is writing. If writers and students and researchers and administrators at universities begin to port their assumptions about how all of these things work into the classroom and the academic conference, then it'll be a relentless wave. Within a generation, nothing will look the same. (Nothing will be wiped out, either - universities, as the archives of the world, retain everything, like the unconscious.)
By the way, if I haven't said it already, Battles's and Josh Glenn's Hilobrow is 100% required reading. I think it's the best new blog of 2009.
File under: Learnin', New Liberal Arts, Technosnark
July 30, 2009
Hey! I See You Copying That
Over at Nieman Journalism Lab, Zachary Seward explains Tracer, a utility with two functions, one terrible and the other cool:
- Terrible: It inserts extra stuff into your copied-and-pasted text. So for instance, if Snarkmarket was running Tracer, and you copied this line, when you pasted it, it would also say: "Come check out the original post at Snarkmarket!" along with a link. T-E-R-R-I-B-L-E.
- Cool: Forget the copy-paste hijacking and focus on the analytics you could get from this thing. Seward writes: "But I'm much more impressed by Tracer's backend, which allows publishers to see which pages—and, even better, which parts of those pages—are most frequently copied."
Don't miss the graphics on the Nieman Journalism Lab post.
This connects back to some of the ideas in my post about tethered books—and has some of the same creepy/cool combo, too. But, on balance, I think more granular information about how people read and use text is really exciting—simply because it could help you make your text so much better.
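The analytics side is easy to picture. Here's a minimal sketch (names hypothetical, not Tracer's actual API) of how a Tracer-like backend could boil copy events down to a "most copied passages" report:

```python
from collections import Counter

def most_copied(events, top=3):
    """Aggregate copy events into a 'most copied passages' report.

    `events` is an iterable of (page_url, copied_text) pairs, as a
    Tracer-like service might log them whenever a reader copies text.
    """
    by_passage = Counter((page, text) for page, text in events)
    return by_passage.most_common(top)

# Example: two readers copied the same line from the same post.
events = [
    ("post-1", "the future is digital"),
    ("post-1", "the future is digital"),
    ("post-2", "hello"),
]
report = most_copied(events, top=1)
```

That's the whole "backend," conceptually: a counter keyed on (page, passage). Everything else - the graphs, the dashboards - is presentation.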
July 29, 2009
Mr. Penumbra Would Like This
Each week postliteracy.org presents visitors with a single image, which will often have multiple layers of meaning in its visual content. Embedded within that image, though, is textual content hidden through steganography. The audience must decode the hidden text [...] in order to "read" the entire message.
And this sounds pretty new liberal artsy, doesn't it:
Thus, each post at postliteracy.org requires polymodal literacy—here, visual, interactive, computational, and textual literacies—to decode its full meaning.
Helpfully, they link directly to the tools required to find the hidden messages.
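For the curious: one common hiding scheme - least-significant-bit steganography, which may or may not be the one postliteracy.org uses - fits in a few lines of Python. Each bit of the hidden message replaces the lowest bit of a carrier byte (say, an image's raw pixel data), preceded by a 4-byte length header:

```python
def hide(carrier: bytearray, message: bytes) -> bytearray:
    """Hide `message` in the least-significant bits of `carrier` bytes."""
    bits = []
    # Prefix the message with its length so the decoder knows when to stop.
    for byte in len(message).to_bytes(4, "big") + message:
        bits.extend((byte >> i) & 1 for i in range(7, -1, -1))
    assert len(bits) <= len(carrier), "carrier too small"
    out = bytearray(carrier)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & 0xFE) | bit   # overwrite only the lowest bit
    return out

def reveal(carrier: bytearray) -> bytes:
    """Recover a message hidden by `hide`."""
    bits = [b & 1 for b in carrier]
    def read(n_bytes, offset):
        out = bytearray()
        for j in range(n_bytes):
            byte = 0
            for bit in bits[offset + 8 * j : offset + 8 * j + 8]:
                byte = (byte << 1) | bit
            out.append(byte)
        return bytes(out)
    length = int.from_bytes(read(4, 0), "big")
    return read(length, 32)
```

Because only the low bit of each byte changes, the carrier looks essentially unchanged to the eye - which is the whole trick.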
City of Inference
So, this research team at U of Washington totally out-awesomed PhotoSynth by building amazing point-cloud 3D models of monuments and cities from Flickr photos.
July 27, 2009
Fun Work Could Mean Free Work
Snarkmarket (along with others) has been talking recently about the economic model implicit in the free release of New Liberal Arts and the deliberately limited revenue realized from its sale. As one of the authors of that book, I was conscious going into the project that I wouldn't be paid for my contribution, no matter how successful or influential the book might become—and with the release of Chris Anderson's book "Free: The Future of a Radical Price," this seems like a good time to discuss working for free.
Virginia Postrel's review of "Free" in the New York Times ends with the following paragraphs:
"No man but a blockhead ever wrote except for money," Samuel Johnson said, and that attitude has had a good two-century run. But the Web is full of blockheads, whether they're rate-busting amateurs or professionals trawling for speaking gigs. All this free stuff raises the real standard of living, by making it ever easier for people to find entertainment, information and communication that pleases them.
Business strategy, however, seeks not only to create but to capture value. Free is about a phenomenon in which almost all the new value goes to consumers, not producers. It is false to assume that no price means no value. But it is equally false to argue that value implies profitability.
This is true as far as it goes, but I think it's more interesting as a starting point than an ending point. In particular, I feel like it misses the non-monetary value that work produces for those who do it.
Most of us, if we're fortunate, derive some form of value from the work we do, above and beyond the pay we receive. We enjoy working, or we enjoy the status that results from doing a certain kind of work—being widely recognized as a scholarly authority or having our ideas praised by people we respect and admire. To the extent that this intrinsic value is higher than the monetary value we could receive for doing something else, we will happily work for less or work for free, because the non-economic rewards are so significant.
Now, in the previous economic paradigm, it was possible to do work that you would have done for less or for free and still be paid well for it, because it was too much trouble for your employers/clients to find someone who could do the work as well and for free. But the internet drastically reduces that barrier. Imagine trying to find people to write a computer operating system and all the associated applications without expecting payment before the internet—now look at Linux.
I wonder if we're heading toward an economy where, to put it bluntly, people don't get paid for doing fun things. If something is fun—for someone in the world who finds it fun enough to become good at it, and to do it without expecting pay—it will no longer pay.
In this world, people still work for money, maybe 20 hours a week, but they don't really derive happiness from their jobs (if their job was something that people enjoyed doing, like playing in a symphony or writing poetry, it wouldn't pay—someone would be doing it for free*). They spend the rest of their time doing things for free, things that produce tremendous creative value for themselves and for others, but form a gift economy outside the normal capitalist economy.
I think most creative, intellectual, and information-oriented pursuits would end up on the free side of that divide—which is not to devalue them at all. Rather, I think that clarity about the kinds of rewards you could expect from each activity could lessen a lot of the anxiety about "how will I make a living as a writer, journalist, playwright, composer?" Maybe you won't—and that's okay.
*When I say "free," I don't necessarily mean $0.00. You might still earn some token payments for your creative effort, but not enough to contribute in a meaningful way to your income—a few hundred dollars a year, perhaps.
File under: Media Galaxy, Society/Culture, Technosnark
July 24, 2009
Towards A Theory of Secondary Literacy
There's a great scene in Star Trek IV - yes, the one where the crew travels back in time to save whales - where Scotty, the engineer, tries to control a Macintosh by talking to it. When McCoy hands him the mouse, he speaks into it, in a sweetly coaxing voice: "Hello, computer!" When he's told to use the keyboard ("How quaint!"), he irritably cracks his knuckles -- and hunts-and-pecks at Warp 1 to pull up the specs for "transparent aluminum."
As recently as 2000, it seemed inevitable that any minute now, we were going to be able to turn in our quaint keyboards and start controlling computers with our voice. Our computers were going to become just like our telephones, or even better, like our secretaries. But while voice and speech recognition and commands have gotten a lot better, generally the trend has been in the other direction - instead of talking to our computers, we're typing on our phones.
(Which is arguably the hidden message of Scotty and the Mac - even somebody with the most powerful voice-controlled computer in the galaxy can touch-type like a champ. He probably only talks to the computer so his hands are free to text his friends while he's engineering! "brb - needed on away team" -- "anyone know how to recrystallize dilithium" -- That's why he's so inventive! He's crowdsourcing!)
The return to speech, in all of its immediacy, after centuries of the technological dominance of writing, seemed inevitable. The phonograph, film, radio, and television all seemed to point towards a future dominated by communications technology where writing and reading played an increasingly diminished role. I think the most important development, though, was probably the telephone. Ordinary speech, conversation, in real-time, where space itself appeared to vanish. It created a paradigm not just for media theorists and imaginative futurists but for ordinary people to imagine tomorrow.
This was Marshall McLuhan's "global village" - a media and politics where the limitations of speech across place and time were virtually eliminated. Walter Ong called it "secondary orality" - we were seeing a return to a culture dominated by oral communication that wasn't QUITE like the primary orality of nonliterate cultures - it was mediated by writing, by print, and by the technologies and media of the new orality themselves.
“When I first used the term ‘secondary orality,’ I was thinking of the kind of orality you get on radio and television, where oral performance produces effects somewhat like those of ‘primary orality,’ the orality using the unprocessed human voice, particularly in addressing groups, but where the creation of orality is of a new sort. Orality here is produced by technology. Radio and television are ‘secondary’ in the sense that they are technologically powered, demanding the use of writing and other technologies in designing and manufacturing the machines which reproduce voice. They are thus unlike primary orality, which uses no tools or technology at all. Radio and television provide technologized orality. This is what I originally referred to by the term ‘secondary orality.’
I have also heard the term ‘secondary orality’ lately applied by some to other sorts of electronic verbalization which are really not oral at all—to the Internet and similar computerized creations for text. There is a reason for this usage of the term. In nontechnologized oral interchange, as we have noted earlier, there is no perceptible interval between the utterance of the speaker and the hearer’s reception of what is uttered. Oral communication is all immediate, in the present. Writing, chirographic or typed, on the other hand, comes out of the past. Even if you write a memo to yourself, when you refer to it, it’s a memo which you wrote a few minutes ago, or maybe two weeks ago. But on a computer network, the recipient can receive what is communicated with no such interval. Although it is not exactly the same as oral communication, the network message from one person to another or others is very rapid and can in effect be in the present. Computerized communication can thus suggest the immediate experience of direct sound. I believe that is why computerized verbalization has been assimilated to secondary ‘orality,’ even when it comes not in oral-aural format but through the eye, and thus is not directly oral at all. Here textualized verbal exchange registers psychologically as having the temporal immediacy of oral exchange. To handle [page break] such technologizing of the textualized word, I have tried occasionally to introduce the term ‘secondary literacy.’ We are not considering here the production of sounded words on the computer, which of course are even more readily assimilated to ‘secondary orality’” (80-81).
This is where most of the futurists got it wrong - the impact of radio, television, and the telephone weren't going to be solely or even primarily on more and more speech, but, for technical or cultural or who-knows-exactly-what reasons, on writing! We didn't give up writing - we put it in our pockets, took it outside, blended it with sound, pictures, and video, and sent it over radio waves so we could "talk" to our friends in real-time. And we used those same radio waves to download books and newspapers and everything else to our screens so we would have something to talk about.
This is the thing about literacy today that needs above all not to be misunderstood. Both the people who say that reading and writing have declined and those who say they're stronger than ever are right, and wrong. It's not a return to the word, unchanged. It's a literacy transformed by the existence of the electronic media it initially seemed to have nothing in common with. It's also transformed by all the textual forms - mail, the newspaper, the book, the bulletin board, etc. It's not purely one thing or another.
This reminds me of one of my favorite Jacques Derrida quotes, from his essay "The Book to Come":
What we are dealing with are never replacements that put an end to what they replace but rather, if I might use this word today, restructurations in which the oldest form survives, and even survives endlessly, coexisting with the new form and even coming to terms with the new economy -- which is also a calculation in terms of the market as well as in terms of storage, capital, and reserves.
I doubt that "secondary literacy" will catch on, because it sounds like something that middle school English teachers do. But that's too bad - because it's actually a pretty good term to describe the world we live in.
File under: Books, Writing & Such, Language, Media Galaxy, Object Culture, Technosnark
July 18, 2009
The Post-Orwellian Future of Connected Books and Everything Else
This is the post where I tell you I don't really mind that Amazon yanked "1984" from all those Kindles.
The backstory: Unauthorized editions of "Animal Farm" and "1984" were available, briefly, for sale in the Kindle store. At some point Amazon discovered this and removed them from the store, and also—this is the important part—from people's Kindles. The NYT quotes an Amazon rep: "When we were notified of this by the rights holder, we removed the illegal copies from our systems and from customers' devices, and refunded customers."
The poetry of the fact that this happened with "1984" is irresistible. And, to be clear, I agree with Jason Kottke when he says: This stinks like old cheese! It's obviously creepy in a lot of ways—and Amazon, for its part, has conceded that it was a bad decision.
But, here's the thought experiment that occurred to me: Imagine that this story didn't seem creepy. Or at least, didn't seem particularly noteworthy.
For that to be true, what kind of world would we have to be living in?
I think it looks something like this:
Nothing is sold as a static, flash-frozen object anymore. Instead, you buy things with the assumption they'll get better and better over time. In fact, one of the ways you weigh competing brands is by asking: Who has a better track record of upgrades?
- Each iPhone OS upgrade is basically like getting a new phone.
- Every month, your Prius downloads new fuel-management software, and its mileage steadily improves. (And there's an ongoing, Netflix-style competition to improve that software.)
- One day, your Oster blender beeps, because it now has a new blend mode. Puree 2.0!
Now, if that's true for objects, you know it's true for media. You don't buy tracks or albums, shows or movies anymore. It's all included in subscriptions to big libraries that are always growing.
There are many big, competing subscription services, and like the phone carriers, each is notorious for a different level of coverage and service. Apple has the widest coverage, but it also faces the sharpest legal challenges. One week, all the Bollywood movies will be blacked out on iTunes; the next week, after the dispute is settled, they'll be back. It's annoying in the same way that only getting one bar of reception in your neighborhood is annoying, and we've come to live with it.
There are lots of smaller sub services, too, most with some specific selling point: a deep jazz library, say, or the complete collection of 80s cartoons. Most people subscribe to many.
Generally, the pattern goes:
1. Some new service springs to life in a blaze of publicity.
2. People rush to join.
3. They enjoy it for several years.
4. The media starts to seem lo-rez, or it's not compatible with the newest devices, or some contract runs out.
5. It shuts down.
But by the time 5 happens, there's a new 1 somewhere else. The migration from sub service to sub service is a hassle, but at least it's easier than switching insurance companies.
You'd better believe that repressive regimes are paying attention to who's watching what on the sub services in their jurisdictions. The media that doesn't live comfortably in this world is, therefore, the controversial and the political; too often, the tether feels like a trip-wire. So there are times and places when you want to truly download something—want to save a local, static, disconnected copy—and it tends to feel a bit cloak-and-dagger when you do.
(Several movie studios have been called out for trying to distribute their movies on these nonsub networks in order to create buzz—"playing at moral seriousness," one critic said.)
In this world, Googlezon's sub service for books is completely awesome.
For $4.99 a month (how can it be that low??) you get full access to all books ever printed, period. And even better: Because readers are always connected, whether it's a browser, a special app, or a device, each one of these books is surrounded with metadata about how people read them. There's a graph on every Googlezon book page showing how far people got before losing interest; it's a much more revealing review than the star rating.
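That graph is trivial to compute once the page turns are flowing in. A sketch, assuming the service logs each reader's furthest page reached (a simplification of the real page-turn stream):

```python
def read_through_curve(furthest_pages, total_pages):
    """Fraction of readers who reached each page, computed from per-reader
    'furthest page turned' records - the graph on the book's page."""
    n = len(furthest_pages)
    return [sum(1 for p in furthest_pages if p >= page) / n
            for page in range(1, total_pages + 1)]

# Four readers of a 10-page book: two finished, one quit at page 3,
# one quit at page 7.
curve = read_through_curve([10, 3, 10, 7], total_pages=10)
```

The dips in the curve are exactly the chapters where readers bail out - which is why, in this world, every writer would be staring at it.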
Because books are all downloaded (or re-downloaded) at the time of reading, you're always looking at the very latest version. This capability creates a new expectation, and writing non-fiction is suddenly a lot more like blogging, or shepherding a Wikipedia page: your book always needs attention. It's a lot more work, actually, and you still don't make very much money.
You'd think fiction wouldn't be as deeply affected. You'd be wrong. The hot new literary form is the "living novel," constantly being re-written in real-time. This is exciting in a lot of ways; it's also frustrating. You read a section that moves you, and you want to share it with a friend—but by the time she gets to it, it's gone, replaced by some weird passage about the history of beekeeping.
And when you open your reader, you see the same thing. The section you liked has vanished. Beekeeping. Damn it.
Actually, yeah, it's really frustrating.
But it's hard to stop. Writers, especially young writers who grew up with the web, love the ability to revisit and re-edit text. It feels natural. The argument goes: "Why wouldn't I make it better? What's with this fetishization of the 'final draft'? If you want a static version so badly—print one out. But don't tell me to stop editing."
Remember that graph on every Googlezon book page that shows how far people got? In this world, every writer is addicted to that graph. "Okay, it dips at chapter three... I can tighten that up. I can keep them going." There are cautionary tales, here—writers who get "lost in the loop" and never publish a new novel because they're too busy optimizing the old ones—but there are also new books more widely-read than any in the last 50 years. The tether is a powerful tool not just for commerce, but for creativity.
And yes: The tether also means Googlezon can yank books from the shelves, and therefore from your life, at any time. There are, of course, sneaky ways to copy and save them, but there's not a huge market for the copies, simply because it's so easy to get them the legit way.
So last week, in this world, rogue editions of "Animal Farm" and "1984" were remotely deleted from a variety of reading apps and devices. It was annoying—especially for the people in the middle of reading them—but really, no more annoying than a dropped call or a momentary power outage. People routed around the damage; they found other editions and resumed reading.
And the record of that reading—page turn by page turn—flowed up through the air and into the network. It curled through a monitoring hub in Beijing, and one in Fort Meade. It glimmered across a dashboard on the desk of an assistant book editor in New York. And it found its way, finally, to Googlezon's library—a library no longer made up only, or even mostly, of books, but now, somehow, of reading itself.
How do you feel about this world?
File under: Books, Writing & Such, Media Galaxy, Technosnark
July 15, 2009
Anthony Grafton and Digital Humanism
I think, for folks interested in what's happening with digital books, at this point it's foundational to read Anthony Grafton's 2007 New Yorker essay on book digitization. Grafton is a historian of Renaissance humanism and early print culture; he writes with a great deal of sympathy even as he criticizes a lot of the ramshackle moves that have been made in getting print books up on the web.
It's a weird thing - I think I can say that the age of digital humanism we're in shows the same enthusiasm as the Renaissance for getting old texts into circulation and generating new information, but much less care than the early humanists took in making sure that the information is complete, accurate, or discriminating. And it seems as though this is what traditionalists and futurists argue about, endlessly.
This, at least, is the tension in Peter Green's TLS review of Grafton's new book, Worlds Made By Words, which contains an expanded version of that New Yorker essay, plus plenty of tasty goodness about Renaissance humanists like Leon Battista Alberti, or Justus Lipsius, a Flemish philologist who "offered to recite the text of Tacitus with a knife held to his throat, to be plunged in if he made a mistake." Green's review is titled "Google Books or Great Books," and it offers a nice peek into what Grafton's all about. Here's a slice of the good:
An editor at Cambridge University Press, reputedly the world’s oldest publisher, cheerfully admitted to Grafton that, conservatively, “95 percent of all scholarly enquiries start at Google”. Which, as Grafton says, “makes sense: Google, the nerdiest of corporations, has roots in the world of books”, to the point where (if you throw in Amazon and one or two others) “the Web has become a vast and vivid online bookstore”... Today all would-be members of the Republic of Letters, all hopeful explorers of past history, have, in a literal sense, the world at their fingertips. As Grafton says, “it is more than transformative to sit in your office at a small liberal arts or community college and call up, as you already can, thousands of books in dozens of languages, the nearest material copy of which is hundreds of miles away”.
And the bad:
Scanning by optical character recognition, ironically, commits some of the same errors as those made by careless medieval scribes, including long “s” read as “f” (German scholarship sometimes appears as Wiffenschaft), and the confusion of u and n. Thus, key in the meaningless qnalitas for qualitas (a key term in medieval philosophy) and you get over 600 hits for qualitas which you would miss if you only keyed in the correct word. Much of the old German spiky Gothic black-letter material (Fraktur) comes out in “plain text” as gobbledegook.

Which Grafton synthesizes in a really lovely way, as follows:
Yes, the young scholar is told, take every advantage of the new electronic Aladdin’s cave. But – and here Grafton shows a rare moment of deeply felt emotion – these streams of data, rich as they are, will illuminate rather than eliminate the unique books and prints and manuscripts that only the library can put in front of you. For now, and for the foreseeable future, if you want to piece together the richest possible mosaic of documents and texts and images, you will have to do it in those crowded public rooms where sunlight gleams on varnished tables, as it has for more than a century, and knowledge is still embodied in millions of dusty, crumbling, smelly, irreplaceable manuscripts and books.
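Grafton's OCR examples suggest a practical half-fix, by the way: expand a search query with the scribe-style confusions (u↔n, long s→f) before searching a scanned corpus. A minimal, hypothetical sketch - not anything Google actually does:

```python
from itertools import product

# Common OCR confusions in scans of early print (assumed symmetric swaps)
CONFUSIONS = {"u": "un", "n": "nu", "s": "sf", "f": "fs"}

def ocr_variants(word, cap=100):
    """Generate the spellings a careless OCR pass might have stored for `word`."""
    choices = [CONFUSIONS.get(c, c) for c in word]
    variants = set()
    for combo in product(*choices):
        variants.add("".join(combo))
        if len(variants) >= cap:
            break
    return variants

# Searching for "qualitas" should also catch the misread "qnalitas"
print(sorted(ocr_variants("qualitas")))
```

Search each variant and merge the hits, and those 600 lost instances of qualitas come back.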
Here's a thought. One term that I think we can use to bring together the digital enthusiasts and the traditional scholars - besides humanism, which I still think is a super-powerful idea - is standards. One thing web and software people are actually surprisingly good at, considering the libertarian ethos that drives a lot of the best work, is establishing standards of mutual interoperability.
What if scholarly bodies like the Modern Language Association, American Library Association, American Historical Association, etc., worked together with the tech guys to establish standards for digital scholarly texts in their fields? Work to verify the scans, establish the bibliographies (it would really help to know, for example, if a full-preview book in Google Books is actually from a pirated or faulty edition), and verify the results? Hashtags for scanned books!
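As a toy version of that verification layer - purely hypothetical, not any existing MLA/ALA scheme - you could fingerprint each verified edition by hashing its bibliographic record together with its page text, so a scan floating around the web could be checked against a scholarly body's canonical hash:

```python
import hashlib
import json

def edition_fingerprint(metadata, page_texts):
    """Hypothetical: derive a stable ID for a verified scan of one edition."""
    record = {
        "metadata": metadata,  # bibliographic facts about the edition
        "text_sha256": hashlib.sha256(
            "\n".join(page_texts).encode("utf-8")
        ).hexdigest(),
    }
    # Canonical JSON so the same edition always hashes identically
    canonical = json.dumps(record, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

meta = {"title": "Worlds Made by Words", "year": 2009, "publisher": "Harvard UP"}
fp = edition_fingerprint(meta, ["page one text", "page two text"])
print(fp[:12])  # a short "hashtag" for the scanned edition
```

A pirated or faulty edition would hash to something else, and the mismatch is your red flag.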
I think that could be beautiful.
File under: Books, Writing & Such, Learnin', Technosnark
July 12, 2009
Romance, Manuscripts, and Cyborgs
Virginia Heffernan says that internet romances "are not romances between people at all. They’re affairs with the Internet" - like World of Warcraft, where you become your own avatar:
O Computer World! At its most elementary, it’s a marvelous place, filled with risk and surprises and novelty, unbounded by space and time, where you can be a teenager again, trade gossip, avoid your overseers, gab to friends and boyfriends — all while pretending to do homework. What a perfect realm for puppy love or love with that Sanford-patented “soul-mate feel” — unconsummated love, in other words. By removing the body from relationships, electronic communication makes romantic love less animal. The lovers’ discourse becomes simultaneously more childlike and more intellectual, more spiritual.
As Chapur wrote to Sanford, “I haven’t felt this since I was in my teen ages.”
Epistolary romance seems to have existed as long as romance itself. But letters — the ink-on-paper kind, the kind Byron and Anaïs Nin wrote — had a dense materiality, with handwriting that always suggested the beloved’s hand and thus her body. Besides, wasn’t writing paper always being supplemented with dried flowers, locks of hair and wafts of perfume? It’s not clear whether mp3 love songs or links to insightful blog posts, the value-adds that now come with love e-mail, contribute a sensory dimension or only amplify how nerdy and how platonic digital romance is.
The connection to communications technology — the connection to connection — has become part of what makes us human. In the idiom of those who are swooningly in love, it makes us “feel alive.” When we’re denied the connection to connection, it’s no wonder we lust for it. Probably the pundits are wrong: there’s no special problem with marriage or romance in this country right now. Instead, our current bind is with offline reality — real life. We’ve been cheating on it, all of us, for a long time, living in a wireless fairyland where we r all so giddily hot.
I wrote my college girlfriend love letters over two summers, when she was in Texas and I was in Michigan (and then London). It is completely different. But I fell in love with her, not least because in our freshman year she was my first constant email correspondent.
Manuscript is different. I disagree, though, about the total virtualization/dematerialization of the body on the internet. At one time - exchanging flirtatious glances on Friendster, or staring into a telnet terminal in your campus computer lab - that was true. There was something cold and immaterial about that world, where you had to wait hours for a response, when you couldn't take an email with you without sheepishly printing it out on a dot-matrix.
But the ubiquity and intimacy of our net-connected objects have changed that. Heffernan's friend hands her his Blackberry with a note from his mistress, and she recoils: "I didn’t like holding the device. It felt hot and even damp, as if it had been inside a human body. Lots of erotic energy was going into that thing." It's a secondary physicality, a different kind of fantasy of immediacy - a love letter that can reach your beloved wherever she is, finding her not at her office desktop but in her purse or pants pocket. And when your phone vibrates with her new message, you have received something real, something you can touch.
File under: Books, Writing & Such, Object Culture, Society/Culture, Technosnark
July 11, 2009
Britta Gustafson, "Learning to see wooden poles":
When I’m not in a rush to get somewhere, I look up at the tops of telephone poles. I don’t know anything about electricity, but I find myself reading glossaries of linemen’s slang and technical definitions, learning how to refer to the grey buckets that transform electricity for home use (cans, bugs, distribution transformers) and how to identify several other pole features, especially different varieties of shiny ceramic insulators.
It's a really nice photo-essay, with little detours about the pleasures of walking, childhood memories of the Mister Rogers crayon factory documentary, and generally finding joy in "functional and authentic technical equipment, the more elaborate and less appreciated the better."
My grandfather was (and my uncle is) a lineman for Detroit Edison, so like Britta, I find power lines really fascinating. The general tendency of this century has been to make our infrastructure and industry more invisible and remote, even as they become more individualized and less communal. (Think about riding a train versus driving a car.) Utility lines, when you notice them, spell out the lie in all that. Of course, they're most conspicuous when they stop working. (Actually, they're really conspicuous when they're knocked over in a shower of sparks and flame, but that's a special case.)
One of my favorite parts in Terry Zwigoff's documentary Crumb is when R. Crumb explains how he takes photographs of ordinary buildings and street corners - apartments, gas stations, strip malls - so he can use them as reference for adding details like telephone and electrical poles, junction boxes, gutter grates. Otherwise, he says, you forget about these things; it's as if they were never there.
File under: Beauty, Cities, Design, Object Culture, Recommended, Technosnark
July 4, 2009
Evolution 2.0 (and 3.0 beta)
This is kind of a cool idea. Let's say that evolution writ large is only accidentally about the preservation, transmission, and development of living species, but essentially about the preservation, transmission, and development of information. On this view, organisms are just a means to an end, particularly well-adapted couriers for all of this chemical data.
If that's the case, then maybe there isn't anything particularly special about the specific form of that data (i.e. DNA) or the way it's been transmitted in humans (sexual reproduction). That's just one way of doing things - in nonconscious, nonverbal, or nonhistorical species, genetic transmission, instinct, inherited traditions are the only means you've got. But once modern humans arrive on the scene, with all their increasingly sophisticated means of representing information, then Evolution 1.0, internal transmission of information, isn't the only game in town -- you've also got Evolution 2.0, characterized by the external transmission of information.
Once you reframe evolution in this way, then you can say that our species' rate of evolution "over the last ten thousand years, and particularly... over the last three hundred" is actually off the charts.
So the guy who's arguing this is a physicist named Stephen Hawking. (Maybe you've heard of him - he's awfully smart, and was part of Al Gore's Vice Presidential Action Rangers.) He also says that our tinkering with evolution ain't over:
[W]e are now entering a new phase, of what Hawking calls "self designed evolution," in which we will be able to change and improve our DNA. "At first," he continues "these changes will be confined to the repair of genetic defects, like cystic fibrosis, and muscular dystrophy. These are controlled by single genes, and so are fairly easy to identify, and correct. Other qualities, such as intelligence, are probably controlled by a large number of genes. It will be much more difficult to find them, and work out the relations between them. Nevertheless, I am sure that during the next century, people will discover how to modify both intelligence, and instincts like aggression."
If the human race manages to redesign itself, to reduce or eliminate the risk of self-destruction, we will probably reach out to the stars and colonize other planets. But this will be done, Hawking believes, with intelligent machines based on mechanical and electronic components, rather than macromolecules, which could eventually replace DNA based life, just as DNA may have replaced an earlier form of life.
I can't decide if this is totally anthropocentric, or exactly the opposite. But it's kind of exciting, isn't it? I'm evolving the species right now, just by typing this! And so are you, by reading it! And so are Google's nanobots, by recording all of it in their fifteenth-gen flash brains!
File under: Books, Writing & Such, Language, Science, Technosnark
July 2, 2009
Geeking Out, c. 1990
I love this; Hewlett-Packard is selling an exact copy of its HP-12C financial calculator for the iPhone.
The iPhone version of the HP-12C is a near carbon copy of the actual machine. It not only looks the same, but it actually runs the same code as do the physical calculators. The iPhone version is actually a bit better than just a clone of the original, though, because HP includes a simplified portrait-mode calculator (the 12C is a landscape-mode device). When used in portrait mode, you can use the number keys, along with all the usual math operators and a couple of other functions such as square roots and memory—perfect for those times when you just need a basic calculator.
The real power of the HP-12C is found when you rotate your iPhone to landscape mode; what appears on the screen then is a photographic reproduction of the actual HP-12C calculator, complete with the gold-brown-orange-blue color scheme that made the original so…endearing? Because the app uses the actual calculator’s code, absolutely everything works just like it does on the real calculator.
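For what it's worth, the 12C - like HP's other classic calculators - is a Reverse Polish Notation machine: no equals key, just a stack. The core evaluation loop is simple enough to sketch (a toy version, obviously not HP's actual firmware):

```python
def rpn_eval(tokens):
    """Evaluate a postfix (RPN) expression the way HP's classic
    calculators do: operands push onto a stack, operators pop two."""
    ops = {
        "+": lambda a, b: a + b,
        "-": lambda a, b: a - b,
        "*": lambda a, b: a * b,
        "/": lambda a, b: a / b,
    }
    stack = []
    for tok in tokens:
        if tok in ops:
            b = stack.pop()
            a = stack.pop()
            stack.append(ops[tok](a, b))
        else:
            stack.append(float(tok))
    return stack[-1]

# "3 ENTER 4 +" on an RPN calculator
print(rpn_eval(["3", "4", "+"]))
```

Because every operator consumes the stack immediately, there's no precedence parsing at all - which helps explain how the original's code could stay small enough to survive, unchanged, for decades.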
I used a calculator just like this to win a middle school mathematics competition - in those days, it was called a "Calculator Competition," because you could (gasp!) use a calculator. There was a school-wide thing, then a regional, and then a state final; it was a whole thing. The state final was the first time I'd ever seen a graphing calculator; that shiz blew my mind.
June 25, 2009
The Future Is All Filters
I made my Iran dashboard because I needed a better filter for Iran news. But filters aren't just for tracking global tumult; people need them on all levels. For example: My sister, an ultra-busy grad student and dancer, doesn't really have time to read Snarkmarket.
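At bottom, a news filter of the dashboard sort is just keyword scoring over feed items. A minimal sketch (hypothetical - not the actual dashboard code):

```python
def filter_feed(items, keywords, threshold=1):
    """Score each item by how many watched keywords it mentions;
    keep only items at or above the threshold. A crude 'better filter'."""
    def score(item):
        text = (item["title"] + " " + item["summary"]).lower()
        return sum(text.count(k.lower()) for k in keywords)
    return [it for it in items if score(it) >= threshold]

items = [
    {"title": "Protests continue in Tehran", "summary": "Iran election fallout"},
    {"title": "Cubs lose again", "summary": "baseball"},
]
print([it["title"] for it in filter_feed(items, ["Iran", "Tehran"])])
```

Swap in your sister's keywords and her threshold, and you've got a personal Snarkmarket digest.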
No you cannot unsubscribe from this feed and sign up for that one. I'm going to know if you do. We have analytics for these things.
June 13, 2009
Path-Dependence, Increasing Returns, and Technological Competition
I think anyone interested in technological change ought to read W. Brian Arthur's legendary paper on path-dependence (PDF) :
Modern, complex technologies often display increasing returns to adoption in that the more they are adopted, the more experience is gained with them, and the more they are improved. When two or more increasing-return technologies "compete" then, for a "market" of potential adopters, insignificant events may by chance give one of them an initial advantage in adoptions. This technology may then improve more than the others, so it may appeal to a wider proportion of potential adopters. It may therefore become further adopted and further improved. Thus it may happen that a technology that by chance gains an early lead in adoption may eventually "corner the market" of potential adopters, with the other technologies becoming locked out. Of course, under different "small events"--unexpected successes in the performance of prototypes, whims of early developers, political circumstances -- a different technology might achieve sufficient adoption and improvement to come to dominate. Competitions between technologies may have multiple potential outcomes...
The argument of this paper suggests that the interpretation of economic history should be different in different returns regimes. Under constant and diminishing returns, the evolution of the market reflects only a-priori endowments, preferences, and transformation possibilities; small events cannot sway the outcome. But while this is comforting, it reduces history to the status of mere carrier--the deliverer of the inevitable. Under increasing returns, by contrast, many outcomes are possible. Insignificant circumstances become magnified by positive feedbacks to "tip" the system into the actual outcome "selected". The small events of history become important. Where we observe the predominance of one technology or one economic outcome over its competitors we should thus be cautious of any exercise that seeks the means by which the winner's innate "superiority" came to be translated into adoption...
Under increasing returns, competition between economic objects--in this case technologies--takes on an evolutionary character, with a "founder effect" mechanism akin to that in genetics. "History" becomes important. To the degree that the technological development of the economy depends upon small events beneath the resolution of an observer's model, it may become impossible to predict market shares with any degree of certainty. This suggests that there may be theoretical limits, as well as practical ones, to the predictability of the economic future. (all emphases mine)
Here Arthur uses the examples of nuclear reactors and steam-vs-petrol car engines -- other classic examples are the QWERTY keyboard and the Microsoft OS, both cases where learning effects and coordination costs might lock in an inferior (or at least quirky) product. (I'm also rereading Henry Petroski's The Evolution of Useful Things, which takes a similar historical-accident-over-essential-function approach to design history.)
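Arthur's dynamic is easy to see in a toy simulation - my own sketch, not from the paper - where each new adopter is disproportionately attracted to whichever technology already has the bigger installed base:

```python
import random

def simulate_adoption(steps=5000, seed=1):
    """Toy model of increasing-returns competition between technologies
    A and B: each new adopter favors the one with the larger installed
    base, so insignificant early events get magnified into lock-in."""
    random.seed(seed)
    a, b = 1, 1
    for _ in range(steps):
        share_a = a / (a + b)
        # Increasing returns: attractiveness grows faster than market share
        p_a = share_a**2 / (share_a**2 + (1 - share_a)**2)
        if random.random() < p_a:
            a += 1
        else:
            b += 1
    return a / (a + b)

# Final market share of A under different runs of "small events"
print([round(simulate_adoption(seed=s), 2) for s in range(4)])
```

Different seeds - different runs of "small events" - can tip the market to different winners; the "superior" technology is just the one that got lucky early.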
File under: Design, Object Culture, Technosnark
June 12, 2009
The Original Technocrats
Alexis Madrigal (@alexismadrigal) points to an article by John G. Gunnell about the history of technocracy:
The term "technocracy," though originated in the United States in 1919 by an engineer named William Smith, first became common when it was adopted by a movement that developed in the early 1930s as a response to the Great Depression. That movement, which for a time gained considerable notoriety and a substantial following, began with a group of technicians and engineers dedicated to social reform whose concepts were modeled on the technological republic in Edward Bellamy's late-19th-century utopian novel Looking Backward. They were also influenced by the economic theories of Thorstein Veblen and the principles of scientific management growing out of the work of Frederick W. Taylor, both of which suggested, much like the later work of James Burnham in The Managerial Society, that politicians and industrial entrepreneurs should, and would, give way to technical elites. Although the movement may have appeared somewhat bizarre, it reflected a characteristic American faith in the compatibility of technology and civic vitality. The aim was to abolish corrupt politics and an obsolete economic system and expand administrative and technical rationality. "Technocracy" has been applied retrospectively to many of the technological utopias and dystopias that are so persistent a feature of Western literature and political theory.
It's sometimes easy for us to forget that the early twentieth century was a time of huge media revolutions -- radio, cinema, phonographs, among others -- and that the engineer was very much at the center of it. There was also, I think, a really powerful charismatic quality associated with scientists, inventors, and capitalists, of the secular-aristocracy-without-history mode previously available probably only fully to generals. I mean, Steve Jobs had nothing on Thomas Edison. That dude literally appeared to be a magician. (For a great take on Edison-as-magician-inventor, see Villiers de l'Isle-Adam's novel, The Future Eve -- part of the inspiration for Fritz Lang's Metropolis.)
Also, something I try to keep in mind is that then as now, "bureaucracy" is really used in two senses -- both pejorative, sure, but functionally distinct. Bureaucracy can be cold, efficient, disciplined -- in short, inhuman. But bureaucracy can also be petty, irregular, inefficient, feudal. You can be subject either to the impersonality of the machine or the fickle whims or incompetencies of an individual.
Traditionally, bureaucrats were minor officials, positions traded within and among families, indifferent to rules guiding their idiosyncrasies -- think about Kafka's The Trial, and it's pretty clear that this is the kind of bureaucrat most of us truly dread. Max Weber's model for the perfect bureaucracy wasn't the modern office but the modern army. And when you think about the idea of a civil servant -- professional, well-qualified, uncorrupt, willing to sacrifice for the public good, fastidious about following process and law -- you can see the ethos of military discipline in a positive sense.
I wonder whether the idea and ideal of the technocrat - the true social engineer - is dead for us. What kinds of technologies would genuinely revolutionize -- aw, that's saying too much -- substantively improve our politics, communities, society? Could an inventor genius somehow come along and charm us all once again?
June 3, 2009
Google Wave Solo
Is there anything lonelier than a Google Wave developer sandbox account with only one user?
No. No there is not.
I thought I was going to be able to invite other people, but alas. Sandbox for one. This is creeping me out. So long, Wave. See you when you actually get released.
June 1, 2009
The Earth is Hiring (Extended Remix)
I gave Paul Hawken's "the earth is hiring" commencement speech mixed marks, but I feel like I should upgrade my assessment, because it did one of the best things any piece of rhetoric can do: It started an interesting conversation.
I have never been able to warm to an argument that posits "the Earth" as a central player. The earth is not hiring.
Rather, each graduate will help build a world from the materials left to them from past generations of humans and other living creatures. Their challenge is to work together to build a good world for themselves and for the next generations that will come.
Tim called this the "now do it bigger, and more humble" approach... and I can already tell that this is going to become a recurring phrase on Snarkmarket.
But Saheli says:
...but I also think the reason why that too big/more humble canvas doesn't work for many people is their brains are not widescreen enough to properly count disappearing possibilities; and their engines are not rational enough to abstain from some large source of affection, approval and courtship. By Deifying the Earth and enumerating Her gifts, Hawken provides that external motivator and waves away the necessity for rationally understanding the dangers of failure. So I understand your critique, but I can see why Hawken's metaphorical fancy makes more sense for a large class of college graduates.
"Their engines are not rational enough." What a great phrase.
From there we get into supernova-prevention schemes and the ethics of museum guards with guns. This is a thread you gotta read.
May 30, 2009
Finally, You Too Can Be Marcus Aurelius
I am a sucker for long histories, especially when they're summarized with simple schema. Philip Greenspun wrote this for a talk on how the internet has changed writing, under the subhead "Publishing from Gutenberg (1455) through 1990":
The pre-1990 commercial publishing world supported two lengths of manuscript:
- the five-page magazine article, serving as filler among the ads
- the book, with a minimum of 200 pages
Suppose that an idea merited 20 pages, no more and no less? A handful of long-copy magazines, such as the old New Yorker, would print 20-page essays, but an author who wished his or her work to be distributed would generally be forced to cut it down to a meaningless 5-page magazine piece or add 180 pages of filler until it reached the minimum size to fit into the book distribution system.
In the same essaylet, Greenspun has a subhead, "Marcus Aurelius: The first blogger?":
Marcus Aurelius, Roman Emperor from 161 AD to 180 AD, kept a journal during a military campaign in central Europe (171-175). It was not available until after his death and not widely available until printed in 1558 as the Meditations...
This was preserved because the author had been Emperor. How much ancient wisdom was lost because the common Roman citizen lacked TCP/IP? [By 1700 BC, the Minoans were trading with Spain, had big cities with flush toilets, a written language, and moderately sophisticated metalworking technology. Had it not been for the eruption of Thera (on Santorini), it is quite possible that Romans would have watched the assassination of Julius Caesar on television.]
It's not all since-the-dawn-of-civilization stuff -- there are lots of examples of writing that really only works on the internet and more pedestrian things like the virtues of blogs over Geocities. "Webloggers generally use a standard style and don't play with colors and formatting the way that GeoCities authors used to." This shows how in the weblog, content becomes more important than form. (Psst-- It also suggests that if Minoan civilization had survived and spread, Augustine's Confessions might have been excerpted on a lot of home pages with lots of crappy animated GIFs.)
Via Daring Fireball.
File under: Books, Writing & Such, Learnin', Object Culture, Technosnark
May 24, 2009
Two Visions Of Our Asian Future
Looking to the east for clues to the future (or the past) of the west isn't the least bit new, but these two recent takes (both in the NYT, as it happens) offer some interesting contrasts.
First, Paul Krugman looks at Hong Kong:
Hong Kong, with its incredible cluster of tall buildings stacked up the slope of a mountain, is the way the future was supposed to look. The future — the way I learned it from science-fiction movies — was supposed to be Manhattan squared: vertical, modernistic, art decoish.
What the future mainly ended up looking like instead was Atlanta — sprawl, sprawl, and even more sprawl, a landscape of boxy malls and McMansions. Bo-ring.
So for a little while I get to visit the 1950s version of the 21st century. Yay!
But where are the flying cars?

Second, a look at South Korea, where the cellphone is swallowing everything:
In the subway, Ms. Kim breezes through the turnstile after tapping the phone on a box that deducts the fare from a chip that contains a cash balance. While riding to school, she uses her mobile to check if a book has arrived at the library, slays aliens in a role-playing game, updates her Internet blog or watches TV.
On campus, she and other students touch their mobiles to the electronic box by the door to mark their attendance. No need for roll call — the school’s server computer logs whether they are in or how late they are for the class.
“If I leave my wallet at home, I may not notice it for the whole day,” said Ms. Kim, 21. “But if I lose my cellphone, my life will start stumbling right there in the subway.”
It has been a while since the mobile phone became more than just a phone, serving as a texting device, a camera and a digital music player, among other things. But experts say South Korea, because of its high-speed wireless networks and top technology companies like Samsung and LG, is the test case for the mobile future.
“We want to bring complex bits of daily life — cash, credit card, membership card and student ID card, everything — into the mobile phone,” said Shim Gi-tae, a mobile financing official at SK Telecom, the country’s largest wireless carrier. “We want to make the cellphone the center of life.”
It was easier in the 1950s for Americans to imagine flying cars than it was to imagine cashless subways. Hell, it may still be easier.
Height or distance? The billboard ad or the cellphone ad? Physical mobility or mobility of information? The skyscraper or the network?
File under: Cities, Object Culture, Society/Culture, Technosnark, Worldsnark
May 12, 2009
The Most Inverted Pyramid of All
Heh heh. I like the NYT's new TimesWire feed because it grants you a glimpse of parallel incarnations of the same article. If you do any work whatsoever with web content, this little pair will be all too familiar:
The top headline and description, all grace and wit. The bottom headline and description... all blunt Google-juice.
File under: Journalism, Media Galaxy, Technosnark
May 10, 2009
The Ideas! The Ideas! Part... Whatever
Charlie Jane Anders, "Why Dollhouse Really Is Joss Whedon's Greatest Work":
The evil in Dollhouse is harder to deal with than the evil in Buffy because it's our evil. It's our willingness to strip other people of their humanity in order to get what we need from them. It's our eagerness to give up our humanity and conform to other people's expectations, in exchange for some vaguely promised reward. And it's our tendency to put any new piece of technology to whatever uses we can think of, whether they're positive or utterly destructive.
And that last bit, about technology, is the other main reason why Dollhouse is Whedon's most accomplished work, especially if you love science fiction like we do. Unlike Joss' other works, Dollhouse really is about the impact of new technology on society. It asks the most profound question any SF can ask: how would we (as people) change if a new technology came along that allowed us to...? In this case, it's a technology that allows us to turn brains into storage media: We can erase, we can record, we can copy. It's been sneaking up on us, but Dollhouse has slowly been showing how this radically changes the whole conception of what it means to be human. You can put my brain into someone else's body, you can keep my personality alive after I die, and you can keep my body around but dispose of everything that I would consider "me."
Kindle Up Your Textbooks, Children
The Chronicle of Higher Education on the Kindle DX and the market for electronic textbooks:
Most college students—more than 80 percent, according to a survey by Educause—already own portable machines that can display electronic textbooks: They're called laptops. And more than half of all major textbooks are already offered in electronic form for download to those laptops.
Yet so far sales of electronic textbooks are tiny, despite efforts by college bookstores to make the option to buy digital versions clearer by advertising e-books next to printed ones on their shelves. "It's a very small percentage of our sales at this point," said Bill Dampier, general manager of MBS Direct, a major textbook reseller.
What the textbook industry needs is the equivalent of an iTunes store for e-books, say some experts, who note that sales of digital music never took off until Apple created the iPod and an easy-to-use online music marketplace. That's why Amazon seems like a promising entrant.
Except for one thing: Publishers have already set up a digital store meant to serve as the iTunes of e-textbooks, and it has been slow to catch on. The online store, called CourseSmart, was started two years ago by the five largest textbook publishers. Now 12 publishers contribute content to the service, which offers more than 6,300 titles. The e-books are all designed to be read on laptops or desktops, rather than Kindles or other dedicated e-book reading devices.
One problem for CourseSmart has been a lack of awareness by both students and professors that the service even exists.
Yep -- sounds about right. You'd think we'd be easy to target, but we're actually not. In fact, probably the ONLY two media/publishing companies with significant overlapping penetration among both students and professors would be Amazon and Apple.
Also of note: the only reason why publishers are really interested in electronic books is that they can use DRM to crush sales of used books beneath their foot forever. (I remember the first book I ever used that required you to register a CD w/ a unique ID number in order to use it; SBS sold it to me at about 75% of cover used and then refused to take it back. I had to buy the new copy again.)
Also also of note: one of the lines Bezos used again and again in his Kindle presentation (from the transcripts I've seen -- anybody know where I could find video?) with respect to textbooks is "structured content." I actually think this is a hugely important idea. A book gives a text physical form, sure, but that physicality works together with paratextual devices to structure its content. Page numbers, title pages, tables of contents, indices, volume and chapter divisions, footnotes/endnotes, captions, commentary, usw.
This is why Project Gutenberg or any other kind of throw-it-up-there text file service will always suck. It's also why a lot of digital archives don't work. We need ways to give content structure, and to make that structure easily and productively navigable to users. Ebooks have suffered from a lack of legitimate and visible marketplaces, but to borrow a metaphor, they've also suffered from really crappy gameplay. Whoever figures out how to solve these problems will solve long-form electronic reading.
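To make "structured content" concrete: the difference between a text dump and a book is roughly the difference between a string and a tree. A hypothetical sketch of the minimum structure an e-book format would need to carry:

```python
from dataclasses import dataclass, field

@dataclass
class Division:
    """One structural unit of a book: volume, chapter, section..."""
    kind: str
    title: str
    children: list = field(default_factory=list)
    text: str = ""

def table_of_contents(node, depth=0):
    """Walk the tree and emit an indented TOC - the navigability
    that a flat text-file dump throws away."""
    lines = [("  " * depth) + f"{node.kind}: {node.title}"]
    for child in node.children:
        lines.extend(table_of_contents(child, depth + 1))
    return lines

book = Division("book", "Worlds Made by Words", [
    Division("chapter", "Codex in Crisis", [
        Division("section", "Google and the Republic of Letters"),
    ]),
])
print("\n".join(table_of_contents(book)))
```

Flatten the tree into one big string and you get Project Gutenberg; keep it, and readers (and search engines) can actually navigate.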
File under: Books, Writing & Such, Design, Learnin', Marketing, Object Culture, Technosnark, Video Games
May 6, 2009
Gina Trapani hits on what might turn out to be Twitter's killer feature:
When you post a question on Twitter and get a dozen replies within the next 10 minutes from actual humans–some of whom you know and trust–it’s waay better than impersonal Google search results.
If about.com shows you what random dudes think, Wikipedia shows you what nobody in particular thinks, and Google shows you what everybody thinks, Twitter shows you what the people you trust think. Who needs Wolfram Alpha or the semantic web when you've got real, live people whom you can ask complicated open-ended questions? You can keep the wisdom of crowds -- I'll take the wisdom of MY crowd.
The only trouble with this is that the answers stay bottled up in the little group. Google might not have the personal touch, but at least everyone can benefit from it.
But wait; Trapani's got you covered:
After 1,700 posts and two years on Twitter, this insta-Q&A is my favorite use of the service–except I always want to share what I learn from my followers, and it’s not easy. My post on what people love and hate about netbooks, sourced entirely from Twitter replies, took me hours to compile manually, because Twitter doesn’t easily list replies to a particular “tweet” in a very readable or republishable format. So this weekend I dug into the service’s API to make that happen. Using Kevin Makice’s new book, Twitter API: Up and Running, after just a day of coding I had my entire Twitter archive plus replies ready for viewing and publishing.
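Once you've pulled the raw archive down, threading replies under their parent tweets is mostly bookkeeping. A minimal sketch -- it relies on the in_reply_to_status_id field Twitter attaches to each tweet object, and the sample archive here is invented for illustration:

```python
# Group a tweet archive into reply threads using the
# in_reply_to_status_id field on each tweet object.
from collections import defaultdict

def thread_replies(tweets):
    """Map each tweet id to the list of tweets that reply to it."""
    replies = defaultdict(list)
    for t in tweets:
        parent = t.get("in_reply_to_status_id")
        if parent is not None:
            replies[parent].append(t)
    return replies

# A made-up miniature archive, shaped like Twitter's JSON objects.
archive = [
    {"id": 1, "text": "What do you love/hate about netbooks?",
     "in_reply_to_status_id": None},
    {"id": 2, "text": "Love the battery life.", "in_reply_to_status_id": 1},
    {"id": 3, "text": "Hate the keyboard.", "in_reply_to_status_id": 1},
]

threads = thread_replies(archive)
for reply in threads[1]:
    print(reply["text"])
```

Once the replies are grouped by parent, rendering them in a readable, republishable format is a plain templating job.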
I like that this is the complete opposite of what Robin did with his Twitter feed a couple of months ago -- not least because it shows that while the basic principle of Twitter is extraordinarily simple, the implementations of it are varied enough to be tremendous.
What we need now, though, are Twitterhacks for the rest of us! Most of us don't have a day to devote to coding this stuff, even if we knew how to code in the first place. We need an ecosystem of smart implementations and variations that build on this simple infrastructure. We need these more than 101 different spiffy backgrounds or client apps.
So... what happens next?
May 2, 2009
"The Problem With Cable Is Television"
But, it turns out, the problem with television is sports:
The broadband business is doing fine, as costs are coming down. Cable executives do worry that if costs rise as they expect because of surging online video use, they will need to find some way to get prices going up the way they are used to in their video business.
The bigger question is what happens to the video business. By all accounts, Web video is not currently having any effect on the businesses of the cable companies. Market share is moving among cable, satellite and telephone companies, but the overall number of people subscribing to some sort of pay TV service is rising. (The government's switch to digital over-the-air broadcasts is providing a small stimulus to cable companies.) However, if you remember, it took several years before music labels started to feel any pain from downloads...
The wedge that breaks all this may well be sports. ESPN alone already accounts for nearly $3 of every monthly cable bill, industry executives say. With all these new sports networks pushing up cable rates, at some point people who aren't sports fans might start turning in volume to Internet services like Netflix. We're not there yet, but looking at the industry in the last quarter, you can see the pressures building.
Fascinating (and quick!) look at cable companies' businesses. [Everything in bold is my emphasis.]
April 30, 2009
You Want Bookporn? Oh, Man. We Got Some Bookporn.
VERY mature books (is 8000 BC old enough?) with an astonishingly sexy zoom feature -- similar to Google Maps, but smoother and more natural, especially with a two-finger trackpad. It's all yours, for free, at the World Digital Library.
April 21, 2009
Ink: Flock/Songbird For Writing
I gave a presentation to my students today on writing and research tools, doing what I always do -- apologizing for the limitations of every single thing that I showed them. Zotero is pretty good at building a research database -- but you can't use it to write. MS Word 2008 is a champ for layout and even does a good job at formatting bibliographies -- but it sucks for organizing research or pulling data from an application. Scrivener is a good place to organize research or notes and build drafts -- but it turns PDFs into pictures and doesn't really handle citations. Yep and Papers are great PDF organizers, but not much else. (I didn't even want to get into DevonThink.) But Papers builds in a WebKit browser, so you can do research and navigate into online databases and plug anything you find right into your library.
This feels like the big conceptual leap. We're finding our information on the web. We're writing our documents on the web. We're storing our data on the web. We're using the web to collaborate on docs. But while online storage and collaboration are winners, AJAX writing apps kind of suck. They're low-powered exactly where we need the full power of a rich client. We don't just need more formatting and layout options; we need to be able to manage databases, for research and reading material, and lots of interconnected projects that bridge online and offline work.
What I want is just what my title says: a specialized browser-based client devoted to writing.... Read more ....
File under: Books, Writing & Such, Collaborations, Learnin', Technosnark
April 17, 2009
The Pathos Of Twitter
Virginia Heffernan looks deep into the Twitterverse and doesn't like everything she finds:
The "ambient awareness" that Twitter promotes -- the feeling of incessant online contact -- is still intact. But the emotional force of all this contact may have changed in the context of the economic collapse. Where once it was "hypnotic" and "mesmerizing" (words often used to describe Twitter) to read about a friend's fever or a cousin's job complaints, today the same kind of posts, and from broader and broader audiences, seem... threatening. Encroaching. Suffocating. Twitter may now be like a jampacked, polluted city where the ambient awareness we all have of one another's bodies might seem picturesque to sociologists (who coined "ambient awareness" to describe this sense of physical proximity) but has become stifling to those in the middle of it.
I only subscribe to a handful of Twitter feeds -- about twenty, almost all people I've met and known for years -- and I protect my updates, partly to ward off feeling this way. However, I still can't escape whiners like me:
In the old days, Facebook updaters and Twitterers mostly posted about banal stuff, like sandwiches. But that was September. It's spring now. Look at Twistori, a new site that sorts and organizes Twitter posts that use emotionally laden words like "wish" or "hate" or "love," thereby building an image of the collective Twitter psyche. The vibe of Twitter seems to have changed: a surprising number of people now seem to tweet about how much they want to be free from encumbrances like Twitter. "I wish I didn't have obligations," someone posted not long ago. "I wish I had somewhere to go," wrote another. "I wish things were different." "I wish I grew up in the '60s." "I wish I didn't feel the need to write pointless things here." "I wish I could get out of this hellhole."
Exactly. Obviously, people use Twitter to do different things. A professor of mine has, I think, perfected it as an art of academic self-promotion -- linking not just to new posts but old articles, interviews, projects, etc. But one thing that scares me about the way that I use it is that I often find myself being brutally honest about my feelings -- like I'm in therapy with Wonder Woman's lasso wrapped around my brain.
For every detached quip like "tcarmody thinks Proust would have been a great blogger. Joyce? Not so much," there's a strain of sentimentality ("tcarmody is watching my son play catch with my sister, who taught me how to play catch when I was a little boy"), self-pity ("tcarmody is recovering from surgery and apparently is pissing off everyone in his life. If you're going to be useless, don't be cranky too"), petty complaints ("tcarmody will not give up cream in his coffee. Will. Not."), and full-blown existential dread: "tcarmody is trying and failing to call in friendships and favors. Help. I need help"; "tcarmody is deeply uncomfortable and entirely alone."
Heffernan pulls back from this conclusion and settles for a vexed explanation based on long-felt class anxieties. I think something else is at work. Maybe it isn't a new epoch in the history of being, but it is SOMETHING. This isn't just ordinary moaning. Is it?
April 16, 2009
Jigsaw-Fragment Models Of Tomorrow
Ozymandias on the history of tabbed browsing:
Multi-screen viewing is seemingly anticipated by Burroughs' cut-up technique. He suggested re-arranging words and images to evade rational analysis, allowing subliminal hints of the future to leak through... An impending world of exotica, glimpsed only peripherally.
Perceptually, the simultaneous input engages me like the kinetic equivalent of an abstract or impressionist painting... Phosphor-dot swirls juxtapose: meanings coalesce from semiotic chaos before reverting to incoherence. Transient and elusive, these must be grasped quickly.
This jigsaw-fragment model of tomorrow aligns itself piece by piece, specific areas necessarily obscured by indeterminacy. However, broad assumptions regarding this postulated future may be drawn. We can imagine its ambience. We can hypothesize its psychology. In conjunction with massive forecasted technological acceleration approaching the millennium, this oblique and shifting cathode mosaic uncovers the blueprint for an era of new sensations and possibilities. An era of the conceivable made concrete...
... And of the casually miraculous.
File under: Books, Writing & Such, Media Galaxy, Technosnark
Library Culture / Information Culture
LIBRARY CULTURE || INFORMATION-RETRIEVAL CULTURE

1. a. quality of editions; b. perspicuous description to enable judgment; c. authenticity of the text || Access to everything: a. inclusiveness of editions; b. operational training to enable coping; c. availability of texts

2. a. disciplinary standards; b. stable, organized, defined by specific interests || a. user friendliness; b. hypertext -- following all lines of curiosity

3. a. preservation of a fixed text || a. intertextual evolution; b. surfing the web
It is clear from these opposed lists that more has changed than the move from control of objects to flexibility of storage and access. What is being stored and accessed is no longer a fixed body of objects with fixed identities and contents. Moreover, the user seeking the information is not a subject who desires a more complete and reliable model of the world, but a protean being ready to be opened up to ever new horizons. In short, the postmodern human being is not interested in collecting but is constituted by connecting.
The chart is from an apparently unpublished lecture by computer scientist extraordinaire Terry Winograd; the commentary is by Heidegger scholar extraordinaire Hubert Dreyfus.
File under: Books, Writing & Such, Object Culture, Technosnark
April 11, 2009
Loss Of Service
Matt Richtel whines:
Technology is rendering obsolete some classic narrative plot devices: missed connections, miscommunications, the inability to reach someone. Such gimmicks don’t pass the smell test when even the most remote destinations have wireless coverage. (It’s Odysseus, can someone look up the way to Ithaca? Use the "no Sirens" route.)
Of what significance is the loss to storytelling if characters from Sherwood Forest to the Gates of Hell can be instantly, if not constantly, connected?
Plenty, and at least part of it is personal. I recently finished my second thriller, or so I thought. When I sent it to several fine writer friends, I received this feedback: the protagonist and his girlfriend can't spend the whole book unable to get in touch with each other. Not in the cellphone era.
Then Christopher Breen whines:
As you may have heard, areas of San Francisco’s South Bay and coast lost their landline, cell phone, and Internet connectivity because an individual or individuals unknown deliberately sliced four fiber optic cables in San Jose, California. This action (currently termed “vandalism”), in addition to unplugging over 50,000 area residents, caused many businesses to shut down and threatened lives because 911 services were out for the better part of the day...
I had no Internet access. I couldn’t call the office to alert my boss that I was off the grid. And my iPhone was no good with its constant No Service heading regardless of where I drove. I was completely unplugged.
Voilà.... Read more ....
File under: Books, Writing & Such, Media Galaxy, Object Culture, Society/Culture, Technosnark, Television
April 10, 2009
Thousand-Dollar Steampunk Idea
Teletwitter (or "Twittergraph"): A multiplatform twitter client that pounds out received tweets like an oldtimey telegraph/teletype machine. Morse code optional. Also sheds punctuation in telegram style & replaces periods with STOP
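The telegram-style text filter, at least, is nearly free. A toy sketch -- the transformation rules here are my own guesses at what "telegram style" should mean:

```python
import re

def telegramize(text):
    """Render a tweet in telegram style: uppercase, periods become STOP,
    other punctuation shed."""
    text = text.upper()
    text = re.sub(r"\.(\s|$)", r" STOP\1", text)   # period -> STOP
    text = re.sub(r"[!?,;:]", "", text)            # shed other punctuation
    return re.sub(r" +", " ", text).strip()        # tidy up spacing

print(telegramize("Landed safely. Meet me at the station!"))
# -> LANDED SAFELY STOP MEET ME AT THE STATION
```

The clattery solenoid hardware is left as an exercise.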
April 3, 2009
Some Of That Information I Actually Need
Joshua Schachter lists several reasons why shortened URLs (those mini-links provided by TinyURL and its children), despite their convenience in some circumstances, are actually pretty bad. And I agree.
Some of Schachter's reasons are technical, related to DNS servers and the code they're written in, and others are more counterfactual - like what happens when a company goes out of business, and all of those links go dead?
But eventually, under the rubric of "usability issues" he gets around to the big one for me: "The clicker can't even tell by hovering where a link will take them, which is bad form."
I don't know about you, but when I'm browsing the web, I hover over links as if each one were a suspect public toilet -- only touching down when I'm sure I know what I'm getting into. I take clicking through VERY seriously. Hovering over a link to get a peek at the URL may not always be perfect information, but to me, it's essential. TinyURLs don't let you do that. You're going to the middle of nowhere. This bothers me, every time.
In response to Schachter, Jason lists what he'd like to change about the way Twitter uses shortened URLs:
With respect to Twitter, I would like to see two things happen:
1) That they automatically unshorten all URLs except when the 140 character limit is necessary in SMS messages.
2) In cases where shortening is necessary, Twitter should automatically use a shortener of their own.
That way, users know what they're getting and as long as Twitter is around, those links stay alive.
Very reasonable ideas, all of these. In general, it seems like Twitter's going to have to create its own rhetoric of linking as powerful as the "@username" designation for links to Twitter users. Maybe a "%sitename" convention in lieu of a shortened URL? Not sure.
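Schachter's going-out-of-business worry is easy to see in miniature: a shortener is nothing but a lookup table, and the short links outlive the table. A toy sketch -- the short.example domain and six-character codes are made up:

```python
import hashlib

class Shortener:
    """Toy URL shortener. The code->URL mapping lives only in this one
    table, which is exactly why links die when the service does."""

    def __init__(self):
        self.table = {}

    def shorten(self, url):
        # Derive a short, stable code from the URL itself.
        code = hashlib.sha1(url.encode()).hexdigest()[:6]
        self.table[code] = url
        return "http://short.example/" + code

    def expand(self, short_url):
        code = short_url.rsplit("/", 1)[-1]
        return self.table.get(code)  # None once the table is gone

svc = Shortener()
short = svc.shorten("https://example.com/some/long/path")
print(svc.expand(short))   # the original URL, while the service lives
svc.table.clear()          # the company folds...
print(svc.expand(short))   # ...and every link goes dead: None
```

Jason's second suggestion amounts to making Twitter the owner of that table, so the links live exactly as long as Twitter does.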
April 1, 2009
Our Phones, Ourselves
I recently had one of those moments where a few disparate thoughts click into place, and I was left with an insight that seems obvious in retrospect. (Trouble is, you never know when those moments actually are obvious, and have occurred to everybody except you ages ago. Forgive me if this is one of those.)
It starts with the mobile "phone," or whatever you want to call it. First, as has been widely remarked, half the world has one. Adoption rates exceed 100 percent in countries from Romania to New Zealand. Here in the US, it's not hard to imagine a near future where smartphones with touchscreens are as ubiquitous as the Nokia bricks of yesteryear.
Here's what strikes me about mobile phones: they correlate pretty well with actual people. To a degree unmatched by a computer and certainly by a landline, a cell phone is a personal device. Every member of a family is likely to have one. Not all that many people carry more than one. Between phone number portability and Google Voice, you can almost imagine a person's phone number becoming an identifier almost as reliable as a Social Security number, certainly more stable than, say, a driver's license ID.
This got me thinking about biometrics. The notion of your mobile phone touchscreen reading your fingerprint isn't exactly new, and these devices are almost made for voice recognition, right? The point is, verifying identity with a mobile device seems like it should be easier and more accurate than it has typically been throughout the digital transition, yes?
We're already paying for things with our cell phones. You can see the vast upsides to voting via cell phone. I'm already jonesing for my cell phone to interface with all my other electronic devices: "Desktop and air conditioner, Matt's on his way home. Work it."
The upshot of all this is that we're hurtling towards a moment when your mobile telecommunications device is entangled with your identity in all sorts of curious ways. What does this mean? What does it mean to be that closely enmeshed with a computer?
And how does that enmeshment implicate our relationships with the telecom industry? It's already squicky enough that I rely on T-Mobile for phone service. I am not at this moment OK with signing on to T-Mobile's Identity™ service.
Perhaps this epiphany occurred much earlier to those of you with iPhones, but it felt novel enough to me to remark upon it. At any rate, "mobile phone" is not cutting it any more. If this thing really is becoming the prime representative of our digital identity, it needs a more accurate rebranding. Nominations?
A Place To Gather (And Use The Printer)
Diana Kimball praises the campus computer lab:
Computer labs offer a combination of connectivity and escape at the same time: they provide a location, a destination, where all of the necessary technological tools are assembled and maintained. They also establish in students' minds the existence of a "computer place" on campus -- the natural place to gravitate toward when your laptop has gotten a virus, or its hard drive has died, or you're wondering how to set up your email client. Here, the IT helpdesk is right in the computer lab, reinforcing that relationship.
With laptops all but ubiquitous, community computer labs may seem frivolous. But that very ubiquity, and its inescapability, means that colleges have a responsibility to respect and support the relationship between students and computers. A computer lab sends a strong signal, offers an obvious location to honor and troubleshoot that relationship, and gives students an alternative to squinting at tiny screens.
An indication of how fast things have changed: when I started college (in 1997), not only did I not own a laptop, I didn't even own a computer. I had never owned a computer. (My first honest-to-goodness PC to call my own came in 2001, my first year of graduate school.) Every paper I wrote was improvised in a computer lab. (Hmm. Maybe I should try that again.)
Here's my vision of the future of the computer lab: rows of ready-to-go machines, yes, but also of laptop kiosks, places where you can plug in and recharge, hook up to the networked printer, and chat with the techs and support staff. Maybe even a floating reference librarian to help with research questions and writing papers. A place to gather, where the communal intellectual energy can hum and crackle and strike down with electric inspiration. And to use the printer.
File under: Learnin', Object Culture, Society/Culture, Technosnark
March 31, 2009
The Age of Ajax
Love this five-year remembrance of the birth of Gmail -- still my favorite thing to use on the web, ever.
March 29, 2009
Metaphors, particularly of the "A is B" variety, are best when they can teach you something you didn't know or fully recognize before -- about either A or B. I think Noam Cohen's "Wikipedia is a City" conceit does the job.
For instance, he tackles the anti-Wikipedia movement:
People don’t treat ineffectual inventions as taboo — that is reserved for things like evolution, alcohol or, yes, cities. And just as the world has had plenty of creationists, temperance societies and ruralists, there is a professional class of Wikipedia skeptics. They, too, have some seriously depraved behavior to expose: Wikipedia represents a world without experts! A world without commercial news outlets! A world lacking in distinction between the trivial and the profound! A world overrun with facts but lacking in wisdom!
It’s all reminiscent of the longstanding accusations made against cities: They don’t produce anything! All they do is gossip! They think they are so superior! They wouldn’t last a week if we farmers stopped shipping our food! They don’t know the meaning of real work!
My favorite, though, is his analysis of one of the Wikipedia core principles, "Assume good faith":
Wikipedia encourages contributors to mimic the basic civility, trust, cultural acceptance and self-organizing qualities familiar to any city dweller. Why don’t people attack each other on the way home? Why do they stay in line at the bank? Why don’t people guffaw at the person with blue hair?
The police may be an obvious answer. But this misses the compact among city dwellers. Since their creation, cities have had to be accepting of strangers — no judgments — and residents learn to be subtly accommodating, outward looking.
Why isn't "assume good faith" a working assumption for the entire internet? Because, you know, people in cities are actually pretty nice. And people on the internet, especially in forums and discussion groups outside of Wikipedia, are often not as nice as they ought to be. The relative civility of Wikipedia should be touted more often as one of its primary virtues.
H/t to Rex at Fimoculous.
Civ, Counterfactual Progress, and the Rolling Katamari Ball of Science
This post is hard to sum up because it's sort of about everything.
Why did science and history unfold the way they did?
Why didn't somebody in China invent the electric light bulb? In an alternate reality with no Edison and, let's say, no America, does anybody invent an electric light bulb?
Is the video game Civilization's "technology tree" a good model for technology and history -- or just a dorky game mechanic? Rob MacDougall had his students think about alternative models. One of his favorites invoked the imagery of Katamari Damacy:
The student's idea was a rolling tech wheel. The spokes of the wheel represented paths of technological development you could pursue -- navigation, metalworking, what have you -- but you also had to adapt to technological contingencies in the form of the various things you rolled over. I'm not sure how this would actually work as a game, but as a crazy Katamari bricolage view of human history, it's fun to wrap your head around.
It all springs forth from a class called Science, Technology, and Global History. There is nothing not to like here. (Thanks for the link, Dan!)
File under: Snarkonomics, Society/Culture, Technosnark, Video Games, Worldsnark
March 26, 2009
Compress Into Diamonds
I've reached the terrible moment. Google Reader has long since stopped telling me how many unread items I have, opting instead for the euphemistic "1000+". I've dumped all the folders I'm willing to dump. I am unwilling to declare bankruptcy, but I don't know how long I can stave off my attention creditors.
Here's what I've come to realize about myself: I fully accept that there's not a particular link in that ridiculous heap that will change my life. It's been a while since I worried about missing a single killer post or app or XKCD or whatever; if it's valuable enough, it'll find me. I get it.
What I most value, and what's most difficult to recreate outside of my RSS reader, is the exchange of perspective that erupts around a particular moment. Tim Geithner outlines a massive bailout plan, and my economists folder becomes an accessible but rigorous debate about scenarios and probabilities and consequences, light years more interesting and enlightening than a cluster of news stories. I found Jake DeSantis' resignation letter and the attendant comments instantly fascinating as a drama about class that doesn't quite resemble any story I remember. But the claims and counter-claims thrown about in the letter and its responses would have been impossible to untangle without the referees in my reader, who shed light even in their disagreement with each other. Atul Gawande's broadside against solitary confinement sparked a characteristically luminous exchange between Ross and Ta-Nehisi. It's not the Gawande piece or the DeSantis letter or the bailout story that I worry about missing, but what insights those writings touch off.
Babies won't die if I don't read these things. I am fully aware of all the precious, precious insight I'm forgoing to blog at this very minute. My aversion to the "Mark all as read" button is irrational; I recognize this.
But I have a proposal that could make this all a lot less difficult.
Google, I want you to give me a button labeled "Compress into diamonds." When I click that button, spin your little algorithmic wheels and turn my reader into a personalized Memeorandum. Show me the most linked-to items in the bunch, and show me which of my feeds are linking to them. And take it a step further. You've got all that trends data that reflects the items I'm reading. Underneath the hood might very well be data about the links I click on in those posts. Use that information about me to compress my unread items into diamonds I will find uniquely wonderful.
The dirty little secret, Google, is that you barely even have to make this good. Even if the diamond-making algorithm is super-basic, all it needs to do is neutralize the psychological hurdle of the bankruptcy button. I just hate the very idea of clicking "Mark all as read." Make me a cheap promise, and I will bite.
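And the super-basic version really is only a few lines: count how many distinct feeds in your reader link to the same item, and surface the most-linked ones along with who's linking. A sketch, with an invented miniature pile of unread items:

```python
from collections import Counter

def compress_into_diamonds(unread, top=3):
    """Rank unread items' outbound links by how many feeds point at the
    same thing -- a bare-bones personalized Memeorandum."""
    votes = Counter()
    linkers = {}
    for item in unread:
        for url in set(item["links"]):        # one vote per feed item
            votes[url] += 1
            linkers.setdefault(url, []).append(item["feed"])
    return [(url, linkers[url]) for url, _ in votes.most_common(top)]

# Hypothetical unread items: each feed post with the things it links to.
unread = [
    {"feed": "Economist A", "links": ["bailout-plan"]},
    {"feed": "Economist B", "links": ["bailout-plan", "desantis-letter"]},
    {"feed": "Culture blog", "links": ["desantis-letter"]},
    {"feed": "Comics", "links": ["xkcd-555"]},
]

for url, feeds in compress_into_diamonds(unread, top=2):
    print(url, feeds)
```

The personalization step -- weighting by which links you actually click -- is where Google's trends data would come in, but even this crude count clears the psychological hurdle.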
March 16, 2009
Snark by Snarkwest: Bruce Sterling Speaks
Augmented Reality Toys
This whole theme is particularly poetic because it plays on what's already magic about kids and toys: There is so much happening that an observer can't see. In a very real sense, toys are already surrounded by layers of augmented reality. But the technology that powers it isn't fancy goggles; it's just imagination.
I remember playing with Transformers and other assorted robots as a kid and being impatient for the "toy fugue state" to kick in. Like reading a book, you know? There's a big difference between the moment after you've just opened a book -- just-reading-each-word-in-order -- and the cruising speed that comes later, when the pages have melted away and something totally different is happening with your eyes and your brain.
The same thing exactly would happen to me as I "got in the groove" of playing with toys. It was sorta like flow for kids! Does this ring a bell with anybody else? Any similar experiences?
March 15, 2009
The Ghosts in the Machine
After taking a moment to digest some of the insights from the two awesome panels this morning, this thought is still dancing in my head a bit. At one point, John Mark Josling said (in paraphrase), I want to push the idea of deepening the social aspects of software. What if Photoshop had a sandbox that could enable you to watch designers/photogs editing a photo in real-time, so you could replicate their actions later? What if Fireworks allowed you to view "ghosts" of other editors creating projects?
I'm fascinated by that notion, especially as apps like Photoshop take their place in the cloud. What if you could "follow" Quentin Shih on Photoshop Express, getting notified whenever he was editing an image, and watch his virtual ghost create art in real-time on your screen? Or watch the ghost of Kutiman splicing and editing hundreds of YouTube clips?
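One way to imagine the plumbing: a "ghost" is just a recorded log of edit operations, replayed later against your own copy of the document. A toy sketch under a made-up insert/delete action model -- nothing here reflects any real Photoshop or Fireworks API:

```python
def record(log, op, *args):
    """Append one edit action to the session log."""
    log.append((op, args))

def replay(log, doc):
    """Apply a recorded edit log to a document, ghost-style."""
    for op, args in log:
        if op == "insert":
            pos, text = args
            doc = doc[:pos] + text + doc[pos:]
        elif op == "delete":
            pos, n = args
            doc = doc[:pos] + doc[pos + n:]
    return doc

# Record a short editing session, then replay it from scratch.
log = []
record(log, "insert", 0, "hello world")
record(log, "delete", 5, 6)
record(log, "insert", 5, ", ghosts")
print(replay(log, ""))   # -> hello, ghosts
```

Stream that log with timestamps instead of replaying it all at once and you have the real-time "watch the designer work" experience; the hard parts are capturing rich app actions and merging them with a document that's changing under you.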
This gets back to Robin's notion of the emerging "public artist." It also ties in with my argument about the responsibility of journalists to encode into their work information about how to replicate that work.
What Are People Doing In the Cloud?
Matt's experience at South by Southwest suggests that a lot of the big social networking companies actually don't have (or won't share) a whole lot of insight into what their users are doing online, or how it's changed their lives. But is this because their systems are too simple (they just host/carry what other folks are doing) or too complex (too much information, too much noise -- they can't monitor it all)?
Clive Thompson's new article on netbooks and cloud computing suggests that it might be a little bit of both:
In The Innovator's Dilemma, Clayton Christensen famously argued that true breakthroughs almost always come from upstarts, since profitable firms rarely want to upend their business models. "Netbooks are a classic Christensenian disruptive innovation for the PC industry," says Willy Shih, a Harvard Business School professor who has studied both Quanta's work on the One Laptop per Child project and Asustek's development of the netbook...
A really powerful application like Adobe Photoshop demands a much faster processor [than a netbook's]. But consider my experience: This spring, after my regular Windows XP laptop began crashing twice a day, I reformatted the hard drive. As I went about reinstalling my software, I couldn't find my Photoshop disc. I forgot about it—until a week later, when I was blogging and needed to tweak a photo. Frustrated, I went online and discovered FotoFlexer, one of several free Web-based editing tools. I uploaded my picture, and in about one minute I'd cropped it, deepened the color saturation, and sharpened it. I haven't used Photoshop since...
It used to be that coders were forced to produce bloatware with endless features because they had to guess what customers might want to do. But if you design a piece of software that lives in the cloud, you know what your customers are doing—you can watch them in real time. Shirazi's firm discovered that FotoFlexer users rarely do fancy editing; the most frequently used features are tools for drawing text and scribbles on pictures. Or consider the Writely app, which eventually became the word processor part of Google Docs: When Sam Schillace first put it online, he found to his surprise that what users wanted most was a way to let several people edit a document together.
I'm really fascinated by this idea -- little companies with serious chops doing simple things (whether building netbooks or cloud apps) that users actually need and want. It's like the Unix philosophy expanded to clients and hardware!
This is actually one reason why (unlike Clive) I'm a little down on the idea that the web browser will just become the do-everything client that interacts with every cloud service. (Thompson writes, "I wrote this story on a netbook, and if you had peeked over my shoulder, you would have seen precisely two icons on my desktop: the Firefox browser and a trash can. Nothing else.")
The problem is that while doing everything most of us want to do over the web is possible, doing it all in the web browser isn't very satisfying. It means that every time I'm trying to do a specific task, I've got a whole bunch of stuff I don't need -- bookmarks to other sites, browser extensions I'm not using, link-and-click interfaces that aren't optimized for this specific task. You can solve this a little bit with Flash or AJAX interfaces, but it doesn't seem like a very good trade if I'm trading a bloated client app filled with tools I don't need for a bloated browser with tools that aren't even relevant to what I'm doing.
Especially on the smaller screens of phones or netbooks, we need interfaces that allow us to focus totally on what we're doing, without extra junk getting in the way.
This is why I actually prefer (with some limitations) the iPhone interface to the netbook's -- lots of little web-capable clients that just work with one cloud service, or one KIND of cloud service. You can see this already in dashboard gadgets and little tray apps like Twitteriffic, Dropbox, or Skitch -- that ideally don't interact with your browser at all. This is also why I'm more sanguine about a netbook that's more like an oversized iPhone than a shrunk-down laptop.
There's obviously room for compromise -- e.g. do you want/need a hardware keyboard or a software one, or are you willing to trade it for extra screen -- that will be similar to the kinds of hashing out we did with PDAs and early laptops before that. But it is going to change -- and we'd better all start paying attention to the folks at these little companies (plus smart observers) who actually know what's happening and why.
March 14, 2009
Snark by Snarkwest: Bite-Sized Info for a Hungry World
Snark by Snarkwest: Emerging Trends of Mobile Technology
March 12, 2009
If Robin Had Invented Language
I just ran across Siftables, another Media Lab concept that doesn't suggest any immediate practical applications, but sent my imagination on a little trip. (The closest it got to a destination was this thought: "Wow, our kids are going to have even cooler toys than we did.") "Siftables" lacks poetry, though. Might I recommend "Robinblox" or "Roblox"?
The Chinese Written Character as a Medium for Typing
Increasingly, Chinese people don't actually have to write (rite? right?) out these characters by hand. More and more, they key them in with mobile phones or at computers. And when they do that, it's just as easy to 'write' a traditional-style, complex, information-dense character as a streamlined new one. (Reason: you key in clues about the character, either its pronunciation or its root form, and then click to choose the one you want.) So -- according to current arguments -- the technology of computers and mobile phones could actually revive an important, quasi-antique style of writing.
Hmm -- Fallows is definitely one-up on me, since he reads Chinese and I don't, but I wonder whether other considerations (e.g. screen size and corresponding size of characters) might still put some pressure towards some kind of simplification of the character form. A lot of that information-density just turns into noise if it has to be packed into a tiny space.
Alternatively, kids (it's always kids, at first) might start using "abbreviations" that minimize the number of keystrokes required to type useful phrases -- maybe by not choosing the precisely "correct" character but an approximation of it (the root or a related pronunciation or whatever), like our "lol," "brb," "btw," etc.
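The input mechanism Fallows describes -- key in a phonetic clue, then pick from a candidate list -- can be sketched as a toy IME lookup. The candidate table below is invented for illustration; real input methods rank candidates by frequency and context:

```python
# Toy sketch of phonetic character input: type a pinyin clue, then select
# from the candidates. (Candidate table is made up; real IMEs are smarter.)
CANDIDATES = {
    "ma": ["妈", "马", "吗", "码"],   # one pronunciation, many characters
    "shu": ["书", "树", "数"],
}

def key_in(pinyin, choice):
    """A complex character costs the same keystrokes as a simple one:
    the clue plus one selection."""
    return CANDIDATES[pinyin][choice]

print(key_in("ma", 1))   # picks the second candidate for "ma"
```

This is why simplification saves no keystrokes here: the typing cost is in the clue and the click, not in the strokes of the chosen character.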
In short, technology rarely has a purely stabilizing effect on tradition -- it might help block a particular chirographic attempt at reform/revolution, but only to displace it in favor of its own matrix. (And yes, I just quoted Spock from The Wrath of Khan.)
March 11, 2009
Technologies have a social dimension beyond their mere mechanical performance. We adopt new technologies largely because of what they do for us, but also in part because of what they mean to us. Often we refuse to adopt technology for the same reason: because of how the avoidance reinforces, or crafts our identity.
Most of Kelly's article focuses on tool cultures among Highland tribes in New Guinea, but Kelly's also recently written about technology adoption among the Amish -- which is, of course, unusually explicit about the relationship between technology and group identity.
I'm not sure about this hedge, though:
In the modernized west, our decisions about technology are not made by the group, but by individuals. We choose what we want to adopt, and what we don't. So on top of the ethnic choice of technologies a community endorses, we must add the individual layer of preference. We announce our identity by what stuff we use or refuse. Do you twitter? Have a big car? Own a motorcycle? Use GPS? Take supplements? Listen to vinyl? By means of these tiny technological choices we signal our identity. Since our identities are often unconscious we are not aware of exactly why we choose or dismiss otherwise equivalent technology. It is clear that many, if not all, technological choices are made not on the technological benefits alone. Rather technological options have unconscious meaning created by social use and social and personal associations that we are not fully aware of.
But aren't these choices still deeply social? Partly it's about access: if you don't have daylong access to the web (or access to the web at all) you ain't twittering, son. But you're also not likely to do it if your friends and coworkers and neighbors don't twitter. Group identity is a lot more complex in the modernized west, sure -- but pure individual choice it ain't. In fact, our adoption of technology actually helps us form new groups and social identities that are not quite tribal/ethnic -- or it helps us reinforce those bonds.
P.S.: My title, "tool culture," isn't from Kelly's article, but from paleoanthropology. One of the things I love about the study of groups like the Neanderthals is that we have evidence of their tool use long after we have fossilized remains. We can actually distinguish between Neanderthal and human settlements based on their tools.
Neanderthals and Homo sapiens definitely coexisted. People aren't sure whether Neanderthals interbred with modern humans or not, which makes it hard to know when exactly the Neanderthals died out. Wouldn't it be interesting, though, if a group of anatomically modern humans adopted Neanderthal tools? That technologies could reach not just across ethnicities, but across species as well?
File under: Object Culture, Society/Culture, Technosnark
March 5, 2009
The Joy of Paper Tape
There are so many reasons to enjoy Maximum PC's "Computer Data Storage Through the Ages -- From Punch Cards to Blu-Ray," but I like the way it relates the technologies to the broader culture. For instance:
Elvis Presley, Buddy Holly, and magnetic tape all rose to prominence in the 1950s, and it was the latter that helped shape the recording industry. Magnetic tape also changed the computing landscape by making long-term storage of vast amounts of data possible. A single reel of the oxide coated half-inch tape could store as much information as 10,000 punch cards, and most commonly came in lengths measuring anywhere from 2400 to 4800 feet. The long length presented plenty of opportunities for tears and breaks, so in 1952, IBM devised bulky floor standing drives that made use of vacuum columns to buffer the nickel-plated bronze tape. This helped prevent the media from ripping as it sped up and slowed down.
Likewise, audio quality of cassette tapes improved, "ushering in the era of boom boxes and parachute pants (thanks, M.C. Hammer)." And "the floppy disk might one day go down as the only creature as resistant to extinction as the cockroach."
But my favorite digital storage media, hands-down, is paper tape:
Similar to punch cards, paper tape contained patterns of holes to represent recorded data. But unlike its rigid counterpart, rolls of paper tape could feed much more data in one continuous stream, and it was incredibly cheap to boot. The same couldn't be said for the hardware involved. In 1966, HP introduced the 2753A Tape Punch, which boasted a blistering fast tape punch speed of 120 characters per second and sold for $4,150. Yikes!
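The figures in those excerpts invite some back-of-envelope arithmetic, assuming standard 80-column punch cards (one character per column):

```python
# Rough arithmetic on the quoted figures, assuming an 80-column punch card
# holds 80 characters.
chars_per_card = 80
tape_capacity = 10_000 * chars_per_card       # one reel's worth of characters
print(tape_capacity)                          # 800,000 characters (~780 KB)

# At the HP 2753A's quoted 120 characters per second, punching a reel's
# worth of data onto paper tape would take a while:
hours = tape_capacity / 120 / 3600
print(round(hours, 1))                        # just under two hours
```

Which makes the $4,150 price tag feel even steeper: that's a lot of money for a machine you'd babysit for hours per job.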
One thing I've always wondered about these early paper-based computer programs is whether they were copyrighted -- and whether that, in part, led to the adoption of paper. One of Thomas Edison's clever exploitations of copyright loopholes was to take celluloid moving pictures (which weren't initially eligible for copyright) and copy them onto a long, continuous paper print -- this meant that an entire feature film could be copyrighted as a single "photograph."
I also wonder if/why early computer programmers didn't use celluloid instead of paper. You can move it a lot faster than paper tape, and it's generally stronger -- except, perhaps, if you punch it with lots of little holes.
File under: Movies, Music, Object Culture, Technosnark
March 3, 2009
Well, There It Is: Kindle + iPhone
Starting Wednesday, owners of these Apple devices can download a free application, Kindle for iPhone and iPod Touch, from Apple's App Store. The software will give them full access to the 240,000 e-books for sale on Amazon.com, which include a majority of best sellers.
The move comes a week after Amazon started shipping the updated version of its Kindle reading device. It signals that the company may be more interested in becoming the pre-eminent retailer of e-books than in being the top manufacturer of reading devices.
But Amazon said that it sees its Kindle reader and devices like the iPhone as complementary, and that people will use their mobile phones to read books only for short periods, such as while waiting in grocery store lines.
"We think the iPhone can be a great companion device for customers who are caught without their Kindle," said Ian Freed, Amazon's vice president in charge of the Kindle. [emphasis mine]
Mr. Freed said people would still turn to stand-alone reading devices like the $359 Kindle when they want to read digital books for hours at a time. He also said that the experience of using the new iPhone application might persuade people to buy a Kindle, which has much longer battery life than the iPhone and a screen better suited for reading.
I think this is pretty cool, and can potentially benefit everybody -- if reading e-books on the iPhone takes off, iTunes could make a play for the market. In the meantime, it might even help them sell some iPhones -- for Apple, the money's in the hardware. Meanwhile, Amazon gets to take a crack at a bunch of readers who can now read e-books on a device that, whatever its relative limitations for reading, is one they already own.
John Gruber has a short review of the app at Daring Fireball.
As the only Kindle-less Snarkmaster, let me say this: I'd really like a freeware Kindle Reader for my MacBook. I like to read to relax, sure; but I also like to read where I do my work (a good deal of which involves reading books). I'm sure whatever prohibitions you'd wind up having to put on the books (no cut-and-pasting?) would make the experience stink. But that's a trade-off I'd be willing to accept.
Let me put forward this thesis. There will be a lot of portable digital reading devices in the near future: dedicated readers, phones and PDAs, digital paper that you can wad up and throw away, tiny projectors that can use any sufficiently bright surface. But the most important one is and will continue to be the laptop computer. People in the electronic reading business need to continue to think about how they can make that experience both better and sustainable.
And let me also advance thesis #2: Don't let the race to greater portability convince you that this is the end of the game. We need software and hardware that take advantage of BIG reading surfaces -- from the TV-sized screen in your kitchen or living room to Penn Station and the Library of Congress. We don't all always read tucked away in our own private worlds, nor should we -- sometimes reading needs to be a spectacle, on a big public wall, where you can always be dimly aware of it, where it can't ever be fully ignored.
File under: Books, Writing & Such, Object Culture, Society/Culture, Technosnark
February 22, 2009
Sometimes a New Medium Sneaks Up On You
I'd seen references to Prezi here and there -- it's billed as a new presentation tool, a way to pan and zoom through ideas instead of clicking through slides. Which sounds pretty cool but, having now used this thing, I gotta say: The potential is much bigger than that.
I haven't been this excited about a new format in a long time. The tutorial video actually gave me chills. (Pretty sure I have never typed that sentence before.)
So here's my first prezi, which is just a little anecdote laid out in space -- absolutely not a good use of the technology. But it will give you a taste of the potential.
Cross-reference this with our ongoing future-of-books discussion. Also with Scott McCloud's infinite canvas.
File under: Books, Writing & Such, Comics, Design, Media Galaxy, Technosnark
February 20, 2009
John Gruber on reducing friction between thought and expression:
Friction is a problem for software in general, not just programming languages specifically. There’s the stuff you want to do, and there’s the stuff you have to do before you can do what you want to do. People have a natural tendency to skip the have to do stuff to get right to the want to do stuff if they can get away with it. Friction is resistance. Hence untitled document windows containing hours of unsaved work — there’s an idea in your head that you want to express or explore, and the path of least resistance is to hit Command-N and just start working.
I would say that friction in this sense is a problem for a Lot Of Things in general, not just software specifically. But Gruber's take on "Untitled Document Syndrome" is a really good illustration:
Saving a document for the first time is a minor chore, but it’s a chore nonetheless. The avoidance of such a minor chore is not rational; it is neither particularly complicated nor time consuming to hit Command-S and deal with the Save dialog. But we humans are not perfectly rational. We don’t always floss our teeth. We’ll pick the burger and fries instead of the salad. We’ll have one more beer. And sometimes we just don’t feel like dealing with the Save dialog box yet so we’ll put it off.
Gruber's post is part of an ongoing "everything buckets" debate in the Mac blogosphere. It kinda boils down to a debate about writing versus reading, users versus programmers, what's smart for software vs. what's smart for hardware. In short, the eternal dilemmas.
February 10, 2009
Everything I Know About Life I Learned from My Search Engine
An intriguing aside from a long Silicon Alley Insider article:
I do wonder whether Twitter's success is partially based on Google teaching us how to compose search strings? Google has trained us how to search against its index by composing concise, intent-driven statements. Twitter with its 140 character limit picked right up from the Google search string. The question is different (what are you doing? vs. what are you looking for?) but the compression of meaning required by Twitter is I think a behavior that Google helped engender. Maybe Google taught us how to Twitter.
I'm not sure if there's enough evidence to make the claim that Google taught us how to Twitter (did it then also teach us how to text?). But I wonder what else Google might have taught us. Has the nature of our Google queries changed over time? Do we type fewer words? More? How does our use of Google compare to the first generation of search engines?
February 6, 2009
The Inevitability of Electronic Reading
Many of you have probably read John Siracusa's insightful, entertaining, and long anecdotal history of e-books at Ars Technica. Still, with Amazon set to make a big Kindle-related announcement early next week, it seems like a good time to highlight this sample:
In 2003, Apple started selling music for the iPod through its iTunes music store. Apple sold audio books as well, through a partnership with Audible. Perhaps unknowingly, Apple had just positioned itself perfectly for e-book domination.
It was all happening right before our eyes. First the device, already far past the minimum threshold for screen size and legibility, and rapidly gaining market penetration. Then the digital distribution channel, accessed via a desktop application used by every iPod owner. Then the deals with content owners—not just the independent labels or the scraps from the big table, but all the top record labels, and for their most popular content...
The e-book market was Apple's for the taking.
And then a funny thing happened: Apple never took it... The iPod sold in numbers that made the PDA phenomenon look quaint. And still Apple didn't move. No one moved. The entire e-book market was stalled.
These were the dark times for the e-book market, akin to the five years during which Internet Explorer 6 had over 90% market share and received no major updates. Here was this technology that had so much potential but was not making any substantial progress in the market because the players who were motivated to drive it forward had failed or been rendered powerless by larger forces.
January 17, 2009
I Am A Circuit Through Which Approval Flows
I'm beta-testing Windows 7, and man, the speech-recognition training program is creepy. Instead of giving you something cool to read and letting you figure out your own mistakes and corrections like Dragon NS, it puts you through your paces by reading canned lines.
- If you don't read the canned lines, the recognition app doesn't understand you. It'll just say "what was that?" until you read what you're told.
- Even if you read it perfectly, if the app is teaching you to correct a mistake, you WILL make a mistake. Which you then have to correct in exactly the way they prescribe, even if there are multiple ways to do the same thing.
- As soon as you figure out that the app isn't really listening to anything you're saying, a little pop-up window tells you that even though the screen doesn't register your words, THE APPLICATION DOES. In fact, it's using your speech patterns to program the recognition engine. So if you say "this sucks," instead of "this is awesome," it'll somehow try to figure out what accent you have where "awesome" sounds like "sucks."
- This leads to the creepiest part. The text you read is all brainwashy. It's all "speech recognition works great! I can speak faster than I can type. I really want to do this more often." I kid you not.
I should add that the speech recognition itself does work pretty well and the visual integration with the OS pays off. MS clearly thought hard about accessibility. However, they didn't think at all about personality or humor.
December 24, 2008
Kindle: The 24-Hr Take
Soooo happy I gave myself a Kindle for Christmas.
The device came in handy immediately. I'm staying with my boyfriend Bryan in Minneapolis over the holidays, and the UPS guy arrived with the package very shortly after he left for work. As soon as I left Bryan's apartment to go upstairs and sign for it, I realized the door had locked behind me, leaving me in Bryan's robe and slippers, with no keys and no cell phone. But I did have a Kindle. Which meant I had Web access. I surfed to Ask MetaFilter, found lock-picking advice, and managed to get back in. Score.
Gripes: Like everybody else, I'm not really a fan of the paging button positions. Also, when you start typing notes, they should auto-save. I've been done in a few times by the combo of these two: I'll start typing a note, then accidentally hit the back key and lose what I've written.
The "locations" concept is smart, but I wish there were more cues about where locations start and stop.
Loves: Having a virtual library is already world-changing. I never imagined how cool it would be to instantly shift between different texts as I enter different information-seeking modes. I have always been a juggler of multiple books — there are times I want to read fiction, times I want to read non-fiction, times I want to read fluff. In the analog world, this is disorienting; it's hard to pick up where I left off with one book after having read another. On the Kindle, freed from a cacophony of book darts and dog ears, this feels wonderfully natural.
It is the same sort of epiphany the iPod invoked for me. Carrying a bunch of books around at once, it turns out, is every bit as much of an experiential leap as carrying tons of music around was in 2001.
I love the way the notes I take are both integrated into the book and separate from it. I never used to take notes on books because I hated having to skim all the pages to snatch fragments of the insights that occurred to me as I was reading. Suddenly, I'm taking all kinds of notes. (This works especially well with cheesy self-help books like The Four-Hour Work Week, which require you to do all sorts of exercises.)
I love that you can read for hours and the battery bar will not budge from 100%.
I've named my Kindle "Inkless."
Update: I extra-super-duper love the fact that I can use Google Reader from my Kindle. Yes, I could do this on my phone, but this is even nicer.
December 8, 2008
File under "Wow": Adobe is working on an application called Zoetrope that allows you to quickly flip through archived web pages like you flip through pictures in iPhoto.
Google's cache has nothing on this. Basically it turns the isolated snapshots of the web we usually see into an evolving movie. What's more, it's got a feature-rich set of tools (all visually oriented) that lets you play with, reshape, and visualize what you find. The video showing it off is pretty amazing.
So you can:
1) See how web pages change over time;
2) Isolate just some data or images from those web pages;
3) Do statistical correlations from that data;
4) Plot it to another app.
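Steps 1 through 3 above amount to extracting one value per archived snapshot and relating it to time. Here's a minimal sketch of that idea (the toy HTML snapshots and the price field are invented; Zoetrope itself does all of this visually):

```python
# Sketch of Zoetrope-style analysis: given archived snapshots of a page,
# pull one value out of each and correlate it with time.
import re

snapshots = [  # (day, archived page source) -- invented sample data
    (1, "<span class='price'>19.99</span>"),
    (2, "<span class='price'>18.49</span>"),
    (3, "<span class='price'>17.25</span>"),
]

days, prices = [], []
for day, html in snapshots:
    match = re.search(r"class='price'>([\d.]+)<", html)
    if match:
        days.append(day)
        prices.append(float(match.group(1)))

def pearson(xs, ys):
    """Plain Pearson correlation, no external libraries."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

print(prices)                 # the isolated data series
print(pearson(days, prices))  # close to -1: price falling over time
```

Step 4 -- piping the result to another app -- is then just serializing that series.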
December 4, 2008
The Internet Is An All-In-One Machine
Kevin Kelly, one of Snarkmarket's many intellectual crushes, now has a downloadable PDF of his "Better Than Free" manifesto available through Change This, which is like Revelator's brainy futurist cousin.
Here's how "Better Than Free" starts:
The Internet is a copy machine. At its most foundational level, it copies every action, every character, and every thought we make while we ride upon it. In order to send a message from one corner of the internet to another, the protocols of communication demand that the whole message be copied along the way several times.
True! But the internet isn't just a copy machine; it's an all-in-one machine! Sure, we might mostly be using it to make copies, especially in the blogosphere. But when it's not all jammed up, this machine of ours can really do a whole lot more.
Change This is a great example. Sure, they could just copy the text of Kelly's manifesto, or point to it with a link. But instead they've taken the time to add value by giving that text a new, physically rich form. In other words, they've printed it -- taking that text and creating a well-designed document.
And what about Wikipedia? Sure, a lot of those entries are just auto-generated from old Britannicas, or cut-and-pasted from fan sites and news articles. But a whole lot are patiently entered in by devotees, translating either from the offline analog world or from one language or context into the new, universal encyclopedia. It's the same impulse that leads people to track down old TV commercials or bootleg alternate endings to movies. It's what prompts them to track down the genuine text for obscure interviews between George Bernard Shaw and an Islamic mystic. These are the digital humanists, the scanners -- in this case freeing media from their old physical form before it can bounce around on the copying web.
Because ultimately, the web is really about faxing -- broadcasting your content to the world. The ability to freely copy, scan, or print would just be an exercise in narcissism if there wasn't a way for that message to reach a receiver, whether anonymous or known to us. This is what YouTube does, what Facebook does, and (yes) what blogging does -- it creates that electronic chain between sender and recipient, only in all directions, like light itself.
If you want to be more than a copy machine, you have to do at least one of these, and do it well.
December 3, 2008
I Am A Robot. Can I Help You?
Microsoft is working on a robot receptionist.
Also from Network World's slideshow: The project's code name is "Robot Receptionist."
And "What It Is: A Robot Receptionist."
December 1, 2008
Lifehack of the Month: Bookmark to Wordpress
I was sharing this technique with participants in a blogging seminar I'm teaching this week*, and I thought y'all might find it interesting.
This is a trick for making link-blogging even slicker than posting to your blog from Del.icio.us. Even if you're not interested in link-blogging, some of the steps might be useful from an info-management perspective.
2. Install the glorious Foxmarks. Enjoy seamless access to your bookmarks and passwords from any of your Firefox-enabled computers, complete with robust controls over which computers can access what. Feel free to import your Del.icio.us bookmarks. You won't be needing that service anymore (unless you require feeds for each of your tags). Mwa ha ha.
3. I've also installed Ex Bookmark Properties, which enables you to edit a bookmark's description from Firefox 3's default bookmark properties dialog.
4. Use Foxmarks to share as many bookmark folders as you desire.
5. Pipe the feeds from any shared folders into Wordpress using the FeedWordpress plugin.
Done. Linkblogging is now as easy as bookmarking in Firefox. Links will post to your blog after Foxmarks syncs your bookmarks and FeedWordpress fetches the Foxmarks folder.
I used this technique to make a quick-n-dirty linkblog for the seminar.
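Steps 4 and 5 boil down to feed-to-post conversion. Here's a minimal sketch of what the FeedWordpress step does with a shared-folder feed (the sample feed and field names below are invented for illustration):

```python
# Sketch of the pipeline's last step: parse a bookmarks RSS feed and turn
# each item into a (title, link, note) post. Sample feed is invented.
import xml.etree.ElementTree as ET

FEED = """<rss version="2.0"><channel>
  <item><title>Siftables demo</title>
        <link>http://example.com/siftables</link>
        <description>Media Lab blocks</description></item>
  <item><title>Prezi</title>
        <link>http://example.com/prezi</link>
        <description>Zooming presentations</description></item>
</channel></rss>"""

def feed_to_posts(xml_text):
    root = ET.fromstring(xml_text)
    return [
        {"title": item.findtext("title"),
         "link": item.findtext("link"),
         "note": item.findtext("description")}
        for item in root.iter("item")
    ]

for post in feed_to_posts(FEED):
    print(post["title"], "->", post["link"])
```

Once the feed exists, everything downstream is mechanical -- which is exactly why bookmarking becomes the whole workflow.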
* Yes, I realize I'm the least prolific blogger in the blogosphere, and the only reason I can even cling to that title is that I've got excellent blogmates. Never mind any of that. I intend to more than make up for my infrequency by employing this trick to great effect once the Wordpress switchover is complete.
November 24, 2008
Processing has been hugely important to me -- it's basically what transitioned me from a non-programmer to hacky programmer. (It's progress.) Processing's forum is the first one I've contributed to since the Prodigy video game boards in 1992. (And these contributions are, er, a little more thoughtful.)
And seeing what others have done with Processing has bent my ambitions around (going for sort of a gravity well vs. photons analogy here) and set me on a track towards things like generative media. In fact, I think it's fair to say that Processing changed my life. Whoah -- I don't think I'd even had that thought before just now. Heavy! True!
So thanks Casey, thanks Ben, and thanks to everybody who's contributed code, time, expertise and explanation. It's a brilliant, broad-spirited project, and I'm delighted to see it flourish.
November 22, 2008
Speaking of Kevin Kelly, I had basically taken for granted that one of us had already posted his call for more visions of the near future, given our recent spate of near-futurism. It appears no one had. Well, that's fixed.
November 9, 2008
Control Browser Refreshing
After the ABC News site auto-reloaded the page three times while I was trying to watch an 18-minute segment from This Week, I went hunting for a way to make Firefox prevent this. Fortunately, it's wonderfully easy. Go to about:config, bypass the warning message, and look for "accessibility.blockautorefresh." By default, this is set to false. Set it to true, and Firefox will prompt you for approval whenever a site tries to refresh itself.
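If you'd rather set this once and keep it across reinstalls, the same preference can go in a `user.js` file in your Firefox profile folder (note the pref name uses a dot between the two parts):

```javascript
// In your Firefox profile's user.js, applied at every startup;
// same effect as flipping the pref by hand in about:config.
user_pref("accessibility.blockautorefresh", true);
```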
If you're wondering why so many sites auto-refresh these days, it's basically a cheap and easy way to inflate our pageview counts. What we tell you, of course, is that if you keep the site open in a tab while you click away, we want to make sure you see the freshest content when you click back. I strongly suspect if that were really our primary motive, we'd find a way to update our pages with AJAX, thereby preventing a severely annoying disruption of the site experience.
June 24, 2008
More Is Different
I quite enjoyed the Wired cover story this month, which begins by arguing that a surfeit of data is rendering the notion of scientific modeling basically obsolete, and continues by walking through several ways in which this phenomenon has manifested itself out in the world. I especially enjoyed this mini-essay about the Europe Media Monitor, which looks like a useful potential news source to scan to see what the world is talking about. You can see, for example, that it identified the pre-election violence in Zimbabwe as the biggest story of the day yesterday, and pulls together reports from all over the global press on the subject.
June 19, 2008
My Brain Is Changing (Put That Phone Away)
It made its rounds last week, but I only just got to Nick Carr's Atlantic article on Google, brains, books, reading, and thinking.
I liked it a lot, and I think his central premise -- that using the web so much, for so long, is changing the way our brains work -- is correct. As with every kind of change like that, it's a mixed bag: terrific in some ways, awful in others.
Generally of course I'm a fan of the web way of thinking, but as it has wormed its way into the walking, talking physical world -- mostly via mobile phones but also via laptop if you're in an office like I am -- it's started to freak me out.
William Gibson's got this line that goes something like: "Our descendants are going to think it was quaint that we distinguished at all between the virtual and the real." And, oh man, to sit around a table at a bar these days, with people phasing in and out to flip open phones and tap text messages to invisible companions -- it's here. For a certain kind of person in a certain kind of place, the virtual suffuses the real and sits alongside it.
And jeez, it's distracting!
We're in this odd phase now where technology far outpaces manners and mores, so I think part of the problem is just that nobody knows how to act. (I am no paragon here; I paw my phone for texts, tweets, emails, alerts, and who-even-knows-what as much as anybody else, and mine doesn't even have a swooshy touch-screen or anything.)
I'm sure we'll develop better instincts for this stuff. Or get used to it. Or both.
But either way, it's a pretty special thing for this to be happening so quickly (it's happening quickly, right?) and to be so aware of it: to see the texture of our inner and outer lives warp and change before our very eyes.
June 2, 2008
So, not to completely nerd out on you, but this is neat:
Hard problem: This whole "cloud computing" thing requires that you be able to communicate in two directions with lots of machines at once: tell them what to do, yes, but also check to see what they're up to, and if they're still running at all.
Fun solution: Why not just treat them all like chat clients and use Jabber?
That's an oversimplification, but I just love the idea of essentially managing a complex computing cluster via glorified IM. Here are the details from Ezra Zygmuntowicz.
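The pattern can be sketched without a real XMPP stack: treat each machine as a chat client that answers presence pings and status queries, and the controller's roster works like an IM buddy list. Everything below is a toy stand-in for what Jabber would carry over the wire:

```python
# Toy illustration (not real XMPP): each cluster node is a "chat client"
# that answers presence and status messages.
class Node:
    def __init__(self, name):
        self.name = name
        self.online = True
        self.job = "idle"

    def receive(self, message):
        if message == "ping":
            return "pong" if self.online else None
        if message == "status?":
            return f"{self.name}: {self.job}"

class Controller:
    def __init__(self, nodes):
        self.roster = nodes   # like an IM buddy list

    def presence(self):
        """Who's online? The 'are they still running at all' check."""
        return {n.name: n.receive("ping") == "pong" for n in self.roster}

    def broadcast(self, message):
        """Ask every reachable node what it's up to."""
        return [n.receive(message) for n in self.roster if n.online]

cluster = [Node("web1"), Node("web2"), Node("db1")]
cluster[1].online = False          # web2 drops off the buddy list
cluster[2].job = "compacting"

ctl = Controller(cluster)
print(ctl.presence())              # web2 shows as offline
print(ctl.broadcast("status?"))    # status replies from the live nodes
```

The appeal of using actual Jabber for this is that presence, rosters, and message routing come for free from the protocol.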
April 3, 2008
Templated Creation Wizards
Couple newish websites make it easy to make formerly complicated things:
1) BitStrips offers a surprisingly robust tool for making comic strips. Fell in love with it a little at first, but the honeymoon's kinda wearing off. Why can't I save strips as drafts? Why don't I have access to *all* the characters other users have made public? Why can't I make characters based on those characters?
2) AniMoto makes wonderfully kinetic automatic slideshows from your images, synced to a song of your choosing. You can then export the slideshows to YouTube, or dispense with them as you please.
Oh yeh, and also: This has nothing to do with templated creation, but Lifehacker's talking about the best IM clients. Pleasingly, I see they've chosen Digsby, which I've been meaning to blog about forever. Digsby is my *jam*. It connects not only to your IM service of choice, but to your Gmail, Facebook, Twitter, and a host of other social apps. And it's got a slick, freakishly customizable interface. And it's fresh out of a private beta, so developers are polishing it up more every day.
March 25, 2008
But Can It Vacuum My Floor
Forgot where I ran across this, but I was reminded today of the typeface Champion Script Pro, "the most advanced and powerful script ever made. Developed over a period of two and a half years, each one of the 2 weights is loaded with 4253 glyphs (now 4280 glyphs)." What does that mean? It means the typeface is programmed to dynamically adjust glyphs to complement each other in a given word. All for just €175.
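That dynamic adjustment is roughly what an OpenType contextual-alternates table encodes: substitution rules that swap in a different glyph for a letter depending on its neighbors. A toy sketch, with invented rules and glyph names:

```python
# Toy contextual substitution: pick an alternate glyph for a letter based
# on the letter that follows it. (Rules and glyph names are invented;
# a real font stores thousands of these in its OpenType tables.)
RULES = {
    ("o", "n"): "o.swash",   # 'o' before 'n' takes a swash form
    ("t", "t"): "t_t.lig",   # first 't' of a double-t joins a ligature
}

def shape(word):
    glyphs = []
    for char, nxt in zip(word, word[1:] + " "):
        glyphs.append(RULES.get((char, nxt), char))
    return glyphs

print(shape("onto"))   # the 'o' before 'n' gets its swash alternate
```

Multiply that lookup by 4,000-plus glyphs and you start to see where the two and a half years went.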
February 4, 2008
Just Because We Can ...
danah boyd writes a typically thought-provoking post on the prospect of exposing users' "Social Graphs," a meme that's been heating up recently. Quick backstory in case you didn't know: Google and a bunch of techy types want to make it so you can easily port your identity and contacts to any application on the Web. The advantages include easier sign-ups for different Web applications, no longer having to maintain the same information in a bunch of different places, quickly finding any contacts who are using an application you just signed up for, etc. Those of us with MySpace/Facebook/Friendster/LinkedIn/Flickr/vita.mn/etc. accounts are planning to be, for the most part, happy.
But danah makes the good point that those stumping for this move are all tech-savvy people who mostly have no idea of what the repercussions will be for some of the most vulnerable heavy users of the Web -- teens. A typical argument in favor of more open data refers to what Tim O'Reilly calls "security by obscurity" -- i.e. we have the illusion we're secure just because all our data is usually tucked out of the way, but this is patently false, as any reporter could tell you. Exposing public data more commonly means fewer people will harbor this false sense of security, ostensibly making them more directly conscious of how they manage their personal data. But as danah points out, it could be an awfully risky way to make a point.
January 31, 2008
Things points to the fascinating idea of the "virtual cable" for driving directions in cars. There's been a lot of recent buzz about projecting data on car windshields. The virtual cable is a three-dimensional line drawn onto the road ahead showing you exactly where you're going. Trippy, probably distracting, but nonetheless fascinating.
January 23, 2008
Adrian, Wilson and co. have launched Everyblock, a mashup of several information sources down to the block level for different cities (currently Chicago, New York and San Francisco). The site is very pretty, especially the maps, and as you would expect, there's fun data hidden beneath every click. But it's otherwise hard for me to evaluate how cool it is, since I don't live in any of the included cities. How about it, residents?
Update: One surprise ... no RSS feeds? (Except this one.)
January 4, 2008
Astroturfing: Always Bad; Usually Obvious
"Astroturfing is a neologism for formal public relations campaigns in politics and advertising that seek to create the impression of being spontaneous, grassroots behavior."
For example, say you founded a non-profit dedicated to vetting charity organizations and grading them on their effectiveness. Your org is attracting some high-profile attention, but you're hankering for more. So you create accounts on a few well-trafficked websites. First, you pose as a naïf, adrift in a galaxy of charities, desperately seeking guidance. Then, under different accounts, you guide your little sockpuppet and any other interested parties right to your org. Step three, profit. Right?
Right, unless you attempt your ruse at the wrong site, where the users are savvy enough to see right through your act and call you out. Now, your follies are on Digg and everywhere for all the world to see, and no amount of groveling will make amends. For shame.
I have to deal with minor astroturfing all the time on vita.mn (and pretty ridiculous astroturfing occasionally), and it's always a forehead-slapper. It's generally easy to spot, no matter how clever the offending party seems to think s/he is, and it cultivates a heaping mess of ill will. If you ever have the urge to misrepresent yourself online in a manner you think will advantage your company, don't do it. You will be found out, and it will be very unpleasant. Your exploits may even be exposed in New York Magazine. Just remember this mantra -- "Astroturfing makes an ass out of" -- never mind, just don't do it.
File under: Snarkpolitik, Society/Culture, Technosnark
December 10, 2007
Needed: a term for when your phone makes calls to random entries in your address book of its own volition, usually as a byproduct of unintentional button-mashing. Somehow, my phone intuits the romance/dating-related entries and goes straight for them. It's particularly enamored of one of my exes, which can be awkward. But not as awkward as the time it sent a discouraged suitor of mine five copies of a text message to a friend describing what I was going to wear that night.
I understand that keyboard lock (and probably looser jeans) would mostly solve this problem. But until I decide whether those are sacrifices I'm willing to make, I need something to describe this phenomenon. Ghost-dialing?
December 7, 2007
"The iPod Moment"
The Kindle/iPod comparison keeps coming up, usually in service of the point, "Amazon, don't flatter yourself." Which I think is fair. But in reading all this talk about the "iPod moment" for books, I feel as though I have a completely different notion of what that moment meant for music. Sure, on the face of it, Apple's innovation was a tiny-but-capacious music player that allowed us to carry our music library everywhere we wanted. But wasn't the deeper surprise/lesson of the iPod that Apple had essentially invented a need where none had formerly existed?
When I remember 2001, I remember Apple launching a device that garnered some admiration for its technical savvy, but whose price and function drew something of a raised eyebrow from critics. "'Breakthrough digital device' might be pushing it," wrote David Pogue, in his review of the first iPod. ("Apple, don't flatter yourself.") Meanwhile, the first New York Times mention of the device was hardly breathless. The article quoted three people. The first was a Gartner analyst, who said, "It's a nice feature for Macintosh users ... but to the rest of the Windows world, it doesn't make any difference.'' The second was Steve Jobs, who was paraphrased as "disputing the concern that the market was limited, and said the company might have trouble meeting holiday demand. He predicted that the improvement in technology he said the iPod represented would inspire consumers to buy Macintosh computers so they could use an iPod." The RIAA declined to comment, and another analyst simply said, ''This raises the bar." The one actual description of the iPod in the article called it a "hybrid of existing products." The article included an estimate that the size of the market for all digital music devices would be 18 million units by 2005.
I remember this muted enthusiasm pretty clearly because I was one of the skeptics. What could be so impressive about a portable music player? The Walkman's been around almost as long as I have. Storage size? Honestly? What need could I possibly ever have to carry my whole music library around with me? How much music can I listen to at one time?
32 million iPods were sold in 2005. That's not even counting other digital music devices. This year, the 100-millionth iPod was sold. Clearly there was a market need here for a vast mobile music library that most of us were blind to in 2001.
I now have three iPods.
When folks talk about Kindle doing (or not doing) for books what the iPod did for music, they usually seem to mean creating a tiny-but-capacious e-book reader that allows us to carry our library everywhere we want. But I don't think Bezos et al. are aiming at that at all. I suspect they're trying to create something we didn't know we needed. A leap of imagination so bold, it could only seem obvious in hindsight. Jury's still out on whether or not they succeeded.* But I'm wonderfully excited by the possibility that I could one day encounter something that just transforms my notion of what a book can be.
* Personally, I felt for the Kindle the murmur of a tug I hadn't yet felt for any other digital reading devices, although not strong enough to win me over.
File under: Books, Writing & Such, Technosnark
October 23, 2007
Universal Computing in Two States and Three Colors
As previously noted, I couldn't hack Stephen Wolfram's big book but I like his way of thinking. This new post from his blog is fun and fascinating. It's about a 20-year-old kid who met a challenge Wolfram set out earlier this year -- with a $25,000 reward attached. Good (if esoteric) reading.
The general concept of "discovering" solutions vs. engineering them seems fairly profound, yeah?
October 4, 2007
The Large Hadron Collider Is, In Fact, Large
Normally not a huge fan of QTVR but these panoramas of the Large Hadron Collider are unbelievable. The color palette in particular is so pleasingly industrial-primary.
September 17, 2007
Over at Steven Talcott Smith's blog, tales of non-programmers writing software. Some really fun stories in there, all of which I am entirely sympathetic to, as someone who a) admittedly does not have The Knack for programming but b) really enjoys it anyway.
And besides, knack or not, I think it's on its way to becoming a new required literacy. Sure sure, computers will get easier to program, and the gap between our intent and their instructions will close as they scootch our way -- but you'll still have to learn to think procedurally, to think in terms of objects or messages or other computer-y things.
And you'll have to learn what && means. You always end up having to learn what && means.
September 10, 2007
35 Years, 10 Seconds
Time-lapse video of Tokyo's skyline. It's crazy. The progress looks cartoony and alien... almost insectile!
Via Long Views.
September 4, 2007
We've Been Stuck With Violins for Centuries
Given how many hours I spent with a crappy Casio keyboard, I'm pretty sure 12-year-old Robin would never have come out of his room if he'd had one of these.
What's cool about it is not the synthesis (which is kinda boring), and not even the interface per se, but rather the interface in a physical context -- all those buttons! How can you not want to monkey with it?
August 26, 2007
The Motion of Motion
- Select video, e.g. "Run Lola Run."
- Display thousands of copies of said video on a gigantic wall-spanning video matrix, each offset from its neighbor by a single frame.
The patterns that emerge out of different kinds of motion in the movie, and different kinds of cutting, are pretty nutso.
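A minimal sketch of the offset scheme, assuming a row-by-row layout across the grid (the installation's actual tiling order is my assumption):

```python
# Sketch: which frame each tile of the video wall shows at time t.
# Each tile lags its neighbor by exactly one frame, scanning row by
# row across the grid; frames before the start clamp to frame 0.

def tile_frames(t, rows, cols):
    """Return a rows x cols grid of frame indices displayed at time t."""
    return [[max(0, t - (r * cols + c)) for c in range(cols)]
            for r in range(rows)]

grid = tile_frames(t=5, rows=2, cols=3)
# The top-left tile shows the current frame; each subsequent
# tile is one frame behind, so motion ripples across the wall.
```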
August 24, 2007
All You Need is the Cloud
Half way into the flight, after responding to about a half hour's worth of e-mail, my laptop hard disk crashed. [...]
On the plane and afterwards in my Vancouver hotel room, I went through the predictable stages of grief that accompany data loss. First you assume that the problem is software and then after employing several disk utility programs you begin to realize that you are really in the soup.
How was I going to write the three articles I had promised without a computer?
[...] I considered a number of stopgap measures. There was the possibility of asking the paper to ship out a replacement laptop overnight. (My call to the paper's computer support hotline was answered three or four days later). And there was the possibility of resorting to the hotel's $15-an-hour business center.
Then, while hunting through my bag for some elusive stopgap measure, I came across a CD disk with a copy of Ubuntu Linux. A number of versions of Linux now come with a demonstration feature that makes it possible to run the program without actually installing it on a hard disk.
Inserting the disk, I was able to restart my computer, this time running Ubuntu instead of Apple's OS X version of Unix.
What I discovered was that - with the caveat of a necessary network connection - life is just fine without a disk. Between the Firefox Web browser, Google's Gmail, and the search engine company's Docs Web-based word processor, it was possible to carry on quite nicely without local data during my trip.
Seriously... I find I care about which computer I'm using less and less. This is awesome.
P.S. The NYT's Bits blog is terrific.
August 17, 2007
Deep History (in 160 Characters or Less)
Went to the Long Now talk tonight and took my parents. Unfortunately: way longer than the normal (snappy) Long Now talk. Fortunately: totally awesome subject, and loads of interesting details. The presenter was Alex Wright, an information architect. He's written a book called Glut about the history of information systems -- the deep history. Like, all the way back to bacteria.
My new habit of notetaking is to text messages to myself. Thus you can gauge the interestingness of a Long Now talk by the pile of weird short emails that's waiting for me when I get back home. Here's what I'm looking at now:
(Okay, actually, one of them is a note about a dream I remembered during the talk. I'll leave it to you to guess which.)
August 14, 2007
Make RSS Work Again
When David Weinberger talks about how effective the Internet has been at evolving sophisticated filters for processing all the stuff that's on the Internet, this is what he means. AideRSS is a godsend. It analyzes the activity around each item in an RSS feed -- Technorati hits, comments, Del.icio.us links, traffic reports, etc. -- and calculates a score for the item. It then creates four feeds from the original feed, each set to a higher activity threshold.
Example: So far today, BoingBoing has posted a liver-curdling 18 entries. I could cut that down to two entries by subscribing to the feed of what AideRSS has deemed to be BoingBoing's "best" posts. (Today, I'd be reading the obit of the fellow who could dial a phone by whistling, and a post on this "John Hughes meets George Romero" graphic novel. Among other things, I'd miss cheap plastic toys, fugly sweatshirts, a clay iPhone, and politically-themed crafting projects. Think I'd live.) If I really only want to hear from BoingBoing every couple of days, I could go for just the hits.
For those of you overloading on RSS feeds, but hoping not to miss anything big, this is totally key.
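A toy version of the scoring-and-threshold idea described above. AideRSS's actual formula isn't public, so the signal names, weights, and threshold here are illustrative assumptions:

```python
# Toy activity-based feed filtering: score each item by the buzz
# around it (comments, bookmarks, blog links), then keep only the
# items whose score clears a threshold. Weights are made up.

def score(item):
    return (item.get("comments", 0)
            + 2 * item.get("delicious_links", 0)
            + 3 * item.get("technorati_hits", 0))

def filter_feed(items, threshold):
    return [i for i in items if score(i) >= threshold]

feed = [
    {"title": "whistling phone phreak obit",
     "comments": 40, "delicious_links": 12, "technorati_hits": 9},
    {"title": "cheap plastic toys", "comments": 2},
]
best = filter_feed(feed, threshold=50)
# Only the high-activity post survives the "best of" cut.
```

Raising the threshold gives you the "only the hits, every couple of days" feed; lowering it gives you something closer to the raw firehose.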
August 12, 2007
The Thermodynamics of the Internet
Over at Wired's great Danger Room defense-tech blog, there's a post up about DARPA's new programs to monitor internet traffic even as the volume of that traffic keeps increasing exponentially:
But the Navy has been pioneering an approach, called "Therminator," which might be able to do the job a little better. It's one of a number of potential new net-defense tools that DARPA would like to see in action. The idea is to monitor the flow of traffic, rather than the individual packets. To treat it like the movement of temperature -- thermodynamics -- rather than the travels of ones and zeros. "If, all of a sudden, we see a big flow to China, we know there's a problem," Hearing says.
Yeah, I realize the whole "whoah, the net, it's like, it's like, a giant BRAIN" thing is a cliche by now, but even so, it's wild to see this weird creation begin to exhibit more and more of these macro-properties that we associate with other physical or biological systems.
July 30, 2007
Perl Is a Shinto Shrine
July 20, 2007
The Google Grid, Broadcasting at 700MHz
Google has committed to bid for wireless spectrum -- as much to influence the direction of the market as to, you know, own spectrum (or so it seems).
And, good news: The direction they want to push it is towards openness.
These days, I find myself less worried about Google's techno-titanic mastery of all data and more excited about its potential as a force for change in public policy and markets. I'm actually really glad they're getting into that game.
July 6, 2007
All I have to say about the iPhone is it sure took Apple long enough to create the wifiPod. :P
June 28, 2007
Mutation State University
From the Dept. of Alma Mater Promotion:
In the corner of a laboratory at Michigan State University, one of the longest-running experiments in evolution is quietly unfolding. A dozen flasks of sugary broth swirl on a gently rocking table. Each is home to hundreds of millions of Escherichia coli, the common gut microbe. These 12 lines of bacteria have been reproducing since 1989, when the biologist Richard E. Lenski bred them from a single E. coli. "I originally thought it might go a couple thousand generations, but it's kept going and stayed interesting," Dr. Lenski said. He is up to 40,000 generations now, and counting.
In case you glossed over it: "...have been reproducing since 1989." That is, to be clear, an 18-year-old experiment. And counting!
June 25, 2007
Brave New Biotech World
I know this has been like The Next Big Thing for a long time, but it's sort of starting to happen. From Kottke:
A company called Lifeforce has received FDA approval to store white blood cells for people as a "back-up copy of your immune system." The idea is that those pre-diseased cells could be reproduced in the lab and infused back into your body when needed to fight off infection or deal with the aftermath of chemotherapy.
Soon there are going to be little bits of us stored everywhere. I'm serious! I know! It's weird!
June 17, 2007
USB Pinkie Drive
My favorite thing in my All-Ett these days is the impossibly tiny Kingmax 2 GB Super Stick. The dimensions? 1.3" x 0.1" x 0.5". Two gigabytes. That's more capacity than my high school desktop PC. And it's made of reinforced steel or something, honestly. Best part? It costs $16.
Why hasn't this device taken over the world yet?
June 11, 2007
David Brin's Respect
Discover Magazine has a short interview up with science fiction author David Brin. They ask him how he's chalked up such a good record as a prognosticator, and this is what he says:
Peering ahead is mostly art. We all have tricks. One of mine is to look for "honey-pot ideas" drawing lots of fad attention. Whatever's fashionable, try to poke at it. Maybe 1 percent of the time you'll find a trend or possibility that's been missed. Another method is even simpler: Respect the masses. Nearly all futuristic movies and novels -- even sober business forecasts -- seem to wallow in the same smug assumption that most people are fools. This stereotype led content owners to envision the Internet as a delivery conduit to sell movies to passive couch potatoes. Even today, many of the social-net and virtual-world companies treat their users like giggling 13-year-olds incapable of expressing more than a sentence at a time of actual discourse.
Good, prescient stuff throughout.
And! If you haven't read the thrilling tale of the Streaker and her neo-dolphin crew, then by all means, do so immediately!
June 5, 2007
Threadless for Bumper Stickers
Ha! I bet you thought I was posting an actual link to a site that was, in fact, Threadless for bumper stickers. But if such a thing exists -- which it must -- I'm not cool enough to know about it. Enlighten me, o ye crowd-wisdom.
June 1, 2007
The Diamond Age Starts... Now?
Physicists just figured out how to address a single carbon-13 nucleus as a memory register -- a quantum bit -- at room temperature. That's an important distinction; all quantum memory to date has relied on freakish near-absolute-zero conditions in the lab.
P.S. It also involves lasers. Of course.
May 31, 2007
No, Not That Vista
Okay, see if you can guess what this refers to:
VistA stands as perhaps the greatest success story for government-developed information technology since the Internet itself.
Wow, right? The answer lies in Thomas Goetz's NYT op-ed.
(His blog Epidemix is subscription-worthy as well!)
May 24, 2007
Like a Giant Flower
This is beautiful: a new solar power plant in Seville. It doesn't use photovoltaics, though; instead there's a field of mirrors focusing sunlight on pipes of water. Steam = power. Simple!
No time to clip a picture, but go check it out. It's angelic.
May 21, 2007
Was just booking a flight on southwest.com and saw this:
It's not the site; it's the Real Costs Firefox add-on, which I installed a few weeks ago and promptly forgot about. Apparently this is one of its least-impressive displays; there are other examples here.
So this is really cool, right? On the web, you don't have to wait for labeling regimes to change... you can just rig up your own view of commerce.
That said, I will concede that the CO2 values shown here did not affect my purchase decision in the slightest. It was only after I bought the ticket that I was moved to calculate how much CO2 my Toyota would spew out if I drove it to LA instead: a little over 300 pounds.
May 20, 2007
Coming Soon: A Bunker With a View
May 5, 2007
A Glimpse of the Retro-Future
(Note: This blog post is essentially Jarah's blog post, with an additional layer of attribution. I love how recursive blogging can be. I'm trying to think if I've ever seen a blog post retain the entire meme trail of an item before. How awesome would it be to see "Wired via Jarah via Matt via" at the end of a post? Can you guys think of anything like that?)
April 26, 2007
Hackety Hack makes Ruby sort of like BASIC. From the fellow who brought you Why's Poignant Guide to Ruby, it's a downloadable program (basically the Ruby language, the Gecko browser, and some helpful libraries) designed to introduce geek wannabes to the world of programming. For a slightly less kid-oriented approach, check out Try Ruby, which is a browser-based version of the same thing by the same guy. (MetaFilterrific.)
April 25, 2007
Black Rim Glasses
Ethan Kaplan's blog is consistently good. Witness this post on user-generated content where he brings it around to Walter Benjamin in the end. He is a technology guy (perhaps... THE technology guy?) at Warner Brothers Records, so he straddles the line between new worlds and old in interesting ways. Worth subscribing.
April 20, 2007
A Bit of Foolscap, Talking to the Ether
Despite how dorky it looks, I am a little bit excited about this new Amazon.com e-book reader. It's almost entirely because it has high-speed wireless internet access. That's the whole point of an e-reader, I think: If I just want to tote around Harry Potter, books work fine. But if I want to tote around Bloglines... hmm!
April 16, 2007
Too... Much... Smartness
This Cornell class-blog, Info 204, is blowing my mind in a sort of cyclical combustion cycle where just as I feel like I've processed one of the posts, another one comes along and everything goes BLAM. The class covers "how the social, technological, and natural worlds are connected, and how the study of networks sheds light on these connections."
April 8, 2007
Military Jumping Beans
That is all.
April 5, 2007
Help Me Invent a Need for This Tool
Seriously, this 3D scanning/printing stuff is poised to take off. It seems super-exciting, but the problem is, I have no idea what I would actually want to make with a desktop factory. I am sure this simply betrays a lack of physical imagination on my part. Any ideas?
April 1, 2007
Doing Things No Human Could Do
Robot bricklayers! I don't know about you, but I always find these industrial robot arms hypnotic: so massive, yet so fast and so precise.
P.S. The link is to the Monocle site. I got a chance to check out the printed magazine this weekend and it is pretty awesome.
March 26, 2007
March 18, 2007
Thiago, What Are You Working on Down There?
Michigan teen makes small fusion reactor in his basement. No seriously, it's real. I'm pretty sure the greatest technical achievement of my tenure as a Michigan teen was, like, connecting to BBSes.
As long as we're talking about science: Remember the world accent quiz? Well, the results are in. The U.S. accents -- Alabama and Wis-CAHHHN-sin -- were a cinch, while the accents from Bolivia, Italy, and Morocco stumped almost everyone.
March 12, 2007
Make the Web Fun
Ask.com is doing a nice job with things these days. For instance: Here's where I spent most of this sunny San Francisco day!
March 8, 2007
Natural Social Networks
This seems smart: social networking sites run by airlines. Of course, the target isn't people like me, who always just grab the cheapest fare on Orbitz; it's business travelers, e.g. the Southwest devotees who fly from San Francisco to LA twice a week. I mean, I feel like these folks have a thin, oh-it's-you-again social network built already.
What other businesses regularly convene groups of people in the same space who might have something in common?
Here's my nomination: grocery stores! What if Whole Foods set up a social networking site? I actually think it could become like the best dating site in the world pretty quickly. Either that or the most awkward. Maybe both.
February 22, 2007
Free Multimillion-Dollar Startup Idea of the Day
A mashup that allows users to create Pop-Up Videos out of YouTube videos. You'd get acquired by Google for $25MM, easy. And it would be soooooo hott.
February 19, 2007
Riffing on an Arthur C. Clarke idea about the unpredictability of science, Kevin Kelly is musing about expected and unexpected inventions (via Infocult). Clarke actually created a chart of inventions or discoveries most scientists could have foreseen before they came about (e.g. automobiles, flying machines, telephones), and ones they couldn't have predicted (e.g. sound recording, relativity, atomic clocks). Kelly does the same thing, putting organ transplants, the cell phone, and the test tube baby in the realm of the expected, and DNA fingerprinting, radar, and artificial sweeteners in the unexpected camp.
The criterion, Kelly explains, is the "perplex the ancient" test. If Da Vinci were brought back to life, would he be utterly mystified by the technology, or would he grasp the concepts behind it?
For instance, genetically modified crops would surprise no one, because the technique is simply breeding by another means. On the other hand, the underlying concepts of DNA fingerprinting would be mysterious, magical, problematic, and take great lengths to explain. The World Wide Web is the long sought after universal library and answer machine. But virtual reality doesn't have a good analogy.

This got me wondering -- what if you tried a perplex-the-ancient test with things outside of technology? Say cultural developments, for example. What in contemporary culture might astound the savviest anthropologists of old? Would the end of privacy (great article, btw) shock Mr. de Tocqueville? Would Oscar Wilde have foreseen Who Wants to Marry a Millionaire?
File under: Society/Culture, Technosnark
February 16, 2007
This is supposedly a list of the ten biggest databases in the world. But I am suspicious: I really feel like the U.S. federal government ought to rate more of those top spots. What about Social Security? Or some sort of crazy Medicare database?
Also, could YouTube's database really be larger than, say, Visa's?
Anyway, I'm still linking to it just because I love the idea of Really Huge Databases. Any other contenders you can think of?
February 8, 2007
The Tale of Teddy Ruxpin 2.0
But in the meantime, while we thought about what sort of things the Home Server might do, I came up with the (again, patented, but the patent dropped) idea of an internet-connected teddy bear that contacts a web site to tell stories. People would tell stories to the web site, and in return for these stories, they would be paid per listener. Bear purchasers would pay a monthly subscription fee. The child would get access to every single story ever told via the breadth of the lazyweb, and the parents could configure the bear to tell only certain kinds of stories (e.g. nonviolent, child age 4-6, Jewish, with a moral message, etc. Stories would be reviewed and tagged.)

Excerpted from one of my favorite MetaTalk posts of all time. (Waxtastic.)
February 6, 2007
Search Is a Folksonomy
This is a notion that popped into my head during a discussion with our search vendor today: online search is a folksonomy. Every search a user performs could be seen as a tag she's applying to the result she ultimately clicks on. Over time, you could imagine a page featuring a tag cloud formed of all the searches that got people to that page.
Maybe that's an insight obvious to everyone but me, but it felt novel. It seems we always talk about how tags could help search (hand in hand with the discussion of how no one actually uses/understands tagging, which may not be so true); why don't we talk more about how perhaps the most common activity performed on the Internet is actually a form of tagging?
Bonus: The tag cloud you'd see if we did this for all pages on Snarkmarket would feature "snarkmarket" in giant letters, and then the following phrases, getting progressively smaller: breck girl, media galaxy, googlezon, listenings, robin+sloan, matt thompson, shipbreakers, homeless by choice, matt+thompson, by your command, giantess, media+galaxy, chicken porn, breck+girl, robin+sloan+and+matt+thompson, eminence gris, snarkmarket "this i believe", mothball fleet, "by your command".
And that would be my favorite tag cloud ever.
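The notion above can be sketched in a few lines: treat each (query, clicked page) pair as a tag application and tally them per page. The log format and field names are assumptions for illustration:

```python
# Sketch: build a per-page "tag cloud" out of the search queries
# that led users to click through to that page. Each search acts
# as a tag applied to the clicked result.
from collections import Counter, defaultdict

def build_clouds(search_log):
    """search_log: iterable of (query, clicked_url) pairs."""
    clouds = defaultdict(Counter)
    for query, url in search_log:
        clouds[url][query.lower()] += 1
    return clouds

log = [("googlezon", "/epic"), ("media galaxy", "/epic"),
       ("googlezon", "/epic"), ("breck girl", "/posts/123")]
clouds = build_clouds(log)
# clouds["/epic"].most_common() ranks the queries, i.e. the
# biggest words in that page's hypothetical tag cloud.
```

Scale the font size of each query by its count and you have the search-built tag cloud for the page.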
February 5, 2007
Old Man Minsky
[Your new book] "The Emotion Machine" reads like a book about understanding the human mind, but isn't your real intent to fabricate it?
The book is actually a plan for how to build a machine. I'd like to be able to hire a team of programmers to create the Emotion Machine architecture that's described in the book -- a machine that can switch between all the different kinds of thinking I discuss. Nobody's ever built a system that either has or acquires knowledge about thinking itself, so that it can get better at problem solving over time. If I could get five good programmers, I think I could build it in three to five years.
From a little later on:
Has science fiction influenced your work?
It's about the only thing I read. General fiction is pretty much about ways that people get into problems and screw their lives up. Science fiction is about everything else.
Also, Minsky says wistfully of the old Bell Labs: "I worked there one summer, and they said they wouldn't work on anything that would take less than 40 years to execute."
February 2, 2007
Finally, GMail for Everything!
I've been waiting for this for months, ever since I first heard Google was testing out the ability for users to manage external email accounts through Gmail. Every few weeks, I'd peek back into my settings and see if I'd been added to the group of users this feature had been rolled out to. And at last, the moment is here.
I love Google the way Winston loves Big Brother.
January 11, 2007
TIME.com has another of its great, oblique photo-essays up: Thirty years of Steve Jobs. It's actually pretty remarkable to flip through. Jobs is so quintessentially American.
November 29, 2006
The Lost Millennium
This ancient calculator is unbelievable. Second century B.C.!
Dr. Charette noted that more than 1,000 years elapsed before instruments of such complexity are known to have re-emerged. A few artifacts and some Arabic texts suggest that simpler geared calendrical devices had existed, particularly in Baghdad around A.D. 900.
Somebody I know once claimed, only half in jest, that if it wasn't for the Dark Ages we'd have landed on the moon by like 1200. Big ol' imperial space-galleys or something.
November 22, 2006
As a reporter/producer, I never had to make presentations. I told stories with images, audio, and text -- using Flash, Photoshop, Premiere Pro, Word, and the like. My first month at the Star Tribune, I found myself having to use PowerPoint. Initially disdainful, I sniffed around for a few PPT tutorials, and stumbled across this blog. As well as providing helpful tips, the blog espoused an approach to PowerPoint that helped me to see it as just another storytelling medium.
The PowerPoint I created last October still lives on in bits and pieces today, in presentations I've given all over the Twin Cities. And I always get pretty good reviews.
November 19, 2006
Like a Brain, Like a Heart
Take a look at this graph and try to tell me the internet isn't going to eventually wake up and, like, try to find other internets to play with.
November 1, 2006
A beta version of The Django Book -- a guide to the Web application development framework Django -- is being released free online, chapter by chapter. OK, nothing new there; I think it's now illegal in 38 states to write a book about technology without either blogging the writing of it or posting it under a General Public License. What's interesting is the system that the authors have cooked up for allowing comments on every paragraph. It could get totally overwhelming, if not implemented just right, but I think they've implemented it just right. Sweet.
October 29, 2006
I can't tell you what I find so incredible about it, but I spent about 45 minutes just staring at this Flash program yesterday, and I don't regret a minute of it. Turn down your speakers before you visit.
October 5, 2006
Treatise on Nihilism
P.S. I am a little embarrassed to say I bought the shirt. The rules: It will be worn only on weekends. In the confines of my apartment. While playing board games.
September 29, 2006
September 28, 2006
Google, you have just released Google Transit trip-planning apps for five new cities.
SAN FRANCISCO IS NOT AMONG THEM.
What the shiz, Goog? Do you not love your people?
I mean... TAMPA??
Update: See the comments. Who knew?
September 17, 2006
Skyscraping for 70 Years
Man, businessweek.com is the sleeper news site of the year. They consistently have cool stuff. For instance: Seven Decades of Skyscrapers, an article (and more importantly, slideshow) about the work of Skidmore, Owings & Merrill. Check out slide four, the solar telescope. Slide 10 looks to be pretty hot as well.
September 14, 2006
No Seriously, It's the WifiPod
September 10, 2006
We're Not a Film Company... We're a Flatness Company
How should Kodak save itself? Get into the laboratory-grown meat business, of course.
Come on, you pretty much have to click that link.
August 23, 2006
In The New Atlantis this month there's a review of two books on shipping containers (middle item) -- the TCP/IP packets of modern trade. (Come on, you are all blog readers out there, you know what I mean.) Somehow I find this incredibly evocative:
[...] McLean inaugurated the era of containerization on April 26, 1956 by transporting 58 containers from Newark to Houston aboard a ship called the Ideal X.
Also: It is said that the container cranes at the Port of Oakland were the inspiration for George Lucas's AT-AT walkers. It's highly plausible.
August 21, 2006
Second Life and Macromyopia
3pointD transcribes a fascinating keynote talk by Mitch Kapor at the Second Life Community Convention this weekend.
Also, he gives a name to an effect I am constantly citing:
One thing that’s very important to keep in mind is something called Macromyopia. For people who are inside a new phenomenon like Second Life, we tend to overestimate the short-term effects. We think more great things are going to happen sooner than they typically do. Conversely, we underestimate the long-term impact.
Or: In the short-term, things change slower than we expect them to. In the long-term, they change more than we ever imagined they would. Now I know what to call it!
August 20, 2006
It's the Center of the Universe, I Hear
Earth: just another failed planetary nucleus. Aww.
August 17, 2006
A Pixel the Size of Everything
Browsing the site for Ask a Scientist, a cool lecture series here in SF, I stumbled across the coolest link ever. Down in the bottom-right corner of the page, it says: "Want to get freaked out? Click here."
Go ahead, try it.
Every time I see that thing my brain folds.
August 16, 2006
This is an awesome idea: a cutting board with an integrated scale, allowing you to measure your ingredients as you slice 'em.
I've long wished that the task of measuring was better integrated into the cooking process. I've been on the lookout for a set of containers to hold my flour, rice, sugar and other dry goods, with lids that double as measuring cups. Let me know if you see anything.
August 14, 2006
News from the World of Science
Fun stuff recently on EurekAlert:
- Mercury sucks
- Social networks and problem-solving
- The virtues of intuitive eating
- Migratory birds calibrate their internal compasses at sunrise and sunset
- Slime molds are survivors
Speaking of science: Here is a depressing graph.
August 3, 2006
From chapter 4 of The Singularity Is Near:
Although we have the illusion of receiving high-resolution images from our eyes, what the optic nerve actually sends to the brain is just outlines and clues about points of interest in our visual field. We then essentially hallucinate the world from cortical memories that interpret a series of extremely low-resolution movies that arrive in parallel channels. In a 2001 study published in Nature, Frank S. Werblin, professor of molecular and cell biology at the University of California at Berkeley, and doctoral student Boton Roska, M.D., showed that the optic nerve carries ten to twelve output channels, each of which carries only minimal information about a given scene. One group of what are called ganglion cells sends information only about edges (changes in contrast). Another group detects only large areas of uniform color, whereas a third group is sensitive only to the backgrounds behind figures of interest.
"Even though we think we see the world so fully, what we are receiving is really just hints, edges in space and time," says Werblin. "These 12 pictures of the world constitute all the information we will ever have about what's out there, and from these 12 pictures, which are so sparse, we reconstruct the richness of the visual world. I'm curious how nature selected these 12 simple movies and how it can be that they are sufficient to provide us with all the information we seem to need."
July 28, 2006
I've been a bad blogger. When the site I'm working on is launched (aaaaany minute now), I'll make it up, I promise. But since I can't sleep and am up at kind of an ungodly hour, I'd like to take a moment to geek out over Google's answer to Sourceforge. Sourceforge drives me nuts. There's tons of good stuff there, but how's anyone supposed to find it? GCode is much prettier. Of course, the Googletrons say, "We really like SourceForge, and we don't want to hurt SourceForge." I say fiddle while they burn, Eric Schmidt. Fiddle while they burn! OK, back to my cave. (Waxtastic.)
July 21, 2006
The WifiPod... by Microsoft
Hey, pay attention to this Zune stuff from Microsoft. The emphasis on wifi, social networks, and maybe even gaming is interesting, and the whole thing smells more Xbox-y than Vista-y to me. (Which is a good thing.)
Prediction: J Allard will run Microsoft in ten years.
July 13, 2006
San Francisco Interactive City Summit
I think this looks pretty fun: a free, open conference about interactive cities -- e.g. ideas at the intersection of urban planning, technology, networks, media, mapping, local social networks... or whatever else you think fits. It's here in San Francisco on August 7 and 8. Sign up if you're in the area!
July 12, 2006
My Power Strip Always Makes Me Cry
Come to think of it, the power strip has needed re-inventing for quite some time now. Here's a cool new system called E-ROPE designed by students. Energy-efficient, too!
The snarky commenters clearly have not had the struggles with traditional long, uniformly-spaced power strips that I have. Arghhh.
July 6, 2006
The Happy Hive Mind
Cambrian House: anyone can submit an idea, anyone can vote for or against that idea, anyone can contribute the code/creative work to execute that idea, and the folks who do get paid.
July 5, 2006
Smack Dat Hadron
Let me just tick off the things I love about this article in Seed Magazine.
The Large Hadron Collider (LHC) currently under construction at CERN is the greatest basic science endeavor in history.
Check. Giant ominous-looking machinery?
Um... CHECK. Big goals?
All these superlatives exist for one reason: To understand the universe.
Check and mate. Seriously, even if you know the basics of the LHC (*cough* don't we all *cough*) it's worth a look -- Seed has gathered short, provocative notes from a crew of smart physicists. It's good reading.
P.S. That blue thing up above? It transforms into a robot.
June 30, 2006
It's Like a Slow Internet for Cars!
Gems of the U.S. interstate from NPR. The highways just turned 50!
June 27, 2006
Science Press Release of the Week
From the always-interesting EurekAlert! feed.
May 15, 2006
Usability Testing in Uganda
Matthew Flannery is co-founder of Kiva, one of my favorite new non-profits. On his excellent blog, he's just posted a video of a Ugandan client using the Kiva site. If you're a web designer, or at all interested in the issue of the digital divide, you should watch it. It's actually a bit harrowing.
In related news, the merchant I helped fund via Kiva just paid back 10 percent of his loan! Nice!
May 10, 2006
The amazing Jonathan Harris is at it again, having completed another super-interesting project with a fantastic interface. (Actually, a pair of them.) This time, he and his collaborator Sepandar Kamvar have outdone themselves with We Feel Fine, a Java applet that offers a peek at blogged emotions, in aggregate or as snapshots. WFF also enabled a spin-off project called Lovelines, done in Flash. Play around with these for a while, they'll awe you. (Infosthetic.)
May 8, 2006
Wired at the Walker
Thursdays you'll often find me at the Walker Art Center, cell phone at my ear, wandering from exhibit to exhibit and occasionally punching in digits as I stare at the works of art. It's because the Walker offers this pretty fantastic service called "Art on Call," which lets you listen to the curators (and often the original artists) talking about the exhibits.
Now the Walker's hatched a plan to lend visitors free iPod nanos, pre-loaded with the "Art on Call" tracks. A great idea. But what's really awesome is the thought the Walker folks have put into hacking the iPods to make them dunceproof. I love this museum.
(BTW: the Walker Channel is really a tremendous resource. Free video of talks by some of my favorite artists, from Ang Lee to Todd Haynes to Paul Auster. Highly, highly recommended.)
May 2, 2006
Grand Theft Auto-matic
DARPA's next Grand Challenge (the one they usually hold in the desert, where they race robot cars over inhospitable terrain) will be held in a simulated city next year. The unmanned vehicles will have to handle traffic and deal with intersections. (Wired.)
April 19, 2006
April 14, 2006
MetaFilter has long had one unbreakable rule: Thou shalt not self-link. Thou mayest e-mail thy link to thine fellow MeFites, but never, never must thou posteth said link to the front page of MetaFilter.
This rule kept a lot of crappy Web hobbyist sites from being posted to MetaFilter, I'm sure. But it also meant that MeFites who made something legitimately post-worthy often wouldn't get their stuff linked on the site until it had already become popular somewhere else. So Matt Haughey created MeFi Projects, where members could pimp their own stuff to their hearts' content. Other members could vote for the stuff they liked best, and post it to MeFi if they wanted.
And it just got totally better. Matt Haughey has made an archive of the most popular projects by month.
Favorite new discovery? Roundtuit: a community blog for posting the great ideas you'll never do.
File under: Gleeful Miscellany, Technosnark
March 24, 2006
Why haven't I seen the Web 2.0 Mashup Matrix before? It's great! You can just go through and instantly see what two Web-2.0-y things haven't been mashed up yet, and have at it. E.g. no one's put together Del.icio.us and EVDB yet! Here's your chance to get angel funding!
March 20, 2006
"It's like a search engine... except... big."
(How much do you love that home page, though? The box COMPELS you to type.)
March 19, 2006
World of Wallstreetcraft
It's tongue-in-cheek but I like it: Sun says they power the world's biggest multiplayer online game.
It's the stock market.
March 16, 2006
My Personal Supermap
Via Unmediated, the GPS-enabled TrackStick has a very limited, but possibly very interesting function: "It tracks where it goes, and it remembers where it's been." Although Telespial Systems, the company behind TrackStick, seems to be most excited about its snooping potential -- Spy on your kids! Watch your employees! -- I love the idea that I could keep it in my pocket for a few months and produce an incredibly detailed map of my life.
March 14, 2006
Infinite Storage? Here You Go
Whoah. So apparently it's the Amazon Grid. No consumer-ish interface but it seems like it would be, like, a day's work for a web developer to make one.
March 12, 2006
Nicholas Carr has a run-down of an Economist article about Quaero, the European public/private search engine project:
The effort's "stunningly ambitious" technological goals, writes the Economist, "show that Quaero is intended to be far more than just another would-be Google, but a leap forward in search-engine technology." Quaero is, for instance, being designed to allow images and sounds to be used as search terms, in addition to traditional keywords [...]
That sounds cool! And in the wake of a few too many underwhelming new offerings from Google, this rings true:
One thing Quaero has going for it is focus: While Google, Yahoo and Microsoft all have complex business interests extending well beyond search, Quaero does not. It has the kind of clean slate that Google had ten years ago when it came to life in a university.
Hey, shades of Regulating Search here: Maybe there's something to be gained by thinking of search engines as utilities, with the same kind of public/private DNA.
March 9, 2006
For the first week since I've been keeping track of it, Firefox is more popular than Internet Explorer with visitors to Snarkmarket (for the week beginning March 1 and ending March 8). 46.64% of visitors over the past week used FF, compared with 41.24% who used IE. IE won out by a slight margin (44.56% to 42.85%) over the month from Feb. 8 to Mar. 8, but FF is trending up:
1/11-2/8: IE (49.25%), FF (39.07%)
12/14/05-1/11: IE (47.97%), FF (39.71%)
11/16-12/14: IE (51.7%), FF (35.73%)
10/19-11/16: IE (59.2%), FF (25.6%)
The Future of Photos
The iPod Moment: When a technology no one knew they wanted becomes indispensable.
Before the iPod came along, no one was sitting around saying, "You know, it would sure be nice to have a portable library of all the music I could ever hope to listen to." A year after it first came out, I was still asking what the big deal was. After all, portable CD players that could play MP3s had been around for a while without totally taking off, and they could carry a decent amount of music. Who needs to have every song they own in their pocket?
Then I was given an iPod, and suddenly that need was mine. Yes, Master Jobs, I understand now. It was a fundamental shift in music delivery. I will never question you again. Lead me.
Yesterday, I got into a long conversation with my boss about iPod moments for other technologies, especially photos. And it reminded me that I had to blog about Memory Miner....
March 6, 2006
I'm not going to link to a single thing from Infosthetics.com, 'cause the whole site is so darn best. (If you're into information visualization. Whoo!) But the site is filled with interesting visual experiments, most of which I haven't seen anywhere else, and I'm surprised I haven't run across it before.
It's one of many wonderful links in a particularly stellar Things Magazine entry.
And I'm not sure I understand this anecdote, but I certainly intend to repeat it:
When James Ivory entrusted Anthony Hopkins with the construction of that fantastic character, the servant in The Remains of the Day, Hopkins at a point had a problem of a conceptual nature and asked for help. Ivory advised him to talk to an old Windsor butler, an expert on the subject. Hopkins invited him to tea. They sat down and chatted for a while, but in fact, when the meeting came to an end, Hopkins had a feeling that this old servant had not told him anything. He walked him to the door and as he was about to leave, determined to extract something from the character, he blurted out, "Tell me, finally, what is a servant?" The old man turned, thought about it for a second and said, "A servant is someone who, when he walks into a room, makes it look emptier than it was before."
February 28, 2006
Dammit, Apple. Wi-fi, not hi-fi. What do you think this is, 1973? I've seen frickin' iPod speakers.
February 23, 2006
File Under: Best invention ever. GE has made a cheap plastic so water-repellent even honey slides right off it. Check out the video at GE's Global Research Blog (side note: check out the rest of the blog too; pretty interesting). You may have to right-click on the video and download it to view the full thing.
What does this portend? For one thing, ketchup (or shampoo or honey, etc.) bottles where all the ketchup slides right out with no coaxing. Technology Review imagines self-cleaning buildings and cool medical applications. (via Everywhere)
February 21, 2006
Mashup Camp: Where's Waldo?
February 20, 2006
I'll post any interesting (geeky) notes in the extended entry....
February 16, 2006
Dropped Your Powerbook in a Volcano? No Problem
SG: This is our most famous computer: it's a laptop that was rescued from the bottom of the Amazon River. A cruise ship hit an underwater barge, and sank down to the bottom. And the woman, an amateur diver, several days later, against all international law, broke in with a Maglight flashlight. Went down two flights of stairs underwater. Green, dark water. Found her stateroom. Remembered to bring her key, and rescued her laptop, and got it to Drive Savers. And we recovered all the data for her.
DP: She must have had some REALLY important emails.
Remember, kids... back up your hard drive.
February 13, 2006
Cogs in the System
February 12, 2006
The future is clearly multi-input touch screen interfaces. I mean, maybe the crazy infrared LED refractimacation causes syphilis or something, thus rendering my prediction totally off-base. But otherwise, just tell me whom to buy stock in, and I'll start liquidating my 401(k). Just watch the video with a bag around your head, so it's not too messy when your mind gets blown.
PS: Proof that I am, after all, fundamentally old-school: my first thought after seeing this was, "Whoa! If this stuff were in an e-book reader, we could replicate the interface of an actual book!!"
February 11, 2006
Firefox No Longer Fugly
For those of you who've suffered too long with ugly Firefox themes, I have great news. Someone has finally created a Netscape theme, both beautiful and attentive to detail. If you're using Win XP, also install the pretty Media Center theme Microsoft has made available, and your desktop will be hott like Infangelina.
February 10, 2006
This Lifehacker tip on tagging your songs in iTunes is actually hella handy. Most of my songs lack the metadata to make the "smart playlists" useful. I'm totally changing that right now.
February 8, 2006
Bill Joy's Six Webs
In a lecture at MIT, Bill Joy explains that there's not just one web:
[...] the "far" web, as defined by the typical TV viewer experience; the "near" web, or desktop computing; the "here" web, or mobile devices with personal information one carried all the time; the "weird" web, characterized by voice recognition systems; the "B2B" web of business computers dealing exclusively with each other; and the "D2D" web, of intelligent buildings and cities.
So rad it hurts. I love the image of the B2B web chugging along, all those servers just wrapped up in their weird silent conversations...
February 7, 2006
OK, despite it conforming pretty well to my Web 2.0 tired-ass design checklist, I actually think Yahoo!'s test of a new home page looks purty. And what is this about Yahoo! video games?
Google: High in Fiber!
Every week it seems another story comes out about Google's oh-so-mysterious plans for the "dark fiber" it's been purchasing. Does anyone else suspect the reason for the proliferation of this story is the sexy, noirish sound of the words "dark fiber"? Would we have heard twice about this if the story involved Google exploring "wavelength-division multiplexing" technologies?
February 1, 2006
Two Whole New Worlds
To ignite the public imagination with the possibilities of life on other planets, a group of researchers from NASA and SETI have created an elaborate scientific vision of what alien worlds might look like. Their projections appeared in a National Geographic special last fall, and are currently on display at the London Science Museum.
The scientists started out by imagining two Earth-like planets -- "Aurelia" and "Blue Moon" -- with some key differences in atmospheric density, orbit, etc. Then they performed some crazy advanced computer simulations and came up with super-detailed visions of the types of lifeforms that would inhabit these alternate worlds.
For example, the incredible denseness of the atmosphere on Blue Moon makes the evolutionary leap from sea animals to flying animals much more straightforward, producing a species of airborne whale-like creatures. Aurelia's synchronous rotation means sunlight is a precious commodity, so trees become tree-animals, moving slowly on tentacles to maximize their exposure to the sun.
Tentacular tree-animals? Flying whales? Crazy, right?
Ha. Probe the Internet a little and you'll find all sorts of folks criticizing the NASA/SETI scientists for being too conservative in imagining other planets. Carbon-based life forms are so boring, says the Fortean Times. Why not silicon, like on that one Star Trek episode? (Wikipedia's rather critical entry on the project tells us the tendency for scientists to assume all life must be carbon-based is often called "carbon chauvinism." New favorite thing.)
OK, I know I said I wasn't generally a fan of science fiction, but if SF authors all had hott interactive Flash applications (and a blog, no less!) to illustrate their visions, I think I could dig it.
An article in this month's Wired about the project piqued my interest, which led me to the Nat'l Geo presentation, which is the main attraction. Make sure you watch the movies and listen to the audio commentaries.
January 30, 2006
Country Without Wires
GrameenPhone, the biggest cell phone provider in Bangladesh, just hit six million subscribers. Worth noting:
- That number is still a small fraction of the country's entire population (145 million), but it's growing very, very fast, with the last million added in just two months.
- My first cell phone was from GrameenPhone! That's right: I first experienced the wonder of wireless calling technology... in Bangladesh.
For my money, this is actually more important work than Grameen's microlending.
January 25, 2006
Only Read This If You Are A Serious, SERIOUS Web Geek
A bunch of folks, Google tells us, have studied thousands of Web pages to see what (X)HTML authoring techniques are most prevalent. Well, Google just completed another study like this, with a sample size of just over a billion pages, giving us a pretty definitive guide to what's going on in the world of Web markup. Their writeup of the study's conclusions is highly snarky and readable, and rather fascinating if you, too, are geeky beyond redemption (or if you have a hand in deciding what Web standards should be).
The heaviest snark comes into play in the writeup of how people use the meta element, which usually contains the stuff they're trying to highlight for the search engines. Saddest fact: a totally useless HTML expression (<meta name="revisit-after">), invented for a defunct search engine nobody ever used, is more popular than the standards-beloved <em> tag. Fun fact: The New York Times uses its very own HTML element,
January 23, 2006
Nokia has ported Apache to Symbian, its mobile phone operating system. In non-dorkese: It's possible every phone could be a web server.
What that means, exactly, or when it will be practical, I have no idea. But I think it has the smell of significance.
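The "smell of significance" is easier to appreciate once you see how little a web server actually is: a loop answering HTTP requests on a socket, which is exactly why it can fit on a phone. A minimal sketch in Python (purely illustrative -- the real Nokia port is Apache on Symbian, and everything here, down to the hello-world response, is invented):

```python
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer
import threading

class PhoneHandler(BaseHTTPRequestHandler):
    """Toy handler standing in for whatever a phone might serve --
    photos, a contact card, presence info."""
    def do_GET(self):
        body = b"Hello from my phone\n"
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the demo quiet

# Port 0 asks the OS for any free port, like a device joining a network would.
server = ThreadingHTTPServer(("127.0.0.1", 0), PhoneHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
print("serving on port", server.server_address[1])
```

Point a browser at the printed port and the "phone" answers. The hard parts Nokia actually had to solve -- reachability behind carrier NATs, battery life, addressing -- are exactly the parts a sketch like this skips.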
January 20, 2006
"I Have a Master's Degree... In Science!"
P.S. Ten Snarkpoints to anyone who knows the source of the headline quote.
January 17, 2006
I Like the Way You Move
Whoah. I just got a glimpse into the year 2016, and it's all dancing robots.
January 7, 2006
Now This Is a Promising Piece of Equipment
At CES, Sanyo announced a tiny videocamera that records HD to a memory card. For $800. Say whaaa?
January 6, 2006
The Hammer and the Octopus
Nick Carr on Bill Gates' CES presentation (which I am embarrassed to say I watched online):
So what does Gates talk about? The "digital lifestyle" with "software at its center." Maybe robots want digital lifestyles, but human beings don't. Human beings want lives.
It's true. Somehow Apple's competitors still don't get that the iPod's brilliance is its interfacelessness. It is about as close to a hammer as you can get with a music player. It doesn't really have "features" (or at least it doesn't feel like it); it just does its thing. And lets you do yours.
I suspect Apple's new PVR is going to be the same way, and make Microsoft's system look awfully gangly by comparison.
January 5, 2006
Play Well (With Robots)
(Via The Long Tail.)
December 31, 2005
Instant RSS Feeds from E-mail
Best previously-undiscovered Bloglines feature ever: At the bottom of the Bloglines menu, on the 'My Feeds' tab, you'll find a link that reads "Create e-mail subscriptions." Click on that link, and you will be taken to a magical place where Bloglines will generate a random e-mail address for you. Any e-mails sent to that address will show up as an RSS feed in whatever Bloglines folder you specify. Excellent.
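The mechanics behind a feature like this are simple enough to sketch: the gateway owns the random address, and each mail delivered there becomes one item in an RSS 2.0 feed. A toy version in Python (hypothetical throughout -- this is the shape of the idea, not Bloglines' actual code):

```python
from email.message import EmailMessage
from xml.etree import ElementTree as ET

def make_feed(messages, title="E-mail subscriptions"):
    """Turn a list of e-mail messages into a minimal RSS 2.0 document."""
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = title
    for msg in messages:
        # One incoming mail -> one feed item: subject becomes the title,
        # the body becomes the description.
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = msg["Subject"]
        ET.SubElement(item, "description").text = msg.get_payload()
    return ET.tostring(rss, encoding="unicode")

# A stand-in for a mail arriving at the magic address.
msg = EmailMessage()
msg["Subject"] = "Your weekly newsletter"
msg.set_content("Hello from a mailing list.")
print(make_feed([msg]))
```

Everything else in the real service -- minting the random address, receiving SMTP, storing items -- is plumbing around that one transformation.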
December 27, 2005
Micro vs. Macro in a Duel to the Death
Get ready: I am about to compare Wikipedia to Wal-Mart.
Chris Anderson says the magic of Wikipedia (and other internet systems, e.g. Google) is that they work on hugely macro "probabilistic" scales. Think of it like this:
To put it another way, the quality range in Britannica goes from, say, 5 to 9, with an average of 7. Wikipedia goes from 0 to 10, with an average of, say, 5. But given that Wikipedia has ten times as many entries as Britannica, your chances of finding a reasonable entry on the topic you're looking for are actually higher on Wikipedia. That doesn't mean that any given entry will be better, only that the overall value of Wikipedia is higher than Britannica when you consider it from this statistical perspective.
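Anderson's statistical claim is easy to check with a quick simulation. The quality ranges and the ten-to-one entry count come from his quote; the one invented assumption is that the topic you look up is drawn uniformly from a universe only the bigger encyclopedia fully covers:

```python
import random

random.seed(42)

TRIALS = 100_000
TOPICS = 1_000_000            # hypothetical universe of things you might look up
BRITANNICA_ENTRIES = 100_000  # hypothetical; Wikipedia has "ten times as many"
WIKIPEDIA_ENTRIES = 1_000_000
GOOD_ENOUGH = 5               # call an entry "reasonable" at quality >= 5

def lookup(entries, lo, hi):
    """Return an entry's quality if the topic is covered, else None."""
    covered = random.random() < entries / TOPICS
    return random.uniform(lo, hi) if covered else None

hits = {"britannica": 0, "wikipedia": 0}
for _ in range(TRIALS):
    q = lookup(BRITANNICA_ENTRIES, 5, 9)    # quality 5..9, average 7
    if q is not None and q >= GOOD_ENOUGH:
        hits["britannica"] += 1
    q = lookup(WIKIPEDIA_ENTRIES, 0, 10)    # quality 0..10, average 5
    if q is not None and q >= GOOD_ENOUGH:
        hits["wikipedia"] += 1

for name, n in hits.items():
    print(name, round(n / TRIALS, 3))
```

With those assumptions, the chance of finding a "reasonable" entry comes out around 10% on Britannica and around 50% on Wikipedia -- coverage swamps per-entry quality, which is exactly Anderson's point.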
OK, but what are the broader consequences? Might not this statistical optimization of "value" at the macroscale be a recipe for mediocrity at the microscale -- the scale, it's worth remembering, that defines our own individual lives and the culture that surrounds us?
So here goes: This seems analogous to the debate over Wal-Mart....
File under: Snarkpolicy, Society/Culture, Technosnark
December 19, 2005
More Snapshots from the Uncanny Valley
What do you think? Knowing that none of these faces belongs to a human, do you find them freaky, or actually kinda hot? Do any of them work for you? How about when you compare them to this set of faces? Does it help if they're not looking at the camera? Are you wigged out yet? (Ferreterrific.)
December 16, 2005
A Google of One
But beyond that, the RAD Lab's vision itself is amazingly radical. They want to do for internet apps what the web did for information publishing. That is: lower the barrier of entry to zero. They write:
If we succeed, the next killer Internet app will be written, deployed, operated, at Google-like scales, by a single programmer.
That is so audacious! I love it!
December 14, 2005
'Pedia Still Astonishingly Awesome
Only eight serious errors, such as misinterpretations of important concepts, were detected in the pairs of articles reviewed, four from each encyclopaedia. But reviewers also found many factual errors, omissions or misleading statements: 162 and 123 in Wikipedia and Britannica, respectively.
I'm pretty darn awed by that.
If you've been watching Romenesko's letters this week, you might have caught Karen Heyman's letter about Wikipedia's problems. A snippet:
Unless you already know a field, you can have no idea that an apparently definitive entry presents only one side of an ongoing fight between specialists. That it may be changed, and changed back again, hardly helps matters. This, btw, is the best explanation as to why simply sitting back and saying, "It's okay now, it's changed," ultimately would not have worked for Seigenthaler. Chances are high that later somebody would have come along to "fix" the correction.
Wikipedia is a fantastic idea, a wonderful service, with entries that often reflect great effort and care. Unfortunately, inevitably, as it's grown, the flaws built into its original design have become more obvious. Egalitarian editing may be a noble goal, but the reality is that if Wikipedia is to truly fulfill its promise, it needs a way to vet contributors, to let users know whether an entry on neuroscience was written and edited by a senior professor, a student who just took Psych 101, or a layperson who's paraphrasing an old issue of Scientific American. Certainly prankster Brian Chase's initial belief that Wikipedia was a joke site says a great deal about how some of its entries appear to the general public. If Seigenthaler's complaint actually leads to more accountability, far from hurting Wikipedia, he may ultimately have saved it.
I'll cross-post my reply to Ms. Heyman below....
December 12, 2005
The "Web 2.0" Design Aesthetic
Browse through a gallery of the ever-more-crowded world of Web applications released in beta. Good Lord, our Web design is becoming homogenized. Almost everything looks like the love child of a Google application and OS X. Predominant color scheme: secondary colors on a white background. Font of choice: almost invariably a rounded sans serif, usually in lowercase. Rounded edges and gradients are the new black. Talk balloons are everywhere (exhibits a, b, c, d, e, etc.).
Much of this stuff is Good Design, but it's so ubiquitous that it's become visual static. There's absolutely nothing wrong with a good gradient or a rounded sans serif font. But if those are the only hallmarks of the design, please try again. I say this not as any sort of a self-styled designer (I'm totally not a designer), but just as someone who sees a lot of websites. I'd guess that much of the Web 2.0 backlash is a reaction to the pre-fab*, lifeless aesthetic it's spawned.
* Not that prefab is always bad. I'm definitely going to this exhibit this week.
December 9, 2005
Dudes. Our Googling monkeys tell us that more than half of Snarkmarket's cosmopolitan, discerning, tastemaking audience still uses frickin' IE. I don't know if IE is still the hellish experience it was before I switched over, leaving you in constant peril of attack by nasty viruses and annoying popups. But I do know from the occasional moments when I'm forced to use it that it remains a far inferior browsing experience to Firefox for several reasons. Chief among those: 1) tabs, 2) extensions, 3) configurability, 4) display.
Watch the Rocketboom entry of Dec. 2nd, note the responses of the IE users surveyed, decide you don't want to be in such company, and sacrifice the one minute and thirteen seconds it takes to install Firefox.
December 8, 2005
Spymaster in Three Easy Steps
Step 1. Go to Microsoft's new Windows Live Local maps.
Step 2. Click "Locate Me." Just have it use your IP address.
Step 3. Switch to Bird's Eye mode, if available.
Your mileage may vary... but it knew exactly where I was. And I could see myself in the window. No, just kidding. But not by much.
What Do You Call a Star Wars Database That Anyone Can Edit?
December 3, 2005
Okay, I am at the Regulating Search conference at Yale, and will post notes in this entry as the day progresses.
Just got done with the first panel, which I was on.
Andrei Broder from Yahoo has a neat high-level outline: Search is transforming from syntactic (e.g. matching keywords to text on a page) to semantic (e.g. understanding what it is you're actually talking about), and will continue on to "information supply" (e.g. no explicit searching -- information just appears as you need it).
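The "syntactic" stage is concrete enough to sketch: an inverted index plus literal token intersection, with no model of what the user means. A toy example in Python (corpus and names invented):

```python
from collections import defaultdict

# Hypothetical mini-corpus. A syntactic engine knows nothing about meaning;
# it only matches query tokens against tokens seen on each page.
pages = {
    "p1": "cheap flights to paris",
    "p2": "paris hilton photo gallery",
    "p3": "history of the hilton hotel in paris",
}

index = defaultdict(set)  # token -> set of page ids (an inverted index)
for page_id, text in pages.items():
    for token in text.split():
        index[token].add(page_id)

def syntactic_search(query):
    """Return pages containing every query token -- no notion of intent."""
    results = [index[token] for token in query.split()]
    return set.intersection(*results) if results else set()

print(sorted(syntactic_search("paris hilton")))  # -> ['p2', 'p3']
```

A query like "paris hilton" matches the hotel page just as happily as the celebrity page; telling those two intents apart is precisely the "semantic" step Broder describes.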
Now, the panel on regulation.
Barbara van Schewick drops an interesting factoid: In terms of eventual transactions, there's a drop-off of more than fifty percent from the first search result to the second! Wow.
Renata Hesse, from the antitrust section of the Department of Justice, responds briefly to the idea of Google FOIA... with horror. Just because it has the potential to create so much work! (Conference attendee Michael Zimmer thinks it's interesting, though.)
In response to a question, Yahoo's Andrei Broder distinguishes between "navigational searching" (e.g. I am looking for the University of Chicago law faculty blog, but don't know the URL) and "informational searching" (e.g. I am looking for a good constitutional lawyer). Apparently about 25% of all searches are the former.
Lunch break! And the entry break as well. Read on....
December 2, 2005
The End of the Internet
Here's a scary, thought-provoking essay by Doc Searls, spinning out the implications of this exchange between a BusinessWeek reporter and the CEO of SBC:
How concerned are you about Internet upstarts like Google (GOOG), MSN, Vonage, and others?
How do you think they're going to get to customers? Through a broadband pipe. Cable companies have them. We have them. Now what they would like to do is use my pipes free, but I ain't going to let them do that because we have spent this capital and we have to have a return on it. So there's going to have to be some mechanism for these people who use these pipes to pay for the portion they're using. Why should they be allowed to use my pipes?
The Internet can't be free in that sense, because we and the cable companies have made an investment and for a Google or Yahoo! (YHOO) or Vonage or anybody to expect to use these pipes [for] free is nuts!
It's on the backs of these "pipes" that all the content on the Internet is delivered to us, Searls points out. And the companies that laid these pipes did so at considerable expense. And Searls draws together comments from industry execs and drafts of legislation to show these companies gearing up to collect on that investment:
The carriers have been lobbying Congress for control of the Net since Bush the Elder was in office. Once they get what they want, they'll put up the toll booths, the truck scales, the customs checkpoints--all in a fresh new regulatory environment that formalizes the container cargo business we call packet transport. This new environment will be built to benefit the carriers and nobody else. The "consumers"? Oh ya, sure: they'll benefit too, by having "access" to all the good things that carriers ship them from content providers. Is there anything else? No.
Searls imagines three scenarios: 1) The one where the telcos get their way. 2) The one where municipal WiFi and private investment (like GoogleNet) carries the day. 3) The one where we users of the Internet reframe the debate from being about "pipes" and "packets" and "carriers" to being about "markets" and "worlds" and "places." In other words, the Internet isn't just a lot of bits of content ("property") going from one end to another. It's a place where people go to create and connect. "We go on the Net, not through it," Searls says.
This is a vast simplification of Searls' argument. Much good stuff is in there, including his pointers to worldofends.com, where he and David Weinberger have written up some fascinating thoughts on things like why the Internet is stupid.
Go read it, and also read the if:book entry that pointed me to it. Since running across these, I've started to pay a lot more attention to what the telcos seem to be fighting for, and Searls' guess doesn't seem very outlandish at all.
PS: I can't imagine any developments, no matter how fiendish, would actually herald the End of the Internet, but it makes a nice attention-grabber. Sorry. :)
File under: Society/Culture, Technosnark
November 28, 2005
The New Procrastination
So I'm going to this conference on Saturday. Looks to be a room full of super-smart academics, lawyers, and technologists. And me.
We're all to write a short position paper ahead of time, so as to facilitate a running start on the conversation; it's only a day-long event. The papers were due today, and I got mine in, but without leeway to do what I really wanted to: post it here ahead of time.
That's the new procrastination: not waiting until the last minute (although I did that too), but specifically waiting until it is no longer reasonable to call on your blog readership for comments and critiques.
Anyway, at the conference, I'm on Panel 1, which aims to
review the wide range of what search engines do and their importance in the information ecosystem.
industry participants, computer scientists, and analysts will flag major trends in search engine technology and try to predict future developments, with the goal of pointing out those trends that will create new conflicts and new litigation.
I actually had a tough time with this; I didn't want to just make a bunch of random, breezy predictions about video search or super-cool maps or whatever. So I spent the day on Saturday trying to come up with something that really got me excited.
Position paper after the break. It's already turned in, but of course I'll have to talk about it (and other things) on Saturday, so comment! Comment!
November 22, 2005
November 20, 2005
Google as Wal-Mart
If you love Google-ology, then you gotta read this column by Robert X. Cringely!
P.S. When I was little my dad would bring home issues of "InfoWorld," an IT-business magazine, and the only part I cared about (or could understand, really) was Robert X. Cringely's column in the back. I have loved him ever since.
November 17, 2005
Mo' Laptops, Mo' Problems
Hey, they unveiled the prototype of those $100 laptops we've blogged about before.
In the Washington Post, Seymour Papert says: "It will change ... the way children everywhere think about themselves in relation to the world."
CNN's story quotes Nicholas Negroponte like this: "One laptop per child: Children are your most precious resource, and they can do a lot of self-learning and peer-to-peer teaching. Bingo. End of story."
But Ben Vershbow at if:book says:
Sorry to be so snide, but we were watching the live webcast from Tunis yesterday... it's hard not to laugh at the leaders of the free world bumbling over this day-glo gadget, this glorified Trapper Keeper cum jack-in-the-box (Annan ended up breaking the hand crank), with barely a word devoted to what educational content will actually go inside, or to how teachers plan to construct lessons around these new toys. In the end, it's going to come down to them. Good teachers, who know computers, may be able to put the laptops to good use. But somehow I'm getting visions of stacks of unused or busted laptops, cast aside like so many neon bricks.
There is a grain (maybe several grains) of cagey wisdom there, and some useful caution. All the same, I'm excited to see what happens with these things.
(if:book features some of the most thorough thinking around. I totally recommend the feed.)
Update: Great, detailed on-the-scene interview with the CTO of the $100 laptop project by Andy Carvin. I love the internet!
November 10, 2005
Waiting for the Mozilla Phone
Here is a long O'Reilly blogpost that sums up some recent developments in the world of telephone software. Let me strongly second the author's argument that if we had an open, hackable mobile network, things would get awesome really fast.
Have I mentioned I hate my Verizon phone?
November 9, 2005
CSSVista: Live CSS editing with Internet Explorer and Firefox simultaneously. Hoooott. [/geek] [oh wait]
October 28, 2005
Setting the Table for a Feast We Will Not Share
This line in a recent Boing Boing post stopped me in my tracks. George Dyson went to Google, and writes:
The mood was playful, yet there was a palpable reverence in the air. "We are not scanning all those books to be read by people," explained one of my hosts after my talk. "We are scanning them to be read by an AI."
I think there are two ways to read that line... both interesting, but one really interesting (and kinda creepy).
October 25, 2005
Yet Folks Still Use Webex
Has anyone else heard of Freeconference.com?
"FreeConference.com offers a terrific value to consumers – free conference calls – with no strings attached. The company deserves credit for coming up with an innovative online business model that actually works... The site makes it easy to learn about FreeConference and to do business with the company."
That's not (to my knowledge) promotional text written by the company's PR reps. It's what one of the judges in the Webby Business Awards competition wrote when the company won first place for telecommunications in 2003. And this Google Answers checkup on them is solidly positive.
Why isn't this company everywhere right now?
Basejumping to Conclusions
After hearing about Google's BigTable data-organizing scheme the other day, posters at Google Blogoscoped started musing about the possibility of a Google database where anyone could list and organize oceans of content. Then Tony Ruscoe discovered that Google had recently added the subdomain base.google.com. Then the site went live. Briefly. Long enough for folks to capture some screenshots and an official description:
Post your items on Google.
Google Base is Google’s database into which you can add all types of content. We’ll host your content and make it searchable online for free.
Examples of items you can find in Google Base:
• Description of your party planning service
• Articles on current events from your website
• Listing of your used car for sale
• Database of protein structures
October 20, 2005
Let's Be a Multiple-Planet Species, Shall We?
I know some of us here on this blog aren't too keen on funding space science, but you gotta admit there's something compelling about the NASA chief's argument in this WaPo interview.
Or maybe he's wrong to be worried about 'mass extinctions.' Are we sufficiently advanced and resourceful that we could survive a cataclysm here on earth?
October 18, 2005
Let's Chat About E-Paper, Shall We?
The WaPo's Frank Ahrens hosts one of washingtonpost.com's signature online chats with the CEO of E-Ink. He kicks it off with a really fun introduction that includes this:
But, wait, there's more. Later this week, I'd like to write a story for the paper Post (or the "fiber media" as some electronic media folks call it) off of today's interview and discussion. Because a newspaper is at bottom a business, we need to know what our customers want. So your voices will be critical to the story I'm going to write. Tell us about your newspaper habits--how you read it, what you want and so forth. I'm sure Russ would be interested, as well.
How cool is that?? Read the interview; it's quite good.
October 17, 2005
A Font of Fontage
October 12, 2005
Snarkommenter Saheli has written out her thoughts on creating a massive global database for volunteering, an especially useful resource post-disaster. Go read Saheli's thoughts, at Socialtext and on her blog. I posted some thoughts on her blog and I'll cross-post an excerpt in the extended entry of this post.
October 11, 2005
The Results-Based Community
Jakob Nielsen gives a succinct overview of user interfaces from command-line to WYSIWYG. Clicking on menus to choose commands represented an improvement over command-line interfaces when the range of available choices was still small, Nielsen says, but today, menus provide a poor navigation interface. Noting Microsoft's new UI announcement, Nielsen heralds the age of results-based user interaction, where you choose what you want your application to do by selecting from a gallery of possible outcomes.
October 10, 2005
Wired is blogging now, but so far nothing blows the mind. More importantly, Stanford won this year's Grand Challenge! If you'll recall, last year, DARPA promised a million bucks to whichever team could create a driverless vehicle that would automatically navigate a treacherous obstacle course. No team won. In fact, the best-performing vehicle conked out after eight miles on the 142-mile course. This year, DARPA upped the ante to $2 million, and voila! A winner.
Update: Previously unknown fact -- Congress wants a third of all military vehicles to be driverless by 2015.
October 4, 2005
What Could Possibly Go Wrong?
Controlled artificial tornadoes: another excellent renewable energy scheme... and/or the setup for a crazy technology-runs-amok sci-fi story.
OSX in Your Browser
Wow. This software looks like it could be pretty awesome for OSX devotees using Windows boxes. And the Web site is a thing of wonder. Buggy, but it's amazing how much of OSX's functionality they built into a Web page. Brilliant! (Via.)
October 3, 2005
Google + Sun = ?
PaidContent reports that Google and Sun are announcing some joint thing tomorrow. Awesome speculation ensues.
October 1, 2005
Can a Fella Get an e-Book Around Here??
So I have been printing a lot of stuff out recently -- you know, big articles from the WaPo, New Yorker, Atlantic, WorldChanging, etc. that are just too long to read on the screen. And I enjoy reading the printouts on the train -- but lately the pile has been growing awfully large and dorky-looking. (Not to mention wasteful!)
What if I could just flag these documents for later perusal on an e-book?
A Chinese company has a Librie competitor that is apparently a lot more open-ended than the DRM-laden Librie itself. Could somebody hurry up and import these things?
September 28, 2005
$100 Laptop Photos
I really hope this happens. I really hope this happens. The folks at the MIT Media Lab have been talking about this $100 laptop for what seems like forever. Today they released some photos of their prototype (see the gallery). Nicholas Negroponte calls the project the most important thing he's ever done in his life. I think I agree. How awesome would it be if millions of very poor people could have WiFi-enabled laptops with their regenerative power supplies? (Via if:book.)
September 27, 2005
September 25, 2005
Waiting for Nanobot
Ray Kurzweil's most stunning observation is also his simplest:
When you're dealing with a process that grows exponentially, everything happens at the last minute. Think about it: If things double with every step, then you can be just one step away from completion but still only half-done. So it's misleading (perhaps even unnerving) when you're stuck inside the process, racing through the eleventh hour, and it looks like you're going to fall way short. But trust the exponent, Kurzweil says, because it rules everything.
And he has lots and lots of graphs to back it up.
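The "everything happens at the last minute" arithmetic is easy to check for yourself. A minimal sketch (the function name and the goal of 1024 units are just illustrative choices, not anything from Kurzweil):

```python
# Illustrative only: a process that doubles at every step.
# If the goal is 1024 units, then one step before completion
# you are at 512 -- exactly half-done, yet one step from finished.
def steps_to_goal(start, goal):
    """Count how many doublings it takes to reach goal from start."""
    steps = 0
    value = start
    while value < goal:
        value *= 2
        steps += 1
    return steps

goal = 1024
total = steps_to_goal(1, goal)        # 10 doublings: 1 -> 1024
penultimate = 1 * 2 ** (total - 1)    # value after 9 doublings: 512
```

So with ninety percent of the steps behind you, you've covered only about half the distance -- which is why the curve looks hopeless right up until it doesn't.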
All of Kurzweil's exponential traceries have to do with information technology. But that's okay, he says, because everything is just information technology -- or if it's not yet, it soon will be.
Going further: Kurzweil thinks the purpose of life is the expansion of knowledge. (It's a very Googly outlook -- and sure enough, Kurzweil has nothing but glowing things to say about the Goog.) So his story is actually pretty simple: Information technology grows at an exponential pace. It carries us with it into a future of omniscience and omnipotence. Done and done!
That's pretty broad, but Kurzweil doesn't shy away from specific predictions, either. By 2010, he says:
- Computers will disappear;
- images will be written directly to our retinas;
- we'll have a high-bandwidth connection to the internet at all times;
- electronics will be so tiny they're embedded in the environment, our clothing, our eyeglasses;
- we'll have full immersion visual-auditory virtual reality;
- and augmented real reality, too;
- we'll use virtual personalities as a primary interface; and
- we'll have effective language technologies.
Beyond that? The nanobot revolution. Artificial blood cells that make Olympians of us all. Radical life extension. The transcendence of biology. Expansion of human (or neo-human or whatever) civilization out into the cosmos. Then the omniscience/omnipotence thing.
Okay. That's Kurzweil. Now it's the snarkotron's turn....
September 23, 2005
Another Peek Into the Mind of God
September 21, 2005
Awesome Tools of the Day
This is an old link, but seeing as I need to decorate my workspace, I've been casting many a curious eye at the Rasterbator, which lets you make giant, nicely pixellated mosaics of any image you've got.
OK, I've written exactly six posts so far in September, and the month is almost over. For those of you who didn't know, this is because I've been in the process of moving to Minneapolis. Ruminations on moving will happen once I'm sitting at my computer in my apartment, which won't happen for at least a week. For now, Robin does a pretty good job, no?
The Singularity begins the moment when humans create a technology more intelligent than themselves. Singularity theorists like Ray Kurzweil argue that the rate of innovation on earth has been increasing exponentially since before we got here, and now rests on the brink of outpacing human ability to keep up with it. When that happens, the theory goes, humans become obsolete and machines take over, innovating faster than we can possibly imagine.
For a primer on the concept, try this Vernor Vinge essay, what some call the first articulation of the Singularity. Vinge is something of a Singularity pessimist; in most of the outcomes he posits, life gets pretty bad for humankind. Kurzweil, whose new book is the catalyst for this post, takes a much cheerier view; the Singularity means humans will pretty much be over mortality, poverty and disease.
The kicker? Whenever they estimate how soon we've got till the Singularity, Vinge, Kurzweil and others talk in terms of years. Not millennia, not centuries, barely even decades. (Vinge: "I'll be surprised if this event occurs before 2005 or after 2030." Kurzweil: "By 2030, a thousand dollars of computation will be about a thousand times more powerful than a human brain.") Folks like Kevin Drum think 30-40 years are a generous estimate. (Drum: "Seems to me that the Singularity should be right on our doorstep, not 40 years away.")
Interestingly, the Long Bet between Kurzweil and Mitchell Kapor on this topic (the very first Long Bet) has folks split exactly 50/50 as to whether it will play out like Kurz says.
More on the Singularity and Kurzweil's book:
- KurzweilAI.net:: Tons of articles from Kurzweil and his fan club.
- InstaPundit: An interview with Kurzweil.
- Acceleration Watch: More Singularity theorists.
September 15, 2005
Yahoo's Instant Search
Not to be a total nerd or anything, but Yahoo's Instant Search is hot.
Type "san francisco weather" or "al pacino" or whatever. I think it's notable for its creativity -- I certainly wasn't sitting around clamoring for this feature, but now that I have it, I'm like, ohhh yeah -- that's pretty handy.
September 13, 2005
Like Ender's Game, Except for Nerds... Oh, Wait
This summer camp for startups sounds more like an inventive short story than something real. Entrepreneur Paul Graham's crew paid gangs of young coders several thousand bucks to buckle down and create products this summer -- in return, Graham & co. get a 5-7% stake in their companies.
September 7, 2005
They Got It All Wrong!
If the iPod Nano was itself the iTunes phone, then we'd be in business.
August 28, 2005
A True Notebook
Was at the Sony Style store the other day and saw the T-series VAIO -- they had it sitting atop a stack of books and, whoah, it looked like just another book. And a trade paperback at that.
Then, this morning I saw a nice mention of the T-series over on Bart Decrem's blog (he is CEO of the company that's making that new "social browser" Flock). So this led me to investigate a little further.
One thing I didn't realize: These machines come with built-in cellular cards, so they can go online via Cingular's network as well as via WiFi. Now, check this out: CNET reports that it's $80 a month for unlimited data. That's kind of a lot... but at the same time, think about it: You could pop this thing open anytime and just be online. Cingular has a 3G network so it's a fast connection, too. VoIP, y'all.
A couple of implications: One, if I had a couple of Gs to drop I would totally drop them on this thing. Two, did you hear that Apple has been trying to recruit VAIO engineers for its new Intel-based Powerbooks? Hmm. Could be cool.
August 23, 2005
OMG!!11! Google LOL
OK, we all knew Google was releasing an Instant Messenger client, but how pissed will the gearheads be when they discover it actually isn't configured to work with AIM/MSN/Yahoo/etc.? Yet, at least. They say in their FAQ that they're working on it. Currently accessible only with a Gmail username.
August 16, 2005
Color Images Projected Into Thin Air, Anyone?
August 10, 2005
Enter the Black Dog
I wish I was badass enough to actually need a tiny Linux server that plugs into any computer's USB port and takes it over.
Plus, it has a biometric thumbprint thing! It's so Snow Crash!
August 8, 2005
The lack of screenshots, feature listings, or even an explanation of exactly what it is prevents me from guessing whether Meetro doesn't suck. (You have to actually download and install the app, which is totally played out. See more snark from Jason Pettus. And Stowe Boyd at Corante actually did post a screenshot.) But it appears to be a location-based social networking thingie that guesses where you are based on WiFi signals and tries to find other folks around you who match your profile. See also: Mates | PlaceSite | Dodgeball.
I think this location-based networking business has huge potential, especially as cell phones and other ubiquitous devices become sophisticated enough to partake in it.
When digital social networking is paired with analog social gathering places, I think people will go a bit nuts for it. Imagine a sort of venue-oriented version of the new HotOrNot Meet Me site. (The principle behind the site is that you look at a rotation of photos of random people, clicking yes if you'd like to meet each person, and no if you have no interest. If you click yes, your own photo goes into the other person's photo rotation, and if they click yes for you, you're both notified.) Or say you're relaxing in the park, and you decide you want to pull together a pickup Ultimate Frisbee game. Send out the Meetro bat signal, and bam! Ten other people chilling in the park decide they're down for it, and the game is on.
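The mutual-opt-in mechanic described above is simple enough to sketch in a few lines. This is a toy illustration of the idea, not anything from HotOrNot or Meetro; all the names here are hypothetical:

```python
# Toy sketch of mutual opt-in matching, as in the "Meet Me"
# mechanic described above: a match fires only when both sides
# have said yes. Class and method names are hypothetical.
class MeetQueue:
    def __init__(self):
        self.likes = set()  # (who, whom) pairs recorded so far

    def say_yes(self, who, whom):
        """Record who's interest in whom; return True if it's now mutual."""
        self.likes.add((who, whom))
        return (whom, who) in self.likes

q = MeetQueue()
first = q.say_yes("alice", "bob")      # False: bob hasn't answered yet
matched = q.say_yes("bob", "alice")    # True: mutual, so both get notified
```

The nice property is that a one-sided "yes" stays invisible: nobody learns anything until the interest is reciprocated, which is what makes people willing to click in the first place.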
One downside for Robin -- MeetRo.com is now taken. MeetRoSlo.com, however, totally still available.
July 18, 2005
So you've got four windows open on your desktop, and the one you want to drag your file into is at the bottom of the stack? A French researcher has imagined a method of leafing through overlapping windows to get to the one you want, almost like a book.
I think you'll have to watch one of the video demonstrations on this page to fully comprehend the coolness of this user interface scheme.
July 14, 2005
E-Paper's Here, Pt. 2
June 25, 2005
One Ring to Rule Them All
June 21, 2005
Supernova: Jeff Weiner
Jeff begins with what he calls "an exercise in sizing knowledge." Enter the query "real estate," and you come up with over 100 million results on Yahoo! search, Google, and MSN search. But what if you were seeking a more contained, possibly more valuable font of information? Say, everything Jeff's mother has learned in the last 26 years of selling real estate in Larchmont, NY.
That is knowledge that could be available on the Internet, Jeff says, but isn't, because of the technological barriers and the disincentives (or nonincentives) that keep it from getting there. According to Jeff, that's what Yahoo!'s here to fix.
The Yahoo! search vision, he says, is to "Enable people to find, use, share and expand all human knowledge."
- Find: Enable people to find what they are looking for
- Use: Search not for sake of searching, but to achieve a purpose
- Share: Sharing knowledge with people you connect with and connecting to people who you share knowledge with
- Expand: If we're successful at doing all of the above, Jeff says, this one's a freebie.
He points out that the first letters of each keyword in this mission statement spell FUSE, and helpfully defines the word.
He talks about finding a restaurant review on Yahoo! Local Search. Because both he and the writer of the review happened to be members of Yahoo! 360 ("happened" is probably a strong term here; he's Vice President of Yahoo! Search and the other member also worked at Yahoo!), he was able to better evaluate her review.
Relating a moment of extraordinary serendipity at a speech he gave in eastern Asia, Jeff sounds mildly Scientological -- This is the manifestation of the vision!
But then, we all have our L. Ron moments. Jeff leaves us with some helpful links:
Supernova: Mobile, Connected World Panel
Evan Williams, Mena Trott, Caterina Fake, Lili Cheng and Amy Jo Kim have gotten off to a rollicking discussion of our private selves, our public selves, and our cyber-selves. They're extending Jonathan Schwarz's talk on trust.
Trust structures on the Internet are complicated. Not just because it can be difficult to quantify who you trust to provide you information*, but also because it's difficult to control who accesses your online persona. And even the implications of this are not so clear-cut. Amy Jo Kim mentions that when she blogs, she must accept having no idea which strangers are reading it or how they'll use the information. Evan Williams points out that for some, the concern isn't strangers, but acquaintances or coworkers. Caterina Fake paraphrases a David Weinberger anecdote -- in the world around us, strangers mean danger; on the Internet, they mean connection.
So how do we create trust networks that can serve these diverse approaches? I didn't hear any direct answers from the panel, but it's a big question, so I forgive.
Interesting question from Amy Jo Kim: The kids who are blogging today, taking photos every day, writing their lives in public, what expectations are they creating for the future?
* I love my mother very much, but two of the last three e-mails she sent were quickly debunked in a trip to Snopes.com. Sorry, mom, you don't get to filter my news.
Jonathan Schwartz of Sun has the following pithy view of the near-future: It's all about trust and authentication. The defining characteristic of the next wave of web stuff will be: Who has access to what?
Also, a book reco: Empires of Light, a history of electricity. Analog to new networks and systems? You bet, Schwartz says!
He awaits the day when a mobile operator says: "We're open! Do whatever you want on this network." As do I... as do I.
Schwartz has a blog. He calls it "an extremely valuable tool." Cool.
June 20, 2005
Supernova: Virtual Wooorlds!
I'm psyched for this session 'cause Ian Bogost, the curator of Water Cooler Games, is here. He's also into "persuasive games" -- that is, games that do more than just entertain. Also here: the CEO of Sennari, a company that creates mobile economies (!), and a guy from Linden Labs.
Whoah dude, I didn't know this: In the Linden Labs game Second Life, players retain full IP rights to the stuff they create in-game. That is great!
ALERT: The phrase "value chain" just appeared. What is this, a presentation at a business school or something??
ALERT: Uh-oh, "extract value" has joined it. Save us, Ian Bogost!
Bogost is up. Talking about advertising in games. Noted: Disney's Virtual Magic Kingdom, a simulation of... a simulation.
In the world of advertising, Bogost says, the primary medium, TV, is being degraded. What ad guys know about is buying media space. So it's like: Hey, find me a new medium! Video games, YES! This company Massive is selling videogame ad-space in a very traditional way. So maybe you'd be playing Snarkmarket Adventures 3 and see a Volkswagen billboard. Done.
Bogost asks: Is that the best we can do?
He enumerates some qualities of online media: It's spatial, encyclopedic, participatory, and procedural.
The one that's particularly missing from videogame advertising is procedural, he says. (By "procedural" he means "rule-based" or "cause-and-effect.") Advertising that shows how products work -- not just how they look or feel.
Supernova: The Long Tail
At the Supernova conference, day one, at Wharton West in San Francisco.
Chris Anderson talking about his Big Idea: the Long Tail.
Gah! Dave Goldberg of Yahoo's music service says they have 5 billion (!) song ratings entered by users. 150 million a month. That's a lot of data! Also: They stream a billion songs a month and 350 million music videos a month. YE GODS.
Goldberg also says TiVo missed the opportunity to make their recommendation filters really good and useful. (They are currently quite bad and bizarre.)
David Hornik of August Capital keeps using the phrase "extract value." Very business.
Jeremy Allaire of Brightcove says big studios are spending $50-100 million each just to clear the rights on all their old content so they can distribute it over the internet.
Bottom line: Whatever, I'm sure the Long Tail is very long and very profitable. But I think being a "Long Tail company" per se would be boring. Making content is where it's at.
June 14, 2005
June 13, 2005
June 12, 2005
Thirty gigabytes on a super-cheap credit-card sized disc? Invented by a company whose English-language site hasn't been updated since 2002? Gosh, why does this sound so familiar? (Unmediated.)
External USB Hard Drive
This is rather tempting, except for the fact that I don't need it at all. A 200-gigabyte USB 2.0 external hard drive, for only $108. Expires tomorrow.
June 9, 2005
Poster Geek for Google Maps
So I'm on MUNI this morning, glancing at some guy's Examiner, when who do I see peering back at me but...
Adrian Holovaty! Looking devious, I might add.
Adrian, a longtime Friend of Snarkmarket, is featured prominently in this AP story about Google Maps hackery. Rock on!
June 7, 2005
Easy Google-Map Hacking?
I wouldn't keep posting these Google Map hacks if they didn't keep getting so darn awesome. The ability to customize Google Maps has been around for a few months now. But it has typically involved things like sending requests to various servers via Python proxies and altering XSL stylesheets (i.e. not for the faint of geek).
Fill out a simple form, and bam -- you've got a map. Your map nodes can have images, links, captions, etc. The next logical step is either Robin or I actually making a map with one of these things. With this one, I might just try. (Unmediated.)
June 5, 2005
There's also a brilliant list of color palette resources out there, but I only have the link bookmarked at work. I'll put it here when I find it again.
May 27, 2005
Hey, Look, Downtown San Francisco
Keyhole is now Google Earth, and as a registered Keyhole user (read: huge nerd), I got to download a beta of the new app. No big differences have jumped out at me yet, except, you know, the 3D models of all buildings.
I'm sure they won't be 1980s-gray for long.
May 23, 2005
Found in Translation?
You might have heard that Google let 100 journalists into the Lair last week for a rare "Factory Tour," previewing some of the goodies we can expect to see in the coming years. (It all sounds a little bit Willy Wonka-ish.) But did you hear this?
One fascinating area Google is aggressively exploring is automated language translation. Engineers have been studying the massive collection of translated documents that the United Nations keeps on its Web site -- as well as other document collections -- to develop a program that can automatically translate back and forth between documents.
To date, the company has examined about 200 billion words to train its system on the structures of various languages.
"If we can make every piece of the Web, every document, accessible to everybody, that will contribute something to the world,'' said Alan Eustace, vice president of engineering and research. "And that's what this project is aimed at.''
Google showed off a few translations it had performed using the new technology, from Arabic to English and from Chinese to English. They appeared nearly flawless.
The way language translation works now, apparently, is that people have created programs telling computers how different languages work. But with the complexity of language, given all its exceptions and colloquialisms, this doesn't work very well. (First sentence of this paragraph taken from English to French and back: "The translation of language in manner functions now, apparently, is that people created with computers of programs saying how the various languages function." Which is actually comparatively good.)
Google's taking a Rosetta Stone approach, teaching the computers to really learn languages by statistically analyzing existing translations. Philipp at Google Blogoscoped gives his thoughts on where this could lead.
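To make the "statistically analyzing existing translations" idea concrete, here's a toy version of IBM Model 1, the simplest of the classic word-alignment models (a standard textbook technique, not Google's actual system). Given nothing but sentence pairs, EM iteration learns which words translate which. The tiny English-French corpus below is made up for illustration:

```python
from collections import defaultdict

# A made-up parallel corpus: (English sentence, French sentence) pairs.
pairs = [
    ("the house".split(), "la maison".split()),
    ("the blue house".split(), "la maison bleue".split()),
    ("the flower".split(), "la fleur".split()),
]

# Initialize t(f|e) uniformly over the French vocabulary.
f_vocab = {f for _, fs in pairs for f in fs}
t = {e: {f: 1 / len(f_vocab) for f in f_vocab}
     for es, _ in pairs for e in es}

for _ in range(20):  # EM iterations
    counts = defaultdict(lambda: defaultdict(float))
    for es, fs in pairs:
        for f in fs:
            norm = sum(t[e][f] for e in es)
            for e in es:
                counts[e][f] += t[e][f] / norm  # expected alignment counts
    for e, cf in counts.items():  # re-estimate t(f|e)
        total = sum(cf.values())
        t[e] = {f: c / total for f, c in cf.items()}

def translate(e):
    """Most likely French word for an English word under the learned model."""
    return max(t[e], key=t[e].get)
```

Even on three sentences, the model figures out that "house" goes with "maison" and "the" with "la", purely from co-occurrence statistics -- which is the Rosetta Stone trick in miniature.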
May 15, 2005
The Future Is Now
I love the thought that our children are growing up used to having domestic robots in the house. Robots for them are slightly dim but friendly vacuum cleaners, not fearsome weapons or fantasy toys.
May 9, 2005
One of your favorite websites lacks an RSS feed? Roll your own.
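"Rolling your own" mostly means scraping the site (which is site-specific, so it's omitted here) and then emitting a feed. As a minimal sketch, assuming you've already collected the items, here's a bare-bones RSS 2.0 generator:

```python
from xml.sax.saxutils import escape

def build_rss(title, link, items):
    """Build a minimal RSS 2.0 document. items: list of (title, url) tuples."""
    entries = "".join(
        f"<item><title>{escape(t)}</title><link>{escape(u)}</link></item>"
        for t, u in items
    )
    return (
        '<?xml version="1.0"?>'
        '<rss version="2.0"><channel>'
        f"<title>{escape(title)}</title><link>{escape(link)}</link>"
        f"{entries}</channel></rss>"
    )
```

Point an aggregator at the output and the feedless site is feedless no more.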
April 19, 2005
Holovaty Strikes Again
When I'm bored at night, I watch downloaded episodes of Battlestar Galactica.
When Adrian is bored, he augments Google Maps with local public transit routes.
I don't live in Chicago, obviously, but come on. I had to install this thing, just to see it. And whoah dude: It's really cool. The magical thing is that Adrian has just seamlessly inserted new stuff into the Google Maps interface -- no muss, no fuss, no other site.
A comment on Adrian's post led me to this full-on toolset for Google Maps hackery. Awesome.
I see a vision of a Flickr-enhanced Google Maps dancing before me... just out of reach...
April 14, 2005
Snarkmarket Is Waiting for Its Review Unit
Matt's been waiting for this for a long time: a cell phone with a tiny laser projector.
March 5, 2005
Those Came From Where??
Matt Yglesias reminded me of this 1999 Wired article about containerized shipping -- possibly the most interesting thing ever. Every time I drive past the Port of Oakland and see all the multi-hued containers stacked up, my mind gets slightly expanded.
February 26, 2005
So, I was thinking, maybe we could mimic the wicked efficiency of natural systems and usher in a new era of hyper-successful, totally sustainable human production? Just a thought. WHOAH.
February 23, 2005
Silly Google. Some day a hobbit will find your evil ring and destroy it.
February 20, 2005
Via MetaFilter, check out the cornucopia of Flashtasticness that is The Greatest Story Never Told digital storytelling contest. Including such greats as Craziest and Help.
February 17, 2005
This Is Going to Suck
Yeah. So. Robot army of the future:
Robots in battle, as envisioned by their builders, may look and move like humans or hummingbirds, tractors or tanks, cockroaches or crickets. With the development of nanotechnology they may become swarms of "smart dust." The Pentagon intends for robots to haul munitions, gather intelligence, search buildings or blow them up.
"The lawyers tell me there are no prohibitions against robots making life-or-death decisions," said Mr. Johnson, who leads robotics efforts at the Joint Forces Command research center in Suffolk, Va. "I have been asked what happens if the robot destroys a school bus rather than a tank parked nearby. We will not entrust a robot with that decision until we are confident they can make it."
Let me just say, for the record, that I am so not excited about "swarms of 'smart dust' " doing our bidding on the battlefield.
Robot soldiers in general, though: I'm actually torn. On one hand, fewer people in wars = good. On the other hand, robot soldiers. Also, it seems like it would be easy for governments to carry on cruel wars of oppression with robot soldiers, you know?
February 8, 2005
One More Beautiful Map
A more rational mind than mine would begin fearing the power of Google.
February 3, 2005
Mirror, Mirror... Holy Crap, Where Did Those Bags Come From?
As you know, I love technology, but I'm also sometimes skeptical of its net utility, because it so often imposes burdens to match its benefits.
So I get really excited when I hear about technology that simply helps us experience something new. No device to carry around, no batteries to charge, no software to learn -- just a novel way of looking at the world.
Keyhole, the program that lets you zoom in on your neighborhood via satellite imagery, is one example. Here's another I just found:
Accenture Technology's lab in France is working on a mirror that reflects your future self.
In brief: Cameras watch you during the day and keep track of how much time you spend on the couch or in front of the fridge. Then, a computer uses the implicit lifestyle information to extrapolate your mirror image forward five years -- showing a fatter face, less hair, sallow cheeks, whatever.
Here's the rationale:
"Helping people visualise the long-term outcomes of their behaviour is an effective way to motivate change" [...]
The problem, of course, is that no one will ever buy the ugly-mirror.
But I could totally see this being a great science museum exhibit. You'd tap out answers to a quick lifestyle survey, then step into a private viewing chamber to see your future. It'd be creepy, and cool.
Link from we make money not art.
January 27, 2005
So I heard that Amazon's A9 search engine had some new local yellow-pages function. Whatever. Big deal.
Then I actually tried the new local yellow-pages function.
Take a minute to explore that link, the result of a search for 'canvas cafe' in San Francisco. Note that there is:
- a picture of the establishment in question, and, indeed,
- pictures of every establishment on the street. And you can
- walk up and down the street using the buttons at the bottom.
You're looking at the coffee shop I visit when I want a pot of tea. And if you click "Slider" on the bottom of the page -- under "Other Businesses Along 9th Ave" -- you'll be looking at the burger joint I visit when I want a delicious bacon heartstopper.
That's nuts! I love it! The steady digitization of the physical world! And soon -- mark my words, soon -- we will be tapping into this database from our mobile phones.
They've only photographed ten metro areas so far. But still: pretty good start.
Either Google is a) freaking out that they didn't think of this first, or b) chortling with amusement, secure in the knowledge that world.google.com will soon make A9's effort look like a fourth-grade diorama.
But I'm betting on -- and rooting for -- the former.
P.S. I eat here, too! Waughhhhh!
January 21, 2005
P.S. Everybody already has Gmail, right? If not, e-mail me (robin at this site) and I'll hook you up. It is the Stephen Buckley of webmail. (See below.)
December 22, 2004
Well, I Know What I Want for Christmas
In the back of Carlos Owens' southern Alaska yard, an 18-foot-tall steel robot is taking shape in the dim light of the winter afternoons.
Ahhh. Happy holidays.
December 9, 2004
Lawrence.com Lays It Down
Web feeds (RSS)
We've added web feeds (a.k.a. RSS feeds) to many of our site's pages -- including best bets, MP3s, blogs, venues and bands. This is full-blown computer geek orgasm-type stuff. It's incredibly useful, and, in a few years, it'll be all the rage among today's AOL users. Be the first among your friends to really "get it" -- try RSS now! For full information, check out our RSS page.
*I don't know what that's supposed to mean; it just sounded right
The site has a longer tutorial, too, aptly titled: "WTF is RSS?"
Which reminds me: Snarkmarket has an RSS feed! Subscribe to it, dawg!
May I say, in closing, that I wish there was a Lawrence.com for every city in the U.S.? It would be to bars, restaurants, and local bands what craigslist is to jobs, apartments, and lonely dudes.
December 7, 2004
Link Dumper (or) Apollyon's Cruel Laugh
I was just looking at Snarkmarket's traffic stats, and noticed that one of our top referrers, or pages from which people come here, is currently http://www.adminshop.com.
And dude: This is where you buy the program that spams blogs.
It's called Link Dumper, and I feel like I've just looked into the Eye of Sauron.
Check out the description:
Spread your sites into the farest reaches of the internet! Link Dumper is an amazingly powerful little WIndows tool that you can input site name, URL, and description into, which it then automatically inserts into open "linkdump" sites all over the internet. Linkdump is a term for sites that allow people to contribute links which are then instantly shared with thousands of people, and they have become very popular during the last 6 months. This really is the perfect way to start a site, as linkdump surfers are of the kind to further spread (over e-mail, forums, linkdumps, et cetera) URLs they find interesting, shocking, pornographic, or funny.
Gahhh! It's so offensive in its blithe practicality!
December 1, 2004
This Must Be What Neal Stephenson's Dreams Are Like
OMG, this rules. From Saheli Datta's blog:
*The dream also involved some friends juggling torches on a Shakespearean stage set inside a magnificent library, while other friends and I watched and flitted about the mezzanine with billowing scarves tied to our wrists--nanofabric scarves that were catching information from the WiFi network and displaying it to us as we danced. That, I think, will stay in dreamspace for a while.
There is also a whole post about a Bruce Sterling speech and "spimes." And a nice Snarkmarket shout-out.
But nanofabric scarves, people!!
November 8, 2004
Arrrr! There's a Bounty on This Software!
Ransom is a software publishing model where the rights to the source code remain restricted until a set amount of money is collected or a set date passes, at which point the code is automatically freed under an OSI/FSF-approved license.
(Red Ferret for President.)
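The release rule described above is simple enough to sketch in a few lines. This is my own illustration of the model, not any project's actual implementation; the names and numbers are made up:

```python
from datetime import date

def is_code_freed(pledged, target, deadline, today=None):
    """Ransom model: the source is freed once pledges hit the target
    OR the deadline passes, whichever comes first."""
    today = today or date.today()
    return pledged >= target or today >= deadline
```

Either condition alone springs the lock, which is what keeps both the developer (guaranteed payday or eventual freedom) and the contributors (the code can't stay closed forever) honest.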
October 12, 2004
Behold the Wonders!
Segway 2.0 is here. I predict new sub-genera of homo sapiens will develop merely to perfect its use. Entire new languages will sprout only to describe it. For one shining moment, all the fighting and the violence in the world will stop so all humankind can admire its life-giving gleam. And it will put an end to cancer and world hunger. Let me be the first to say, this thing is going to be bigger than God.
October 8, 2004
I had to read this post about the science of global warming over at Chris Mooney's blog a few times before I got it. But I guess that's to be expected when you're dealing with such an incredibly complicated and delicate system.
Er, I'm talking about scientific publishing, not global warming.
Anyway, check it out: Maybe the earth's natural temperature fluctuations are wilder than we think, says well-respected scientist. Analysis TK.
October 6, 2004
Maybe you've heard the hype about that free Internet phone app called Skype? Well, check it: It actually does rule that much. Totally clear, zero lag, better than a cell phone. (Especially when your apartment is apparently lined with lead.) And free, free, freeee.
The EST/PST time difference has stymied my calling for weeks, 'cause by the time my cell-phone minutes are free, it's midnight in the far-off east. No longer!
July 26, 2004
The Free Software Experiment
On my old computer, I had a ton of software I'd acquired during college on education licenses and ... by ... er ... other means. Photoshop, Flash, Dreamweaver, Cool Edit, and assorted other programs. My new computer was built by a coworker for about $350, and I'm currently trying to populate it with good software. But I'm staying on the up-and-up this time. I'm sticking to all software I can afford. Thanks to the $20 copy of Win XP my coworker got me when he visited Microsoft, I didn't get Linux, although I considered it. Here's what I've found so far in my foraging for software, all free:
- Expressions 3: My drawing program. MS took over the company that had made this software, and has for some reason made it free. I've been quite happy with it so far, although there's a fairly steep learning curve, especially if you're used to doing all your illustration in Photoshop.
- Star Office 7: My office suite. Sun's slightly gussied-up version of the open-source OpenOffice project, which I have maligned in these very pages. I haven't had to do any hardcore word processing at home yet, and I don't know if I will, but I think if I do, this should be fine.
- The GIMP: My photo-editing program. Pretty capable, although I'm so used to Photoshop that I haven't had the patience to really get down and learn it. I could not deal with the interface until I got this plugin, though.
- RagTime Solo: My page layout program. A beauty from a German software company. The version of the software for personal use is free, and is exactly the same as the commercial-use version, which costs €845.
- Audacity: My audio-editing program. Decent, although I'm considering trying ProTools Free again now that I have a computer that can handle it.
- Picasa: My photo organizer. Why I need a photo organizer, considering Win XP's photo display features are perfectly satisfactory, I don't know, but it's free. Whatever.
Other sundry free excellent software:
- Blender: 3d graphics program. Just check out the image gallery.
- Sea3d: An amazing open-source 3d Settlers of Catan application.
- Notepad++: An HTML text editor that's oddly comforting.
July 12, 2004
Space probes, why do I love thee?
Because you represent the best in us: ingenuity, long-range planning, a sense of wonder.
And because you deliver such rockin' images!
Hmm, then again, when we were talking about space probes vs. renewable energy, Matt did argue, "It's a matter of priorities. Pretty pictures vs. a tremendous growth industry with clear economic and environmental advantages."
Sigh. But where else am I going to get such awesome desktop patterns?!
(Thanks to MemeFirst for the link!)
July 10, 2004
So, as you may or may not know, depending on your degree of nerdiness, Google busted out Gmail a few months ago, with a gigabyte of storage space and some hot new features.
In response, Yahoo! anted up with 2GB of storage. And now they've acquired Oddpost, a funky e-mail company with like 12 users but a great reputation for innovation.
So clearly Yahoo! is saying, Bring it. Which is awesome, because when Google brings it, it gets brought.
I predict in two years we'll be choosing between 100 gigs of free Gspace and a Yahoo! account where you get paid a dollar every time you send an e-mail.
June 29, 2004
Future of Open Source
Skip this month's sensationally headlined Wired article "The Linux Killer." But check out its sidebar, an article about how Linus Torvalds's laissez-faire approach to sourcing Linux is causing enterprise legal headaches today.
To put Linux on more solid intellectual property footing in the future, the company has to become a little more corporate and a little less Dangermouse. It has to be a lot more meticulous about making sure all of its code is properly licensed to and by developers, keeping a thorough library of who-coded-what. In fact, the company may send Torvalds and the developers to re-write all the code that's already been written, making sure to pull any proprietary code out of there.
My experiences with open-source technology have been dim so far. I tried working with OpenOffice for several months on my last computer, because Microsoft Works documents only work in MS Works and MS Office was too rich for my blood. The software just had an amateurish feel about it, it crashed my computer regularly, and the interface was unintuitive (it was a little too open-source; i.e., I felt like I had to code a macro to get it to register a carriage return). MS Word may be a fascist, irrational piece of crap technology that mucks up my documents twice as often as it improves them, but at least it deceives me into feeling I have a modicum of stability there.
Alternative browsers have been a mixed bag. There's nothing wrong with Opera or Mozilla, per se, and especially on my old computer, I would go through weeks of heavy Opera usage, but the tangible advantages I would get from making them my primary browser and customizing them to fit snugly with Windows the way IE does (yes, yes, another proof that MS is eee-vil) seem small. It's not all that inconvenient to me to download yet another patch to fix yet another gaping security flaw every few weeks. Ha ha.
I love the idea of open-source — distributed creation, flexibility, affordability ... it sounds like the future. I refer to Wikipedia regularly, and I've long dreamt of an open-source-ish screenplay-writing website. I flirted with making Linux the OS for my new computer before some coworkers nabbed me a $20 copy of XP. Robin's grooving on Firefox, and I may download a free copy of StarOffice (I work at an educational institution, after all). Talk to Robin long enough and he'll dangle before you visions of an open-source TV network.
My question: what's the future -- Linux or Googlezon? Open source or OpenSource™?
Or, as this Wired article suggests, are the two beginning to grow together towards a murky middle?
June 21, 2004
Step Off, Surly Bonds
Space.com is covering the flight of SpaceShipOne out of Mojave, Calif., today. If successful, it will be the first time a privately-funded and -built craft has ferried a human being into space, 100 kilometers up. Boing Boing has some more info. SpaceShipOne, like NASA's space shuttle, actually glides back to earth and lands, which I find totally appealing. I never liked that splash-down business.
June 4, 2004
Distance Between Two Points
A coworker wanted to find out how far away Mexico City was from San Diego, CA, in miles. MapQuest and MapPoint gave the driving distance, but she needed to know the distance as the crow flies. A little Google-fu turned up this pretty cool distance calculator. Just in case you ever wanted to know.
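"As the crow flies" is just the great-circle distance, which the haversine formula gives you from two latitude/longitude points. A minimal sketch (the coordinates are my own approximate city centers, not the calculator's):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in miles."""
    earth_radius_mi = 3958.8  # mean Earth radius
    p1, p2 = radians(lat1), radians(lat2)
    dp, dl = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dp / 2) ** 2 + cos(p1) * cos(p2) * sin(dl / 2) ** 2
    return 2 * earth_radius_mi * asin(sqrt(a))

# San Diego (approx. 32.72, -117.16) to Mexico City (approx. 19.43, -99.13):
# roughly 1,440 miles as the crow flies.
d = haversine_miles(32.72, -117.16, 19.43, -99.13)
```

The driving distance MapQuest gives is longer, of course, since roads don't fly.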
May 5, 2004
Because Really, Who Has a Toothed Wheel These Days?
Back in the day, you needed some pretty serious gear to measure the speed of light:
[Fizeau] shone a light between the teeth of a rapidly rotating toothed wheel. A mirror reflected the beam back between the same gap between the teeth of the wheel.
There were over a hundred teeth in the wheel. The wheel rotated at hundreds of times a second -- therefore thousandths of a second was easy to measure. Light was reflected from mirrors more than 5 miles apart. This also helped him making accurate measurements.
(That's from what-is-the-speed-of-light.com, your one-stop shop for speed-of-light related inquiries.)
Nowadays it's much easier: All you need is a microwave.
(Link via Boing Boing.)
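Both measurements above reduce to a couple of lines of arithmetic. For Fizeau's wheel, light is first eclipsed when, during its round trip, the wheel advances by half a tooth-gap period, i.e. 1/(2N) of a revolution. The numbers below are the commonly cited figures for his 1849 setup (my assumption, not from the post): a 720-tooth wheel spinning at 12.6 rev/s with the mirror 8,633 m away.

```python
def fizeau_speed_of_light(distance_m, n_teeth, revs_per_sec):
    """Speed of light from the toothed-wheel eclipse condition: c = 4*d*N*f."""
    round_trip_m = 2 * distance_m
    eclipse_time_s = 1 / (2 * n_teeth * revs_per_sec)
    return round_trip_m / eclipse_time_s

# Fizeau's setup: ~3.13e8 m/s, within about 5% of the modern value.
c_fizeau = fizeau_speed_of_light(8633, 720, 12.6)

# The microwave-oven version: standing-wave hot spots sit half a
# wavelength apart (~6.1 cm in a 2.45 GHz oven), and c = wavelength * frequency.
c_microwave = (2 * 0.0611) * 2.45e9
```

Same physical constant, a century and a half apart; the kitchen appliance gets closer.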
April 26, 2004
I'd Like Some Personal Audio Entertainment Services, Please
So, this is pretty interesting: Clive Thompson has a new column up about the online music service called Rhapsody.
For iTunes, as you may know, you pay $0.99 and get a music file that you can play or burn onto a CD. Rhapsody's different: According to Clive, $10 a month gets you access to the service's entire library of music -- but you don't get any of the files. You just get the music, streamed to your speakers.
This reminds me of the excellent book "Natural Capitalism," which argued that we ought to get more goods as services instead of products. Example: Light bulbs. Does anybody want a light bulb? Does the product itself deliver any satisfaction? No -- clearly, it's household illumination services that we're after. But to get them we must purchase fungly light bulbs.
So wouldn't it be interesting, the book suggests, if instead of peddling bulbs, General Electric sold lighting services for some small annual fee. It'd cost the same as a year's supply of bulbs, and we'd get the same ultimate product: Light. But now, absent the need to maximize bulb sales, it'd be in G.E.'s interest to aggressively innovate super-efficient light technology -- 'cause then it would cost them less to provide the service, and they could take the difference as profit. (Thus, "Natural Capitalism," in which good business is aligned with good environmental values.)
Interface, a commercial carpet company, is in fact moving towards selling interior comfort services, not carpet. They have lots of info at their sustainability site. And apparently it hasn't totally ruined the company or anything.
It's interesting to extend this model to music. What is it we really want? Well, if it's merely personal audio entertainment services, then Rhapsody is a great idea. Rhapsody is in the business of providing music, not selling tracks, so it can behave differently, right? It can spend its research dollars on faster, smoother, cooler music-playing technology instead of elaborate copy-protection schemes. Excellent.
But, not every commercial activity fits the "Natural Capitalism" service model. Think of automobiles: Yeah, we want transportation services; but many of us also want a car. The product-ness of it -- having it in the driveway, having your junk in the back seat -- is important.
So is music more like carpet or more like a car? For me, it's probably carpet. I think I could groove on Rhapsody, as I a) am musically clueless, and b) score no points with anyone for having cool music.
In general, though, I suspect it's more like a car: People want sounds to listen to, yeah, but they also want to own music -- music as cultural signifier, music as collection. Music in the driveway. (Although, as Clive points out, with iTunes and its ilk you "own" your music in a somewhat more limited sense than you did with CDs and tapes.)
Anyway, how about you? Car or carpet?
April 19, 2004
'The Secret Sauce'
The debate over funding for renewable energy research (see below) hinges in some ways on a simple question: Can we count on private companies to invest adequately in new inventions?
Across industries, the answer to this question seems to be "no." Although the potential rewards are great, so are the costs, as well as the chances of coming up with a big fat zilch. Companies, like people, are risk-averse; losing a million on a dud of a research project feels a lot worse than making a million on a winner. And what if you invent something that has nothing to do with your business? Most companies don't even bother.
So, I was interested to see that Nathan Myhrvold, erstwhile founder of Microsoft Research, is starting a private company devoted entirely to invention. Evan Schwartz writes in Technology Review:
The new venture, Myhrvold says, has no mission other than to invent what the inventors believe should be—or can be—invented. "Invention is the secret sauce," Myhrvold says. "It has the highest concentration of value compared with any task in a company. But because it’s so risky, it also has the lowest amount of focused effort." Showing what can happen when that effort is intensified is Myhrvold’s main reason for creating the laboratory, which he is funding in part from his own Microsoft-made fortune.
What a coincidence. I'm funding Snarkmarket in part from my Microsoft-made fortune.
April 15, 2004
Tales from Deep Space
That shape would fit with data from NASA's Wilkinson Microwave Anisotropy Probe, a satellite the size of a car, launched in 2001, now stationed at a point in space between the Earth and the Sun. It monitors cosmic background radiation, the energy left over from the Big Bang.
Let's just be clear: It's a Volvo launched into space that has successfully made its way to a little gravitational cradle a million miles away, and now it's detecting heat waves 13 billion years old.
And some scientists, looking at that data, are like, "Dude, the universe is totally a horn."
How is this stuff even possible??
Anyway, the horn shape is pretty cool. The flare at the end has the spatial properties of the video game "Asteroids": If you fly out beyond the edge of the universe, you just appear on the other side.
It's still just a hypothesis. Meanwhile, the WMAP satellite is out there, tethered between terra and stella, staring into space, gathering more data.
April 14, 2004
You know that bit in my last post about people being all gung ho about a technology at first, then glimpsing the consequences of its misuse and reflexively banning not the misuse, but the technology?
April 6, 2004
Go On, Get Outta Here, Find Somebody You Don't Already Know
Forget the global ultra-computer stuff. It's all about your former college roommates.
I like this blog note by David Weinberger, who was at a social software conference at Microsoft last week:
Shelley Farnham of Microsoft Research talks about the social goals of social software: To have meaningful relationships with friends. Research shows that we use technology primarily to interact with our friends, not strangers. ...
It's so true. If I sort my e-mail messages by sender, it looks like this: Friend X, Friend X, Friend X, Friend X, Friend X, Friend X, Friend X, ... some dude ... Friend Y, Friend Y, Friend Y, Friend Y, Friend Y, Friend Y, etc., etc.
That's sort've a banal observation, but I don't know that it's occurred to me before. I have often thought and spoken of the Web in terms of its ability to weave disparate people together. It's the Global Village, dawg! But the blogs I check most frequently -- and with the greatest interest -- are my friends'.
Steven Johnson, a top-notch science journalist also at the Microsoft conference, is afraid that too many people are behaving like me. Weinberger paraphrases:
Then he talks against the echo chamber idea: The Net is an echo chamber compared to what, he asks incredulously? TV? Even if you just follow bloggers in your general universe of interests, you're still following links out to more diverse ideas than ever before. He points out that the criticism used to be that the Net was nothing but flame wars. Now the criticism is that it's echo chambers. But, he worries, we are creating these social network tools in order to decrease our contact with others.
I'm not really worried. Friends are our primary connection to new ideas and new people. Sure, I get a lot of good stuff from the NYT Mag; but I think I get even more from my friend Penny. And the NYT Mag never introduced me to anybody.
And it's always extra good because Penny knows me, knows what I like -- and not just in a shallow "Robin ... likes ... articles about ... robots" kinda way. Penny knows my sensibility; and that's something only friends -- not super-computers, not magazine editors -- can claim.
The Age of Google
Follow the links in this chain:
Google announces a free e-mail service on April 1. And the joke is, it's not a joke.
How can Google give everybody a free gigabyte? Easy -- a side-effect of having the world's biggest, coolest search engine is that you have the world's biggest, coolest computer. Rich Skrenta expounds:
Google is a company that has built a single very large, custom computer. It's running their own cluster operating system. They make their big computer even bigger and faster each month, while lowering the cost of CPU cycles. It's looking more like a general purpose platform than a cluster optimized for a single application.
While competitors are targeting the individual applications Google has deployed, Google is building a massive, general purpose computing platform for web-scale programming.
This computer is running the world's top search engine, a social networking service, a shopping price comparison engine, a new email service, and a local search/yellow pages engine. What will they do next with the world's biggest computer and most advanced operating system?
But what good is a huge general-purpose computer without general data to purposefully process? Google has that, too. Jason Kottke writes:
So. They have this huge map of the Web and are aware of how people move around in the virtual space it represents. They have the perfect place to store this map (one of the world's largest computers that's all but incapable of crashing). And they are clever at reading this map. Google knows what people write about, what they search for, what they shop for, they know who wants to advertise and how effective those advertisements are, and they're about to know how we communicate with friends and loved ones. What can they do with all that? Just about anything that collection of Ph.Ds can dream up.
(His version has all sorts of links and stuff. Check it out.)
And then Kottke, looking forward, asks: "Who needs Windows when anyone can have free unlimited access to the world's fastest computer running the smartest operating system?"
Famously, one of Google's central corporate directives is "Don't be evil." Can a world-spanning ultra-computer that knows all and tells all stick to that credo? Or might computational power corrupt the same way that political power does?
And what about a positive corollary to that rule: "Do good"? Can the power of a Google-sized computer help us solve our real problems?
(Shout-out to Penny for all the links.)
March 31, 2004
The End of the World
Google has redesigned. Always humorous to see the folks on MeFi go all headless chicken. Much like they did the last time Google "redesigned" the two words on its home page. Or the time UPS swapped their ugly brown logo for an ugly brown logo. Or -- ack -- the time Poynter Online got prettier.
How is it that with all the drama in the world, we have space in our minds to bug out whenever some company we have little personal connection to changes a font on its website?
March 26, 2004
This article promises more than it delivers, but it's worth reading anyway. It's about efforts to induce and detect emotion in video game players. The argument is undeniable — if game consoles could figure out players' emotional responses, they could make games much more effective. The article talks briefly about some experiments at a Scottish university to detect emotion by measuring the force of button depression, for example.
While a game that played off your emotions would be hella cool and probably equally creepy, I think developers need to take it back a step. I haven't played a game in years that drew me in emotionally the way Final Fantasy II (IV in Japan) did a dozen years ago, or old text adventures like Wishbringer years before that. From the games I've played recently, I'd argue that developers need to rediscover the art of story. Then they can talk to me about sophisticated emotion detection systems.
March 23, 2004
The Voice Inside (And Just Below) Your Head
I don't want to make this into a gee-whiz-lookit-that kinda blog here, but this is so 2001 it hurts:
It turns out that when you read or talk silently to yourself -- you know, the ol' internal monologue -- your brain actually still sends subtle signals to your tongue and vocal cords. They're not strong enough to make you start gabbing, but they are strong enough for a computer to detect. A super-smart NASA computer.
So NASA now has the beginnings of a system to recognize this "subvocal speech," which they will use to, you know, control rovers and stuff.
But mark my words -- mark my words! -- in the year 2025 we're all going to be wearing little dingbats on our necks and mumbling to our cars and cell phones.
(Thanks to Gizmodo for the link!)
March 15, 2004
After I read this Wired article, I was grooving on DARPA for a good little while. If you haven't heard about DARPA's Grand Challenge, here's the dish. They posted a prize of $1 million for any engineering team that could make an unmanned vehicle capable of driving from L.A. to Las Vegas. The Grand Challenge was a race for all the qualifying vehicles, to see which was the best.
Such a great idea, right? There's no way to spark innovation like a contest. The favored teams would all spend two and three times $1 million to build their vehicles anyway, so this was all about the thrill and prestige of victory. And DARPA could pick and choose from all the technological wonderworks these teams would dream up to make something truly revolutionary.
And on top of it all, DARPA's Grand Challenge website was fun and happy-looking; not at all what you'd expect from some stuffy government project. The FAQ included down-home humor, like: "it is expected that most teams will modify existing off-road vehicles for the Challenge, although who knows what could slither or crawl across the starting line."
Well, the race was Saturday. None of the vehicles got even eight miles past the starting point. So, it was kind of a bust. But good times were had.
Still kind of grooving on DARPA ... until I read this article. Eerie reminder of Total Information Awareness. Reports of DARPA's plans to build a giant floating surveillance blimp to watch entire cities and track individual civilians. And, creepiest of all (to me at least), notes about DARPA's research into technology that can grow and heal itself.
The rational, naive side of me says, "No, this is good. Self-repairing humanoid machines can clearly only be used for totally benign purposes, and will of course remain at all times under human control, despite DARPA's efforts to produce military technology that can mimic humans' heuristic capacities and awareness of their environment." The even more rational, and now utterly paranoid, side of me says, "The Terminator is now governing California."
Step three: panic!
March 11, 2004
'Printing' Houses, Layer by Layer
This sounds rad:
A robot for "printing" houses is to be trialled by the construction industry. It takes instructions directly from an architect's computerised drawings and then squirts successive layers of concrete on top of one another to build up vertical walls and domed roofs.
It'd be like a mud hut for the 21st century. A really, umm, laborious and complicated mud hut. But it could have weird, curvy walls!
February 6, 2004
My Right Hemisphere Made Me Type This
So I'm reading The Blank Slate: The Modern Denial of Human Nature by Steven Pinker, whose books I have always really enjoyed.
I'm only forty pages in, and already it's blowing my mind.
Okay, so you know how we think of ourselves as "I"? That is, even if you subscribe to the notion that the mind is just a bunch of biochemical processes in the physical brain (as I do) you still tend to think of it as the mind. There's a particular you, a central nexus that gets sensory input, makes decisions, and all that.
Except there's totally not. Check this out:
One of the most dramatic demonstrations of the illusion of the unified self comes from the neuroscientists Michael Gazzaniga and Roger Sperry, who showed that when surgeons cut the corpus callosum joining the cerebral hemispheres, they literally cut the self in two, and each hemisphere can exercise free will without the other one's advice or consent. Even more disconcertingly, the left hemisphere constantly weaves a coherent but false account of the behavior chosen without its knowledge by the right. For example, if an experimenter flashes the command "WALK" to the right hemisphere (by keeping it in the part of the visual field that only the right hemisphere can see), the person will comply with the request and begin to walk out of the room. But when the person (specifically, the person's left hemisphere) is asked why he just got up, he will say, in all sincerity, "To get a Coke"--rather than "I don't really know" or "The urge just came over me" or "You've been testing me for years since I had the surgery, and sometimes you get me to do things but I don't know exactly what you asked me to do." Similarly, if the patient's left hemisphere is shown a chicken and his right hemisphere is shown a snowfall, and both hemispheres have to select a picture that goes with what they see (each using a different hand), the left hemisphere picks a claw (correctly) and the right picks a shovel (also correctly). But when the left hemisphere is asked why the whole person made those choices, it blithely says, "Oh, that's simple. The chicken claw goes with the chicken, and you need a shovel to clean out the chicken shed."
The spooky part is that we have no reason to think that the baloney-generator in the patient's left hemisphere is behaving any differently from ours as we make sense of the inclinations emerging from the rest of our brains.
Whoahhh! I love this stuff. For a while during college (okay, like two weeks) I was totally going to go into cognitive science. How did I end up in lame old economics instead?
January 21, 2004
I have nothing additional to say about silica aerogel save that I wouldn't mind a blanket made of the stuff.
(That's another Collision Detection link, yo.)
January 13, 2004
Who's the Actor?
Really interesting MetaFilter thread on how the Oscars should handle the potential nomination of Gollum. If they wanted to give out a Best Supporting Actor nod, who gets it -- Serkis or the animators?
Obligatory link to Gollum's MTV Movie Award appearance (Quicktime, 8 megs).
December 17, 2003
7" x 50" x $1,000
I am not usually one to get all snarked up about new home entertainment technology. But this New York Times article by John Markoff about a new digital TV chip from Intel is pretty crazy:
[Some analyst] predicted that the low-cost display technology, which can be incorporated into the traditional rear-projection television sets, could lead to lightweight 50-inch screens only 7 inches thick for about $1,000, perhaps as early as the 2004 holiday season.
Sure, they wouldn't be "plasma" screens. Which is too bad, because "plasma" anything is ultra-cool. (This is, by the way, a "plasma" blog.) But still, those are some niiice dimensions.
December 16, 2003
Seven Days of Creation
Something about this Wired article totally grabbed me. Well, the headline and deck hed are pretty arresting in combo, but then the article itself did this spectacular job of drawing me into this little dark room with these two scientists, poking at eggs under a microscope. Somehow, the writer gets away with using science jargon without turning me off. I read all the way through. I learned a bit, too. Now I'm all interested in seeing how these experiments turn out.
December 12, 2003
"Monkeyboy" No Longer Just a Harmless Insult
In Friday's New York Times, Nicholas Wade writes of a quest to distinguish man from chimp:
The project received a lift two years ago when a large London family with barely intelligible speech was found to have mutations in a gene called FOXP2. Chimpanzees also have a FOXP2 gene, but it is significantly different. The human version shows signs of accelerated evolutionary change in the last 100,000 years, suggesting that the gene acquired a new function that helped confer the gift of speech.
I think we're still underestimating nature's hand in the nature vs. nurture tug-of-war. Yes, genetic differences within the human family are minuscule -- but not insignificant. I'll bet it's a bit of a burden to have a gene that makes it more difficult to speak, ya know?
November 14, 2003
Bet you didn't realize Snarkmarket was running on one of these, did you?
Actually, it's funny, there's a line in that article -- about IBM's Blue Gene/L supercomputer -- that goes "it will use no more power than the average home," and I expected the next word to be "computer" or "refrigerator" or something. But no. It will use no more power than the average home, period.
But that's the kind of power we need. Snarkmarket's brand of ten-dimensional political and social analysis makes protein folding look like frickin' Pong.