Here’s another Netflix list from Friend of the Snarkmatrix Matt Penniman! —RS
As a supplement to Tim’s list, I thought I might offer the following. It attempts to catalog the history of science fiction in film. More specifically: it features films that take a scientific possibility or question as their central premise.
20,000 Leagues Under the Sea (1916) – deep sea life
Metropolis (1927) – robotics, dehumanization
Gojira (1954) – mutation
The Fly (1958) – hybridization
La jetée (1962) – time travel
Planet of the Apes (1968) – evolution
Solaris (1972) – alien intelligence
Close Encounters of the Third Kind (1977) – alien intelligence
Mad Max (1979) – post-apocalypse society
Blade Runner (1982) – robotics
Aliens (1986) – biological weapons
Terminator 2: Judgment Day (1991) – robotics, time travel
Ghost in the Shell 2.0 (1995) – robotics, networked information
Bonus selections:
Robot Stories (2004)
Moon (2009)
Robin is absolutely right: I like lists, I remember everything I’ve ever seen or read, and I’ve been making course syllabi for over a decade, so I often find myself saying “If you really want to understand [topic], these are the [number of objects] you need to check out.” Half the fun is the constraint of it, especially since we all now know (or should know) that constraints = creativity.
Looking to do some sort of survey on film history. Any sort of open curriculum out there like this that runs in tandem with Netflix Instant?
I quickly said, “I got this,” and got to work.
See, trying to choose over the set of every film ever made is ridiculously hard. Choosing over a well-defined subset is both easier and more useful.
Also, I knew I didn’t want to pick the best movies ever made, or my favorites, or even the most important. Again, that pressure, it’ll cripple you. I wanted to pick a smattering of films such that if you watched any sufficiently large subset of them, you’d know a lot more about movies than when you started.
This is actually a lot like trying to design a good class. You’re not always picking the very best examples of whatever it is you’re talking about, or even the things that you most want your students to know, although obviously both of those factor into it. It’s much more pragmatic. You’re trying to pick the elements that the class is most likely to learn something from, that will catalyze the most chemistry. It’s a difficult thing to sort, but after you’ve done it for a while, it’s like driving a car, playing a video game, or playing a sport — you just start to see the possibilities opening up.
Then I decided to add my own constraints. First, I decided that I wasn’t going to include any movies after the early 1970s. You can quibble about the dates, but basically, once you get to the Spielberg-Scorsese-Coppola-Woody Allen generation of filmmakers — guys who are older but still active and supremely influential today — movies are basically recognizable to us. Jaws or Goodfellas or Paris, Texas are fantastic, classic, crucial movies, but you don’t really have to put on your historical glasses to figure them out and enjoy them, even if they came out before you were of movie-going age. The special effects are crummier, but really, movie-making just hasn’t changed that much.
Also, I wasn’t going to spend more than a half-hour putting it together. I knew film history and Netflix’s catalog well enough to do it fast, fast, fast.
And so, this was the list I came up with. As it happened, it came to a nice round 33.
I made exactly one change between making up the list and posting it here, swapping out David Lynch’s Eraserhead for Jean-Luc Godard’s Breathless. I cheated a little with Eraserhead — it’s a late movie that was shot over a really, really long period of time in the 70s and came out towards the end of that decade. And Breathless isn’t Godard’s best movie, but it’s probably the most iconic, so it was an easy choice.
There are huge limitations to this list, mostly driven by the limitations of the catalog. Netflix’s selection of Asian and African movies, beyond a handful of auteurs like Akira Kurosawa, isn’t very good. There’s no classic-period Hitchcock. There’s no Citizen Kane. There aren’t any documentaries or animated films. And you could argue until you’re blue in the face about picking film X over film Y with certain directors or movements or national cinemas.
But you know what? You wouldn’t just learn something from watching these movies, or just picking five you haven’t seen before — you would actually have fun. Except maybe Birth of a Nation. Besides its famous pro-Ku Klux Klan POV, that sucker is a haul. Happy watching.
I’m not going to recount the long insomniac thought trail that led me here, but suffice it to say I ended up thinking about mission statements early this morning. Google’s came immediately to mind: To organize the world’s information and make it universally accessible and useful. I’m not sure what Twitter’s mission statement might be, but a benign one didn’t take too long to present itself: To enable a layer of concise observations on top of the world. (Wordsmiths, have at that one.)
I got completely stuck trying to think of a mission for Facebook that didn’t sound like vaguely malevolent marketing b.s. To make everything about you public? To connect you with everyone you know?
When I read Zadie Smith’s essay as an indictment of Facebook – its values, its defaults, and its tendencies – rather than the “generation” it defines, her criticisms suddenly seem a lot more cogent to me. I realized that I actually am quite ambivalent about Facebook. I thought it was worth exploring why.
I was thinking about the ways social software has changed my experience of the world. The first world-altering technology my mind summoned was Google Maps (especially its mobile manifestation), and at the thought of it, all the pleasure centers of my brain instantly lit up. Google Maps, of course, has its problems, errors, frustrating defaults, troubling implications – but these seem so far outweighed by the delights and advantages it’s delivered over the years that I can unequivocally state I love this software.
I recently had an exchange with my friend Wes about whether Google Maps, by making it so difficult to lose your way, also made it difficult to stumble into serendipity. I walked away thinking that what Google Maps enabled – the expectation that I can just leave my house, walk or drive, and search for anything I could want as I go – enabled much more serendipity than it forestalled. It’s eliminated most of the difficulties that might have prevented me from wandering through neighborhoods in DC, running around San Francisco, road-tripping across New England. And it demands very little of me, and imposes very little upon me. (One imposition, for example: All the buildings I’ve lived in have been photographed on Street View. I’m happy to abide by this invasion of privacy, because without it, I wouldn’t have found the place I live in today.) For me, Google Maps is basically an unalloyed social good.
Google has been very prolific with these sorts of products – things that bring me overwhelming usefulness with much less tangible concern. Google Search itself is, of course, a masterpiece. News Search, Gmail, Reader, Docs, Chrome, Android, Voice – even failed experiments such as Wave – I find that these things have heightened what I expect software to do for me. They have made the Internet more useful, information more accessible, and generally, life more pleasurable.
I was trying to think of a Facebook product that ameliorated my life in some similar way, and the first thing to come to mind was Photos. Facebook Photos created for me the expectation that every snapshot, every captured moment, would be shared and tagged for later retrieval. At my fifth college reunion, I made a point of taking photos with every classmate I wanted to reconnect with on Facebook. When I go home and tag my photos, I told my buddies, it will remind you that we should catch up. And it worked like a charm! I reconnected with dozens of old friends on Facebook, and now I see their updates scrolling by regularly, each one producing a tinge of warmth and good feelings.
But the dark side of Facebook Photos almost immediately presented itself as well. For me, the service has replaced the notion of a photograph as a shared, treasured moment with the reality of a photograph as a public event. I realized all of a sudden that I can’t remember the last time I took a candid photo. Look through my photos, and even those moments you might call “candid” are actually posed. I can’t sit for a picture without expecting that the photo will be publicized. Not merely made public – my public Flickr stream never provoked this sense – publicized. And although this is merely a default, easily overridden, to do so often feels like an overreaction. To go to a friend’s photo of me and untag myself, or to make myself untaggable, feels like I’m basically negating the purpose of Facebook Photos. The product exists so these images might be publicized. And increasingly, Facebook seems to be what photos are for.
Of course that’s not true. I also suddenly realized that I’ve been quietly stowing away a secret cache of images on my phone – a shot of Bryan sleeping, our cat Otis in a grocery bag, an early-morning sunlit sky – that are quickly becoming the most treasured images I possess, the ones I return to again and again.
Perhaps Facebook Photos has made my private treasure trove more valuable.
I use Facebook Photos as an example first because it’s the part of the service that’s most significantly altered my experience of the world, but also because I think it reflects something about the software’s ethos. That dumb, relentless publicness of photos on Facebook doesn’t have to be the default. Photos, by default, could be accessible only to users tagged in a set, for example, not publicized to all my friends and their friends. I’m not even sure that’s an option. (My privacy settings allow most users to see only my photos, not photos I’m tagged in. But I’m not sure what that even means. When another friend shares a photo publicly, and I’m tagged in it, I’m fairly certain our friends see that information.)
Facebook engineered the photo-sharing system in such a way as to maximize exposure rather than, say, utility. For Facebook, possibly, exposure is utility.* I think that characterizes most of the choices that underpin Facebook’s products. With most of the other social software products I use – the Google suite, WordPress, Twitter, Flickr, Dropbox, etc. – I am constantly aware of and grateful for the many ways the software is serving me. With Facebook, I’m persistently reminded that I am always serving it – feeding an endless stream of information to the insatiable hive, creating the world’s most perfect consumer profile of myself.
I don’t trust Google for a second, but I value it immensely. I trust Facebook less, and I’m growing more ambivalent about its value.
I don’t think I want to give up Facebook. I value the connections it offers, however shallow they are. I enjoy looking at photos of my friends. I like knowing people’s birthdays.
But I am wary of it, its values and its defaults. How it’s changing my expectations and my experience of the world.
* Thought added post-publication.
Sorry; this snippet from Matt’s second-day liveblog/Twitter curation of the conversation at PubCamp blew my mind a little bit:
Matt Thompson: One of the most frequent issues NPR.org users have is not being able to find something on our website. The vast majority of the time, that’s because they heard something on their local programming and are searching for it on the national site. If we had shared authentication across the system, we would be able to recognize other stations users authenticate with and show them local content.
So simple, but so powerful.
You’ve got to fine-tune just how local you get to match user expectations, though:
Matt Thompson: Discussion turns to users’ qualms over things like the Open Graph, turning on WaPo.com, for example, and suddenly seeing your friends’ names all over the page. How does the Washington Post know who my friends are?
But we quickly come back to the simple-but-powerful stuff again:
Matt Thompson: I asked for my pony: a registration system that would just keep track of what I’d read on the site, then let me know when those stories were updated/corrected.
I think we almost need to bring it back to the user end and offer something like a hybrid between the “Private Browsing/Incognito” mode that’s started to get incorporated into web browsers and the browser extension FlashBlock, which disables Flash ads and videos except when you whitelist them.
Call it “SocialBlock” (which sounds way more fun than it actually is). I browse with my identity intact, carrying it with me, but can select which sites/services I offer it to. And it’s just a quick click to turn it on or off.
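The idea above is simple enough to sketch: a per-site whitelist with one master toggle. This is a minimal illustration only — “SocialBlock” is hypothetical, every name here is invented, and a real extension would hook into the browser’s request pipeline rather than a plain function call:

```python
# Minimal sketch of the whitelist logic a hypothetical "SocialBlock"
# extension might keep. All class and method names are invented for
# illustration; a real extension would intercept requests and strip
# identity (cookies, logins) before they leave the browser.

class SocialBlock:
    def __init__(self):
        self.enabled = True     # the "quick click" master on/off switch
        self.whitelist = set()  # sites you've chosen to offer your identity to

    def allow(self, site):
        self.whitelist.add(site)

    def revoke(self, site):
        self.whitelist.discard(site)

    def sends_identity(self, site):
        """Should the browser attach your identity to requests for this site?"""
        if not self.enabled:
            return True         # blocking off: identity flows everywhere, as today
        return site in self.whitelist

sb = SocialBlock()
sb.allow("facebook.com")
print(sb.sends_identity("facebook.com"))  # True: you opted in
print(sb.sends_identity("wapo.com"))      # False: this site never learns who your friends are
```

The point of the sketch is the default: unlike the Open Graph scenario above, a site gets nothing unless you have explicitly whitelisted it.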
I’m sitting in the dev lounge during the last of the day’s sessions at Public Media Camp, an unconference for folks interested in public media stuff.
Fair warning: This is not going to be your standard Matt Thompson Conference Liveblog, and will possibly not be interesting in any way. I’m trying out two things: (1) live curation of Twitter (which I haven’t really done), and (2) a Snarkmarket-customized CoverItLive template that will allegedly not require you to see the title page. I’ll be very excited if this latter thing is true. Update: Not true. Still have to click to see the liveblog. Darn it.
I want to distinguish blogging from reporting, and bloggers from reporters. But more than that, I want to distinguish the first question from the second.
Blogging is pretty easy to define as an activity. It’s writing online in a serial form, collected together in a single database. It doesn’t matter whether you’re doing it as an amateur or professional, as an individual or in a group, under your own byline or a pseudonym, long-form or on Twitter.
Reporting is a little trickier, but it’s not too tough. You search for information, whether from people or records or other reports, you try to figure out what’s true, and you relay it to somebody else. Anyone can report. Teachers assign reports to elementary school students. Or you can be Woodward-and-Bernsteining it up, using every trick you can think of to track down data from as many sources as possible.
Now, both of these are different from what it means to be a blogger or a reporter. The latter are a matter of identity, not activity. I’ll offer an analogy. If someone says, “I’m a writer,” we don’t assume that they mean that they’re literate and capable of writing letters, words, or sentences. We might not assume that they’re a professional writer, but we do assume that they identify with the act of writing as either a profession, vocation, or distinguished skill. They own their action; it helps define who they are.
Likewise, if someone calls themselves (or if someone else calls them) a reporter or blogger, they might be referring to their job or career, but they’re definitely referring to at least a partial aspect of their identity. And just like we have preconceptions about what it means to be a “writer” — a kind of Romantic origin myth, of genius and personality expressed through language — we have preconceptions about what it means to be a blogger or a reporter.
They’re not just preconceptions, though, but practices codified in institutions, ranging from the law to business and labor practices to the collective assumptions and mores of a group.
There are lots of ways you could trace and track this, but let me follow one thread that I think is particularly important: the idea of the author-function.
Traditionally (by which I mean according to the vagaries of recent collective memory), reporters who are not columnists have bylines, but are not seen as authors. Their authority instead accrues to their institution.
If we read a story written by a typical reporter, we might say “did you see ____ in the New York Times?” If other newspapers or journalistic outlets pick up the story, if they attribute it at all, they’ll say, “According to a report in the New York Times…” This is similar to medical and scientific research, where journalists will usually say, “scientists at MIT have discovered…”
Some people within this field are different. If Paul Krugman writes something interesting, I probably won’t say “the New York Times”; I’ll say “Paul Krugman.”
In fact, there’s a whole apparatus newspapers use in order to distinguish writers I’m supposed to care about from writers I’m not. A columnist’s byline will be bigger. Their picture might appear next to their column.* They might write at regular intervals and appear in special sections of the paper. This is true in print or online.
(*This was actually one of the principal ways authorship was established in the early modern period: including an illustration of the author. Think about the famous portraits of Shakespeare. Sometimes to be thrifty, printers would reuse and relabel woodcuts: engravings of René Descartes were particularly popular, so a lot of 17th-century authors’ pictures are actually Descartes.)
Blogs do basically the same thing. Quick: name me three bloggers besides Josh Marshall who write for Talking Points Memo. If you could do it, 1) you’re good, and 2) you probably know these people personally, or at least through the internet.
These guys and girls are bloggers, they’re reporters, they’re opinionated, they have strong voices, and some of them are better than others. But I don’t know what they look like; if they followed me on Twitter tomorrow, I probably wouldn’t recognize their names. Josh Marshall, the impresario, is an author of the blog in a way that his charges are not. Or to take another example, Jason Kottke — whose writing is nearly as ego-less as it can probably get in terms of style, but who still is the absolute author of his blog.
The Atlantic, for better or worse (I think better), took an approach to blogging that foregrounded authorship: names, photos, and columns. There are “channels” through which lots of different people write, and sometimes you pick their names and voices out of the stream, but they’re not Andrew Sullivan, Ta-Nehisi Coates, James Fallows, Megan McArdle, Jeffrey Goldberg, Alexis Madrigal, et al., or Ross Douthat, Matt Yglesias and co. before them.
Now all of these writers tackle different topics and work in different styles, but they’re all authors. Their blogs are written and held together through the force of their names and personalities. Sullivan has a team of researchers/assistants, Coates has a giant community of commenters, Alexis has a crew of rotating contributors. It doesn’t matter; it’s always their blog.
The one person who never quite fit into this scheme was Marc Ambinder. Early on, when the first group of bloggers came in, it made more sense. For one thing, almost all of them wrote about politics and culture. They each had a slightly different angle — different ages, different political positions, different training. Ambinder’s schtick was that he was a reporter. It seemed to make as much sense as anything else.
As time went on, the blogs became less and less about politics in a recognizable sense. Ta-Nehisi Coates started writing about the NFL, Jim Fallows increasingly about China and flying planes. And then the Atlantic started putting author pictures up, by the posts and on the front page.
I remember sometime not long ago seeing Ambinder’s most recent photo on TheAtlantic.com and saying to myself, “I know what Marc Ambinder looks like, and that’s not Marc Ambinder.” He wasn’t wearing his glasses. He’d lost a ton of weight — later I’d find out he’d had bariatric surgery. He found himself embroiled in long online arguments where he was called out by name about his politics, his sexuality, his relationships.
Here’s somebody who by dint of professional training and personal preference simply did not want to be on stage. He didn’t want people looking at him. He didn’t want to talk about himself. He couldn’t be a personality like Andrew Sullivan or Ta-Nehisi Coates, or even a classically-handsome TV anchor talking head WITH personality like Brian Williams or Anderson Cooper. He wanted to do his job, represent his profession and institution, and go home.
I’m sympathetic, because I find it just as hard to act the opposite way. By training and disposition, I’m a writer, not a reporter. I’ve had to learn repeatedly what it means to represent an institution rather than just my own ideas and sensibilities — that not every word that appears under my byline is going to be the word I chose. The vast majority of people I meet and interact with don’t care who I am or what I think, just the institution I write for.
That’s humbling, but it’s powerful, too. Sometimes, it’s appealing. One of the things I love about cities is the anonymity you can enjoy: I could be anybody and anybody could be me. If you identify with it and take it to its limit, adopting those values as yours, it’s almost impossible to turn around and do the other thing.
So far, we have lived in a world where most of the bloggers who have been successful have done so by being authors — by being taken seriously as distinct voices and personalities with particular obsessions and expertise about the world. And that colors — I won’t say distorts, but I almost mean that — our perception of what blogging is.
There are plenty of professional bloggers who don’t have that. (I read tech blogs every day, and couldn’t name you a single person who writes for Engadget right now.) They might conform to a different stereotype about bloggers. But that’s okay. I really did write snarky things about obscure gadgets in my basement while wearing pajama pants this morning. But I don’t act, write, think, or dress like that every day.
Last week, Marc Ambinder reached the end of his tenure as a politics blogger for the Atlantic, and toasted the event with a thoughtful post on the nature of blogging. The central nugget:
Really good print journalism is ego-free. By that I do not mean that the writer has no skin in the game, or that the writer lacks a perspective, or even that the writer does not write from a perspective. What I mean is that the writer is able to let the story and the reporting process, to the highest possible extent, unfold without a reporter’s insecurities or parochial concerns intervening. Blogging is an ego-intensive process. Even in straight news stories, the format always requires you to put yourself into the narrative. You are expected to not only have a point of view and reveal it, but be confident that it is the correct point of view. There is nothing wrong with this. As much as a writer can fabricate a detachment, or a “view from nowhere,” as Jay Rosen has put it, the writer can also fabricate a view from somewhere. You can’t really be a reporter without it. I don’t care whether people know how I feel about particular political issues; it’s no secret where I stand on gay marriage, or on the science of climate change, and I wouldn’t have it any other way. What I hope I will find refreshing about the change of formats is that I will no longer be compelled to turn every piece of prose into a personal, conclusive argument, to try and fit it into a coherent framework that belongs to a web-based personality called “Marc Ambinder” that people read because it’s “Marc Ambinder,” rather than because it’s good or interesting.
My esteemed coblogger tweeted some terrific observations about Ambinder’s post:
@mthomps @robinsloan Now you can blog and be a reporter in a different way from how Ambinder & The Atlantic think of those two things.
@mthomps @robinsloan But Ambinder’s (& others’) conception of “reporter” & Atlantic’s (& others’) conception of blogging are incompatible.
I expect when Tim has more than 140 characters, he’ll nod to the fact that The Atlantic’s website actually encompasses many different ideas of what blogging means – from Andrew Sullivan’s flood of commentless links and reader emails to Ta-Nehisi Coates’ rollicking salons to Ambinder’s own sparsely-linked analyses. And beyond the bounds of the Atlantic there are so many other ideas, as many types of blogs as there are types of books, and maybe more – Waiter Rant to Romenesko to Muslims Wearing Things to this dude’s LiveJournal to BLDGBLOG.
That Ambinder’s essay doesn’t really acknowledge this – that it seems so curiously essentialist about a format that’s engendered so much diversity – disappoints me, because he’s such a thoughtful, subtle writer at his best. His sudden swerve into the passive voice – “You are expected to not only have a point of view” – briefly made me worry that he intends to become one of those print journalists who uses the cloak of institutional voice to write weaselly ridiculous phrases such as “Questions are being raised.”
It puzzles me that the same fellow who wrote that “a good story demolishes counterarguments” would casually drop the line, “Really good print journalism is ego-free.” “What I mean,” Ambinder says, “is that the writer is able to let the story and the reporting process, to the highest possible extent, unfold without a reporter’s insecurities or parochial concerns intervening.” I think I know what type of long-form journalism he’s referring to – there’s a wonderful genre of stories that make their case with a simple, sequential presentation of fact after unadorned fact. The Looming Tower. The Problem from Hell. David Grann’s stunning “Trial by Fire” in the New Yorker.
But there’s an equally excellent genre of journalism that foregrounds the author’s curiosities, concerns and assumptions – James Fallows’ immortal foretelling of the Iraq War, Atul Gawande’s investigation of expenditures in health care. This is ego-driven reporting, in the best possible way. For every Problem from Hell, there’s another Omnivore’s Dilemma. Far from demolishing counterarguments, Ambinder’s mention of “ego-free journalism” instantly summons to mind its opposite.
Likewise, his contention that “blogging is an ego-intensive process” has to grapple with the fact that some of the best blogging is just the reverse. It doesn’t square with examples such as Jim Romenesko, whose art is meticulously effacing himself from the world he covers, leaving a digest rich with voice and judgment so veiled you barely even notice someone’s behind it. In fact, contra Ambinder, I’ve found that one of the most difficult types of blogging to teach traditional reporters is this very trick of being a listener and reader first, suppressing the impulse to develop your own take until you’ve surveyed others and brought the best of them to your crowd. Devoid as it is of links, non-Web journalism often fosters a pride of ownership that can become insidious – a constant race to generate information that might not actually help us understand the world any better, but is (1) new and (2) yours. Unchecked, that leads inevitably to this.
In just the way Marc Ambinder’s post wasn’t necessarily an attack on blogging, this isn’t necessarily a defense of it, or an attack on traditional journalism. If Ambinder recast his musings on blogging in a slightly different way, I’d actually agree with him wholeheartedly. If, as I’ve been arguing in this post, the form is flexible enough to encompass so many approaches, that means every choice contributes to a blog’s unique identity. Perhaps more than any other publishing/broadcasting format, a blog is a manifestation of the choices and idiosyncrasies of its authors.
And I think this is what Ambinder’s experience reflects – his choices and his idiosyncrasies. He chose to blog about national politics – an extraordinarily crowded (and particularly solipsistic) field. To distinguish himself from the crowd, he chose to craft a persona known for its canny insider’s pose and behind-the-scenes insights. I think it was a terrific choice; I’ve enjoyed his Atlantic writing a lot. But there’s little essential about the format that compelled him to this choice.
The title of this post is, of course, facetious. (Although I’d kind of love it if the pointless “Who’s a journalist” debates gave way to pointless “Who’s a blogger” ones.) Of course Marc Ambinder was a blogger – he tended to a series of posts displayed on the Web in reverse-chronological order. Beyond that, there are common patterns and proven techniques, but very few rules. Print imposes more constraints, but some folks find a sort of freedom in that. I hope Marc Ambinder does, and I hope to read the product.