August 27, 2009

Albert and Kurt

Albert and Kurt, via Nerdboyfriend.

This is my preferred vision of the all-knowing creator figure. He must a) have hair like that, and b) wear a nice unassuming blue sweatshirt.

SERIOUS QUESTION: Would this have been a fun conversation to be in? Like, reflected glow of fame aside, were these guys actually enjoyable to talk to? Any anecdotes or insights?

- Robin
Posted August 27, 2009 at 4:16 | Comments (5) | Permasnark
File under: Science

July 11, 2009

Hyperlexia

I had never heard of this disorder before:

In hyperlexia, a child spontaneously and precociously masters single-word reading. It can be viewed as a superability, that is, word recognition ability far above expected levels... Hyperlexic children are often fascinated by letters and numbers. They are extremely good at decoding language and thus often become very early readers. Some hyperlexic children learn to spell long words (such as elephant) before they are two and learn to read whole sentences before they turn three. An fMRI study of a single child showed that hyperlexia may be the neurological opposite of dyslexia.[2]

Often, hyperlexic children will have a precocious ability to read but will learn to speak only by rote and heavy repetition, and may also have difficulty learning the rules of language from examples or from trial and error, which may result in social problems... Their language may develop using echolalia, often repeating words and sentences. Often, the child has a large vocabulary and can identify many objects and pictures, but cannot put their language skills to good use. Spontaneous language is lacking and their pragmatic speech is delayed. Hyperlexic children often struggle with Who? What? Where? Why? and How? questions... Social skills often lag tremendously. Hyperlexic children often have far less interest in playing with other children than do their peers.

The thing is, this absolutely and precisely describes me in childhood, especially before the age of 5 or 6. (This is also the typical age when hyperlexic children begin to learn how to interact with others.) It also describes my son - which is how my wife found the description and forwarded it to me.

You walk around your entire life with these stories, these tics, and the entire time, your quirks are really symptoms. It's a little strange.

- Tim
Posted July 11, 2009 at 8:17 | Comments (0) | Permasnark
File under: Books, Writing & Such, Braiiins, Language, Learnin', Science, Self-Disclosure

July 8, 2009

Swimming Out Of The Death Spiral

Tim says,

And now for a note on the dark side of printed books: Michael Jensen, Director of Strategic Web Communications for National Academies and National Academies Press, collects and analyzes data about global warming and ecological collapse. At the AAUP meeting in Philadelphia, he presented "Scholarly Publishing in the New Era of Scarcity," an argument that the combination of financial and environmental necessity compels university presses to move away from printing, shipping, and storing books and towards a digitally driven, open-access model, with print-on-demand and institutional support rounding out the new revenue model.

(I'm posting Part 2 of Jensen's speech - the part that's mostly about publishing - here. Watch Part 1 - which is mostly about the environment - if you want to be justly terrified about what's going to happen to human beings and everything else pretty soon.)

This is one reason I'm kind of happy that we didn't print a thousand or more copies of New Liberal Arts. We can make print rare, we can get copies straight to readers, we can make print more responsible, but mostly we have to make print count. And - of course - share the information with as many people as possible.

Comments (2) | Permasnark | Posted: 10:44 AM

July 4, 2009

Evolution 2.0 (and 3.0 beta)

This is kind of a cool idea. Let's say that evolution writ large is only accidentally about the preservation, transmission, and development of living species, but essentially about the preservation, transmission, and development of information. On this view, organisms are just a means to an end, particularly well-adapted couriers for all of this chemical data.

If that's the case, then maybe there isn't anything particularly special about the specific form of that data (i.e. DNA) or the way it's been transmitted in humans (sexual reproduction). That's just one way of doing things - in nonconscious, nonverbal, or nonhistorical species, genetic transmission, instinct, and inherited traditions are the only means you've got. But once modern humans arrive on the scene, with all their increasingly sophisticated means of representing information, then Evolution 1.0, internal transmission of information, isn't the only game in town -- you've also got Evolution 2.0, characterized by the external transmission of information.

Once you reframe evolution in this way, then you can say that our species' rate of evolution "over the last ten thousand years, and particularly... over the last three hundred" is actually off the charts.

So the guy who's arguing this is a physicist named Stephen Hawking. (Maybe you've heard of him - he's awfully smart, and was part of Al Gore's Vice Presidential Action Rangers.) He also says that our tinkering with evolution ain't over:

[W]e are now entering a new phase, of what Hawking calls "self designed evolution," in which we will be able to change and improve our DNA. "At first," he continues "these changes will be confined to the repair of genetic defects, like cystic fibrosis, and muscular dystrophy. These are controlled by single genes, and so are fairly easy to identify, and correct. Other qualities, such as intelligence, are probably controlled by a large number of genes. It will be much more difficult to find them, and work out the relations between them. Nevertheless, I am sure that during the next century, people will discover how to modify both intelligence, and instincts like aggression."

If the human race manages to redesign itself, to reduce or eliminate the risk of self-destruction, we will probably reach out to the stars and colonize other planets. But this will be done, Hawking believes, with intelligent machines based on mechanical and electronic components, rather than macromolecules, which could eventually replace DNA based life, just as DNA may have replaced an earlier form of life.

I can't decide if this is totally anthropocentric, or exactly the opposite. But it's kind of exciting, isn't it? I'm evolving the species right now, just by typing this! And so are you, by reading it! And so are Google's nanobots, by recording all of it in their fifteenth-gen flash brains!

- Tim
Posted July 4, 2009 at 5:59 | Comments (1) | Permasnark
File under: Books, Writing & Such, Language, Science, Technosnark

July 2, 2009

Geeking Out, c. 1990

Tim says,

[Image: HP's iPhone version of the HP-12C financial calculator]

I love this; Hewlett-Packard is selling an exact copy of its HP-12C financial calculator for the iPhone.

The iPhone version of the HP-12C is a near carbon copy of the actual machine. It not only looks the same, but it actually runs the same code as do the physical calculators. The iPhone version is actually a bit better than just a clone of the original, though, because HP includes a simplified portrait-mode calculator (the 12C is a landscape-mode device). When used in portrait mode, you can use the number keys, along with all the usual math operators and a couple of other functions such as square roots and memory—perfect for those times when you just need a basic calculator.

The real power of the HP-12C is found when you rotate your iPhone to landscape mode; what appears on the screen then is a photographic reproduction of the actual HP-12C calculator, complete with the gold-brown-orange-blue color scheme that made the original so…endearing? Because the app uses the actual calculator’s code, absolutely everything works just like it does on the real calculator.

I used a calculator just like this to win a middle school mathematics competition - in those days, it was called a "Calculator Competition," because you could (gasp!) use a calculator. There was a school-wide thing, then a regional, and then a state final; it was a whole thing. The state final was the first time I'd ever seen a graphing calculator; that shiz blew my mind.

Comments (2) | Permasnark | Posted: 5:21 AM

July 1, 2009

Volcano, Meet Cloud; Cloud, Volcano

[Image: the volcanic plume photographed from the International Space Station]

Um, wow:

A plume of smoke, ash and steam soars five miles into the sky from an erupting volcano.

The extraordinary image was captured by the crew of the International Space Station 220 miles above a remote Russian island in the North Pacific.

The round hole in the clouds is thought to have been caused by the shockwave of the initial explosion. At the centre lies the billowing mushroom tower of grey and brown ash.

For volcano experts, the most exciting part of the image is the layer of smooth white cloud that caps the plume - a little like a layer of snow on a mushroom.

This cap of condensed air is created from the rapid rising and then cooling of the air directly above the ash column. When moist, warm air rises quickly it creates a cloud.

- Tim
Posted July 1, 2009 at 11:24 | Comments (1) | Permasnark
File under: Beauty, Media Galaxy, No Comment, Science

Language Is A Technology That Restructures Language

Lera Boroditsky has a super-interesting essay at Edge on her work empirically testing the proposition that language structures thought. (Blërg - resisting urge to... blockquote.... sigh.)

So Boroditsky's got some clever tests, including asking speakers/writers of different languages to arrange pictures chronologically (languages written in Roman script tend to arrange chronology from left to right, Hebrew from right to left, and fascinatingly, the Kuuk Thaayorre in Australia do it from east to west), and testing which adjectives speakers of languages with gendered nouns assign to those nouns - Germans think keys (masculine in German) are hard and jagged and bridges (feminine) are slender and beautiful, while Spanish speakers (whose gender assignations switch the nouns) correspondingly flip the associations.

But... okay, look. I believe in this thesis. But the tests to my mind are not conclusive evidence. Here's why.

You can't get into a person's head.

Is it that simple? It is.

Because (stay with me) all of these tests don't show that speakers of different languages think differently, but that they represent thought differently. The way we write changes the way we talk, and the way we represent thought in space. The way we talk also changes the way we write. And the way we talk changes the way we talk. You don't have any evidence - at least, any evidence that doesn't assume the premise - that Germans actually THINK bridges are more graceful or beautiful than Spaniards do - just that they're more likely to attach adjectives with feminine associations to feminine nouns. What this suggests immediately is that language is a complex and interconnected system where terms and kinds group together, and small linguistic changes actually trigger a series of different linguistic associations and values. It DOESN'T immediately prove that language structures thought - understood as something independent from its representation.

Because if language is the vocal and visual representation of concepts, then ALL of Boroditsky's tests are instances of language. Language structures language. And once you assume unproblematically that language directly represents thought, then you naturally discover that thought and language are inseparable. Which is what was to be shown. But this is logically a tautology - even if the empirical specifics of how that tautology manifests itself are fascinating.

Let me reframe this, then. What I think these experiments show is that in moments where we may think we are simply registering our pure and unmediated experience of the world, we're really on auto-pilot - language is in fact doing our "thinking" for us. But this kind of not-quite-thinking doesn't automatically deserve to be called "thought" at all.

- Tim
Posted July 1, 2009 at 11:11 | Comments (3) | Permasnark
File under: Braiiins, Language, Science

June 19, 2009

Jonah Lehrer and The Fourth Culture

I should have read Jonah Lehrer's Proust Was a Neuroscientist a long time ago. Jeffrey J Cohen's excerpt at In the Middle just bumped it to the top of my list. Here's Cohen:

Whereas C. P. Snow argued in 1959 that we require a third culture, one that bridges the noncommunicating realms of art and science, those scientists who have self-appointed themselves as this culture (especially Steven Pinker) carry a fair amount of animus towards the humanities, believing it enough if they communicate their science directly to the masses. Lehrer argues that not only does such a third culture misrepresent what Snow imagined, it often gets the humanists wrong (and misapprehends their artistic sources as well) by not having listened or read attentively. Lehrer therefore calls for a fourth culture, a space of true collaboration, and it is that call I'd like to quote this morning.

And here's Lehrer (as quoted by JJC):

[A fourth culture] seeks to discover the relationships between the humanities and the sciences. This fourth culture, much closer in concept to Snow's original definition (and embodied by works like [Ian McEwan's] Saturday), will ignore arbitrary intellectual boundaries, seeking instead to blur the lines that separate. It will freely transplant knowledge between the sciences and humanities, and will focus on connecting the reductionist fact to our actual experience. It will take a pragmatic view of truth, and it will judge truth not by its origins but by its usefulness. What does this novel or experiment or poem or protein teach us about ourselves? How does it help us to understand who we are? What long-standing problem has been solved? ... While science will always be our primary method of investigating the universe, it is naïve to think that science alone can solve everything itself, or that everything can even be solved ... When we venture beyond the edge of our knowledge, all we have is art ... No knowledge has a monopoly on knowledge.

For my part -- and it's taken me a looooong time to come around to this view -- I think one of the paradigmatic approaches to this problem of disciplinary edges is to spend a lot of time thinking about media. You simply HAVE to think about the brain, the body, culture, languages and codes, history, society, politics, commerce. Guys like Pinker want to settle old scores, spend a lot of time worrying about relativism. The people who are thinking seriously about media (inside and outside of the academy) have already moved on.

- Tim
Posted June 19, 2009 at 6:17 | Comments (1) | Permasnark
File under: Braiiins, Learnin', Media Galaxy, New Liberal Arts, Science

June 8, 2009

La Gaya Scienza

[Image: the painted ceiling of the old lecture hall of the Austrian Academy of Sciences]

According to Jonathan Jarrett, the whole humanities vs. science contention is (at least in part) an artifact of the English language:

This here is the ceiling of the old lecture hall of the Austrian Academy of the Sciences, at least as it translates into English. But, what's the French or German for science? `Science', `Wissenschaft', respectively, both of which also mean just `knowledge'. All the Romance languages have some version of Latin `scientia', which likewise means just `knowledge'. And that's what the artwork here was painted to express, wisdom being handed down by teachers and on tablets to a romantic and fascinated world. All kinds of knowledge.

The idea that science means the Popperian world of reproducibility, experiment and testing, by contrast, is modern and English. It's slowly being enforced on other languages' academies, but it's not something that people in the Middle Ages, where geometry was one of the Liberal Arts, or even the nineteenth century, would have recognised. Even now, the German-speaking states almost all have their Akademie der Wissenschaften, France has the Centre National des Recherches Scientifiques and Spain the Consejo Superior de Investigaciones Científicas, and these are the premier research institutions of the humanities in their respective lands. But in Britain, which I know best, the current split between the Arts & Humanities Research Board, now Council, and the Engineering and Physical Sciences Council, previously the Science and Engineering Research Council and previously the Science Research Council, goes essentially back to the difference between the Royal Society, founded 1660 in some form, and the British Academy, founded 1902. I don't know what the equivalent bodies in the USA would be but it would be an interesting comparison. [Note: My guess would be the American Academy of Arts and Sciences. --TC]

Elsewhere we don't have to have this separation, and one of the most interesting things about Snow's piece is therefore its potential to explain why in fact we do. And, indeed, it's pleasant to see that some people have used Science! and graphs and maps to argue that in fact, we don't, we just think we do. As a computing-in-the-humanities sort of guy, I can get behind that.

I don't absolutely buy this, but I think there is something to it. When I translate "Wissenschaft," I sometimes use "science," but more often I find myself writing "scholarship" - which is about as close as English gets to a word covering both the humanities and the sciences in a traditional liberal-artsy sense.

More to the point, I think the science/humanities divide is less a difference in the way Anglo-Americans and continental Europeans think about the humanities than a difference in the way we think about science.

In the US, at least, nearly ALL science is seen as applied science -- that is, closer to the PRACTICE of engineering, or medicine, than it is to history or sociology or (god forbid) comparative literature. None of those things can build a bridge or whup those Communists. But if you start to talk about "research," or especially "scholarship," then you start to see commonalities. Someone doing medical research, even for a for-profit purpose, is in a different business from someone working in a clinical practice, just as a lawyer is different from a law professor.

The beef with the humanities seems to be that there are no corresponding practitioners, no practical applications -- with the possible exceptions of K-12 teachers and professional writers (journalists, novelists, historians who write for trade presses). Couple that with a rump humanism that actively valorizes the uselessness, timelessness, and universality of the arts, and you get some misunderstandings at best and real problems at worst.

The shift that's happening seems to be with the younger generation of culture workers. (Here I'm relying in part on Alan Liu's thesis in The Laws of Cool.) One reason why I think the idea of Liberal Arts 2.0 / digital humanism seems to have some traction is that the work younger people do includes more of what we would traditionally call the humanities, and is governed by an ethos that is closer to what we would call humanism. If we begin to think of our technological galaxy as a media galaxy, then we start to see some clearer points of overlap between science culture and humanities culture.

Somewhere Friedrich Kittler points out that there's only been one time before now that the entire West was governed by the same information technologies. That was during the European Middle Ages, when the university's technologies of the book, the library, the postal service, the lecture, etc. were pretty much the only games in town. If you get bifurcated discourse networks, you'll get a bifurcated culture. You can't just try to understand a cultural rift; it will only close once its precondition changes.

- Tim
Posted June 8, 2009 at 3:56 | Comments (0) | Permasnark
File under: Books, Writing & Such, Learnin', New Liberal Arts, Science, Worldsnark

May 31, 2009

The F-Double-Prime Equation Of Love

Mathematician Steven Strogatz, guest-blogging for Olivia Judson:

In all cases, the business of theoretical physics boils down to finding the right differential equations and solving them. When Newton discovered this key to the secrets of the universe, he felt it was so precious that he published it only as an anagram in Latin. Loosely translated, it reads: "it is useful to solve differential equations."

The silly idea that love affairs might progress in a similar way occurred to me when I was in love for the first time, trying to understand my girlfriend's baffling behavior. It was a summer romance at the end of my sophomore year in college. I was a lot like the first Romeo above, and she was even more like the first Juliet. The cycling of our relationship was driving me crazy until I realized that we were both acting mechanically, following simple rules of push and pull. But by the end of the summer my equations started to break down, and I was even more mystified than ever. As it turned out, the explanation was simple. There was an important variable that I'd left out of the equations — her old boyfriend wanted her back.

In mathematics we call this a three-body problem. It's notoriously intractable, especially in the astronomical context where it first arose. After Newton solved the differential equations for the two-body problem (thus explaining why the planets move in elliptical orbits around the sun), he turned his attention to the three-body problem for the sun, earth and moon. He couldn't solve it, and neither could anyone else. It later turned out that the three-body problem contains the seeds of chaos, rendering its behavior unpredictable in the long run.
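
Strogatz's "simple rules of push and pull" translate directly into a pair of coupled differential equations. Here's a minimal sketch in Python - the coefficients and the crude forward-Euler integration are my own choices for illustration, not Strogatz's:

    # Romeo warms up when Juliet is interested (R' = a*J); Juliet backs
    # off when Romeo comes on strong (J' = -b*R). Pure push and pull.
    a, b = 1.0, 1.0
    R, J = 1.0, 0.0                  # initial feelings
    dt = 0.001
    steps = int(6.283 / dt)          # roughly one full cycle
    for i in range(steps):
        R, J = R + a * J * dt, J - b * R * dt
        if i % (steps // 8) == 0:
            print(f"R = {R:+.2f}  J = {J:+.2f}")
    # The pair orbits endlessly around mutual indifference (0, 0) --
    # the maddening cycling Strogatz describes.

(Add a third equation for the old boyfriend and you have a three-body problem - which, as Strogatz says, is exactly where the math stops cooperating.)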

Guess we shouldn't toss DiffEq just yet.

(Via the Radiolab blog.)

- Tim
Posted May 31, 2009 at 9:54 | Comments (0) | Permasnark
File under: Science, Society/Culture

Dating the Past

Historiscientific nerd alert: There's a hot new method of dating historical artifacts, specifically ceramic artifacts, based on their moisture uptake. But there's at least one big problem -- it assumes that mean temperatures are constant. HNN's Jonathan Jarrett has the goods, in a passage so well-linked that I've cut-and-pasted it whole. (I also changed some of the punctuation and split Jarrett's long paragraph into a few short ones.)

Now, you may have heard mention of a thing called "the medieval warm period." This is a historical amelioration of temperature in Europe between, roughly, the tenth and twelfth centuries. This probably decreased rainfall and other sorts of weather bad for crops, therefore boosted agricultural yield, pumped more surplus into the economy, fuelled demographic growth and arguably deliquesced most European societies to the point where they changed in considerable degree.

However, because of the current debate on climate change, it has become a ball to kick around for climate "scientists," those who wish to argue that we're not changing the climate pointing to it and ice coverage in Norse-period Greenland (which was less than there is currently despite less carbon dioxide in the atmosphere then), while those who wish to argue that we are changing the climate (and, almost always, that this relates to CO2 output, which does seem like a weak link in the argument) dismiss it as legend or scorn the very few and unscientific datapoints, not really caring that the historical development of European society in the ninth to eleventh centuries just doesn't make sense without this system change from the ground. None of these people are medievalists and they're not trying to prove anything about the Middle Ages, so it gets messy, but there is a case about this temperature change that has to be dealt with.

This obviously has an impact on this research. If the sample were old enough, the errors and change probably ought to balance out. But if it were, from, say, the eighth century, then the moisture uptake in the four or five subsequent centuries would be higher than expected from the constant that this research used and the figure would be out, by, well, how much? The team didn't know: "The choice of mean lifetime temperature provides the main other source of uncertainty, but we are unable to quantify the uncertainty in this temperature at present."

We, however, need to know how far that could knock out the figures. Twenty years? More? It begins to push the potential error from a single sample to something closer to a century than a year. That is, the margin of historical error (as opposed to mathematical error) on this method could be worse than that of carbon-dating, and we don't actually know what it is.
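
To get a feel for how big that knock could be: the method fits mass gain against the fourth root of time, with a rate constant that depends on temperature. Assuming a standard Arrhenius-type temperature dependence - the activation energy below is an illustrative guess, not the team's figure - a single degree of error in the assumed mean temperature already shifts the inferred age by tens of percent:

    import math

    R_GAS = 8.314     # gas constant, J/(mol K)
    EA = 50000.0      # activation energy, J/mol -- illustrative assumption

    def age_error_factor(t_assumed_c, t_true_c):
        """If mass gain = k(T) * age**0.25 and k(T) = A * exp(-EA / (R*T)),
        then inferred_age / true_age = (k(T_true) / k(T_assumed)) ** 4."""
        ta = t_assumed_c + 273.15
        tt = t_true_c + 273.15
        return math.exp(4 * (EA / R_GAS) * (1 / ta - 1 / tt))

    # calibrated at a 10 C mean, but the pot actually sat through 11 C:
    print(age_error_factor(10.0, 11.0))   # ~1.35: it reads ~35% too old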

Lots of good stuff in the whole, long post, including an annotated run-down of ALL of the ways we know how to date old things.

- Tim
Posted May 31, 2009 at 7:15 | Comments (0) | Permasnark
File under: Learnin', Object Culture, Science, Worldsnark

May 28, 2009

Saheli Datta's thoughts: One point I often carp on is that 99% of summary statistics quoted would be better replaced wi... >>

What Kinds of Math Do We Need?

Biologists are debating how much quantitative analysis their field needs; at Language Log, Mark Liberman pivots to linguistics:

The role of mathematics in the language sciences is made more complex by the variety of different sorts of mathematics that are relevant. In particular, some areas of language-related mathematics are traditionally approached in ways that may make counting (and other sorts of quantification) seem at least superficially irrelevant — these include especially proof theory, model theory, and formal language theory.

On the other hand, there are topics where models of measurements of physical quantities, or of sample proportions of qualitative alternatives, are essential. This is certainly true in my own area of phonetics, in sociolinguistics and psycholinguistics, and so on. It's more controversial what sorts of mathematics, if any, ought to be involved in areas like historical linguistics, phonology, and syntax...

Unfortunately, the current mathematical curriculum (at least in American colleges and universities) is not very helpful in accomplishing this — and in this respect everyone else is just as badly served as linguists are — because it mostly teaches things that people don't really need to know, like calculus, while leaving out almost all of the things that they will really be able to use. (In this respect, the role of college calculus seems to me rather like the role of Latin and Greek in 19th-century education: it's almost entirely useless to most of the students who are forced to learn it, and its main function is as a social and intellectual gatekeeper, passing through just those students who are willing and able to learn to perform a prescribed set of complex and meaningless rituals.)

My thoughts are still inchoate on this, so I'll throw it open -- is calculus 1) a waste of time for 80-90% of the folks who learn it, 2) unfairly dominant over the rest of useful mathematics, 3) one of the great achievements of the modern mind that everyone should know about, or 4) all of the above?

More to the point -- what kinds of maths (as they say in the UK) have you found to be most valuable to your later life, work, thinking, discipline, whatever?

And looking to the future - I don't think we have a mathematics entry as such in the New Liberal Arts book-to-come; but if we did, what should it look like?

- Tim
Posted May 28, 2009 at 7:13 | Comments (4) | Permasnark
File under: New Liberal Arts, Science

May 25, 2009

Jake's thoughts: @Dan I agree it's a (great) think-piece, and we're still just watching the previews before the EP... >>

The Soul of American Medicine

If I ever meet Atul Gawande, I'm giving him a high-five, a hug, and then I'm going to try to talk to him for about fifteen minutes about why I think he's special. From "The Cost Conundrum," in the new New Yorker:

No one teaches you how to think about money in medical school or residency. Yet, from the moment you start practicing, you must think about it. You must consider what is covered for a patient and what is not. You must pay attention to insurance rejections and government-reimbursement rules. You must think about having enough money for the secretary and the nurse and the rent and the malpractice insurance...

When you look across the spectrum from Grand Junction [Colorado] to McAllen [Texas]—and the almost threefold difference in the costs of care—you come to realize that we are witnessing a battle for the soul of American medicine. Somewhere in the United States at this moment, a patient with chest pain, or a tumor, or a cough is seeing a doctor. And the damning question we have to ask is whether the doctor is set up to meet the needs of the patient, first and foremost, or to maximize revenue.

There is no insurance system that will make the two aims match perfectly. But having a system that does so much to misalign them has proved disastrous. As economists have often pointed out, we pay doctors for quantity, not quality. As they point out less often, we also pay them as individuals, rather than as members of a team working together for their patients. Both practices have made for serious problems...

Activists and policymakers spend an inordinate amount of time arguing about whether the solution to high medical costs is to have government or private insurance companies write the checks. Here’s how this whole debate goes. Advocates of a public option say government financing would save the most money by having leaner administrative costs and forcing doctors and hospitals to take lower payments than they get from private insurance. Opponents say doctors would skimp, quit, or game the system, and make us wait in line for our care; they maintain that private insurers are better at policing doctors. No, the skeptics say: all insurance companies do is reject applicants who need health care and stall on paying their bills. Then we have the economists who say that the people who should pay the doctors are the ones who use them. Have consumers pay with their own dollars, make sure that they have some “skin in the game,” and then they’ll get the care they deserve. These arguments miss the main issue. When it comes to making care better and cheaper, changing who pays the doctor will make no more difference than changing who pays the electrician. The lesson of the high-quality, low-cost communities is that someone has to be accountable for the totality of care. Otherwise, you get a system that has no brakes.

- Tim
Posted May 25, 2009 at 6:26 | Comments (6) | Permasnark
File under: Cities, Recommended, Science, Snarkpolicy

May 13, 2009

It Is Not Logical

Tim says,

Andrew Hungerford -- aka the smartest, funniest dramatist * astrophysicist = lighting director you should know -- has written the best post on the physical holes in the new Star Trek movie that I think can be written.

Basically, almost nothing in the movie makes sense, either according to the laws established in our physical universe or the facts established in the earlier TV shows and movies.

Wherever possible, Andy provides a valiant and charitable interpretation of what he sees, based (I think) on the theory that "what actually happened" is consistent with the laws of physics, but that these events are poorly explained, characters misspeak, or the editing of the film is misleading. (I love that we sometimes treat Star Trek, Star Wars, etc., like the "historical documents" in Galaxy Quest -- accounts of things that REALLY happened, but that are redramatized or recorded and edited for our benefit, as opposed to existing ONLY within a thinly fictional frame.)

If you haven't seen the movie yet, you probably shouldn't read the post. It will just bother you when you're watching it, like Andy was bothered. If you have, and you feel like being justifiably bothered (but at the same time profoundly enlightened), check it out right now. I mean, now.

Comments (0) | Permasnark | Posted: 2:50 PM

May 5, 2009

Touching Each Other's Minds

[Image: film critic Roger Ebert (Getty Images, via Daylife)]

Roger Ebert on the value of a blog's readers:

This has been an education for me. No one will read all the comments except me, but if you did, you could learn all a layman should be expected to understand about the quantum level. You would discover a defender of Intelligent Design so articulate that when he was away for a couple of days, the Darwinians began to fret and miss him. You would have the mathematical theory of infinity explained so that, while you will still be unable to conceive of infinity, you will understand the thinking involved.

The larger post is a lovely, thoughtful meditation on death, identity, and faith:

What I expect will most probably happen is that my body will fail, my mind will cease to function, and that will be that. My genes will not live on, because I have had no children. Perhaps I have been infertile. If I discover that somewhere along the way I conceived a child, let that child step forward and he or she will behold a happy man. Through my wife, I have had stepchildren and grandchildren, and I love them unconditionally, which is the only kind of love worth bothering with.

Also Van Gogh, Tintin, Walt Whitman, and Quantum Mechanics.

- Tim
Posted May 5, 2009 at 3:43 | Comments (0) | Permasnark
File under: Braiiins, Science

April 27, 2009

Google: The World's Medical Journal

A good anecdotal lead. Carolina Solis is a medical student who did research on parasitic infections caused by contaminated well water in rural Nicaragua.

Like many researchers, she plans to submit her findings for publication in a medical journal. What she discovered could benefit not just Nicaraguan communities but those anywhere that face similar problems. When she submits her paper, though, she says the doctors she worked with back in San Juan del Sur will probably never get a chance to read it.

"They were telling me their problems accessing these [journals]. It can be difficult for them to keep up with all the changes in medicine."

Hey, Matt, if you want to sink your teeth into a medical policy issue that's right up your alley, I think this is it.

There's legislation:

Washington recently got involved. Squirreled away in the massive $410 billion spending package the president signed into law last month is an open access provision. It makes permanent a previous requirement that says the public should have access to taxpayer-funded research free of charge in an online archive called PubMed Central. Such funding comes largely from the National Institutes of Health, which doles out more than $29 billion in research grants per year. That money eventually turns into about 60,000 articles owned and published by various journals.

But Democrats are divided on the issue. In February, Rep. John Conyers, D-Mich., submitted a bill that would reverse open access. HR 801, the Fair Copyright in Research Works Act, would prohibit government agencies from automatically making that research free. Conyers argues such a policy would buck long-standing federal copyright law. Additionally, Conyers argues, journals use their subscription fees to fund peer review in which experts are solicited to weigh in on articles before they're published. Though peer reviewers aren't usually identified or paid, it still takes money to manage the process, which Conyers calls "critical."

And cultural/generational change:

The pay-to-play model doesn't jive with a generation of soon-to-be docs who "grew up Google," with information no farther than a search button away. It's a generation that never got lost in library stacks looking for an encyclopedia, or had to pay a penny for newspaper content. So it doesn't see why something as important as medical research should be locked behind the paywalls of private journals.

Copyright issues are nothing new to a generation that watched the recording industry deal its beloved original music sharing service, Napster, a painful death in 2001. Last October, it watched Google settle a class-action lawsuit brought on by book publishers upset over its Book Search engine, which makes entire texts searchable. And just last week, a Swedish court sentenced four founders of the Pirate Bay Web site to a year in prison over making copyrighted files available for illegal file sharing. And now the long-familiar copyright war is spilling over into medicine.

There's even WikiDoc.

And, the article doesn't mention this, but I'll contend there's a role for journalism to play. Here's a modest proposal: allow medical researchers to republish key findings of the research in newspapers, magazines, something with a different revenue structure, and then make it accessible to everyone. Not perfect, but a programmatic effort would do some good.

Speaking of which -- what are the new big ideas on the health/medicine beat? This is such a huge issue -- it feels like it should have its own section in the paper every day.

- Tim
Posted April 27, 2009 at 9:28 | Comments (3) | Permasnark
File under: Journalism, Science, Snarkpolicy, Worldsnark

April 26, 2009

Every Day Like Paris For The First Time

Jonah Lehrer + Alison Gopnik on baby brains:

The hyperabundance of thoughts in the baby brain also reflects profound differences in the ways adults and babies pay attention to the world. If attention works like a narrow spotlight in adults - a focused beam illuminating particular parts of reality - then in young kids it works more like a lantern, casting a diffuse radiance on their surroundings.

"We sometimes say that adults are better at paying attention than children," writes Gopnik. "But really we mean just the opposite. Adults are better at not paying attention. They're better at screening out everything else and restricting their consciousness to a single focus."

This (in bold) is the money-quote, though:

Gopnik argues that, in many respects, babies are more conscious than adults. She compares the experience of being a baby with that of watching a riveting movie, or being a tourist in a foreign city, where even the most mundane activities seem new and exciting. "For a baby, every day is like going to Paris for the first time," Gopnik says. "Just go for a walk with a 2-year-old. You'll quickly realize that they're seeing things you don't even notice."

I can confirm that this is true.

Also, peep this graph charting synaptic activity + density according to age (via Mind Hacks):

[Graph: synaptic density by age, after Huttenlocher]

Apparently, that's where the real action is: contra Lehrer's article, baby brains don't actually have more neurons than adults, but way more (and way denser) synapses (aka the connections between neurons).

Also, just to free associate on the whole synapse thing: I had knee surgery a few weeks ago to repair a torn quadriceps tendon, and I'm in physical therapy now. Part of my PT involves attaching electrodes to my thigh to induce my quad to flex (this is called "reeducating the muscle").

Anyways, it is always weird to confirm that we are just made out of meat, and that if you run enough electrical current through a muscle, it'll react whether or not your brain tells it to. That's all your brain is -- an extremely powerful + nuanced router for electricity.

- Tim
Posted April 26, 2009 at 7:22 | Comments (2) | Permasnark
File under: Braiiins, Science

March 23, 2009

There's Solitary and Then There's Solitary

The other day, a group of my friends, including two other PhDs, discussed the high rate of depression among graduate students. "It's the stress," one said; "the money!" laughed another. But I made a case that it was actually the isolation, the loneliness, that had the biggest effect. After all, you take a group of young adults who are perversely wired for the continual approval that good students get from being in the classroom with each other, and then lock them away for a year or two to write a dissertation with only intermittent contact from an advisor. That's a recipe for disaster.

So I read Atul Gawande's account of the human brain's response to solitary confinement with an odd shock of recognition:

Among our most benign experiments are those with people who voluntarily isolate themselves for extended periods. Long-distance solo sailors, for instance, commit themselves to months at sea. They face all manner of physical terrors: thrashing storms, fifty-foot waves, leaks, illness. Yet, for many, the single most overwhelming difficulty they report is the 'soul-destroying loneliness,' as one sailor called it. Astronauts have to be screened for their ability to tolerate long stretches in tightly confined isolation, and they come to depend on radio and video communications for social contact...

[After years of solitary, Hezbollah hostage Terry Anderson] was despondent and depressed. Then, with time, he began to feel something more. He felt himself disintegrating. It was as if his brain were grinding down. A month into his confinement, he recalled in his memoir, "The mind is a blank. Jesus, I always thought I was smart. Where are all the things I learned, the books I read, the poems I memorized? There's nothing there, just a formless, gray-black misery. My mind's gone dead. God, help me."

He was stiff from lying in bed day and night, yet tired all the time. He dozed off and on constantly, sleeping twelve hours a day. He craved activity of almost any kind. He would watch the daylight wax and wane on the ceiling, or roaches creep slowly up the wall. He had a Bible and tried to read, but he often found that he lacked the concentration to do so. He observed himself becoming neurotically possessive about his little space, at times putting his life in jeopardy by flying into a rage if a guard happened to step on his bed. He brooded incessantly, thinking back on all the mistakes he'd made in life, his regrets, his offenses against God and family.

But here's the weird part -- all of this isolation actually serves to select for a particular personality type. This is especially perverse when solitary confinement is used in prisons -- prisoners who realign their social expectations for solitary confinement effectively become asocial at best, antisocial generally, and deeply psychotic at worst.

Everyone's identity is socially created: it's through your relationships that you understand yourself as a mother or a father, a teacher or an accountant, a hero or a villain. But, after years of isolation, many prisoners change in another way that Haney observed. They begin to see themselves primarily as combatants in the world, people whose identity is rooted in thwarting prison control.

As a matter of self-preservation, this may not be a bad thing. According to the Navy P.O.W. researchers, the instinct to fight back against the enemy constituted the most important coping mechanism for the prisoners they studied. Resistance was often their sole means of maintaining a sense of purpose, and so their sanity. Yet resistance is precisely what we wish to destroy in our supermax prisoners. As Haney observed in a review of research findings, prisoners in solitary confinement must be able to withstand the experience in order to be allowed to return to the highly social world of mainline prison or free society. Perversely, then, the prisoners who can't handle profound isolation are the ones who are forced to remain in it. "And those who have adapted," Haney writes, "are prime candidates for release to a social world to which they may be incapable of ever fully readjusting."

I think we just figured out why so many professors are so deeply, deeply weird.

- Tim
Posted March 23, 2009 at 11:20 | Comments (2) | Permasnark
File under: Books, Writing & Such, Braiiins, Learnin', Recommended, Science, Snarkpolicy

February 25, 2009

That Coffin Is A Lifeboat

One of my favorite people, um, ever is Charles Olson -- poet, amateur anthropologist, rector of Black Mountain College back when BMC was quite possibly the coolest place to be in the country. (Olson reportedly said, "I need a college to think with" -- something that I often feel myself whenever I take a stab at thinking about the New Liberal Arts.)

Olson's essay/manifesto "Projective Verse" helped build the bridge between modernist and postmodern literature -- in fact, Olson's sometimes given credit for helping formulate the whole idea of the postmodern.

One of Olson's most important contributions to American letters is his book Call Me Ishmael, a wonderful, idiosyncratic but authoritative critical take on Herman Melville and Moby Dick. Here, for example, are the first few sentences:

I take SPACE to be the central fact to man born in America, from Folsom cave to now. I spell it large because it comes large here. Large, and without mercy.

Olson himself was a giant -- 6'8" -- and knew a thing or two about spelling things large. (If you want to read more, I highly recommend picking up Olson's Collected Prose -- it's all really, really good.)

Now the University of Connecticut is digitizing Olson's notes on Melville -- which would be cool in its own right, but 100% cooler insofar as Olson's notes bring back a world that doesn't exist anymore:

Olson was one of the first scholars to consider the importance of Melville's reading and marginalia.

In the 1930s, Melville's surviving literary manuscripts, letters, personal papers and journals, and reading library were still, for the most part, in the possession of the family and a few institutional or private collectors. The most substantial collection of Melville materials unaccounted for at that point—and the materials that Olson pursued most vigorously—were the "lost five hundred," the approximate number of books Melville's widow had sold to a Brooklyn dealer in 1892. As a young scholar, Olson was indefatigable in his research; when he located a volume from Melville's library in a grand-daughter's home, in a private collector's hands, or on a public library's shelves, Olson carefully transcribed onto 5 x 7-inch note cards complete bibliographic information on the volume, as well as the content and location of Melville’s annotations and reading marks. Charles Olson’s note cards are, in a few important instances, the only account of Melville’s reading marks in books whose location is now unknown. Olson’s notes also provide scholars with Melville’s marginalia in volumes currently in private hands and not readily available to scholars.

In addition to the note cards on books from Melville's library, there are two other groups of cards at the University of Connecticut. On one group of cards Olson captured his notes of interviews and recorded his astonishingly thorough methods for tracking down relatives of those known or thought to have bought books from Melville’s library. Other note cards were used by Olson to record his reading and critical notes on Melville's published works. In all, nearly 1,100 note cards survive.

Unfortunately, when Olson moved away from Melville scholarship after the publication of Call Me Ishmael (1947), he stored the results of his investigative work in a trunk in a friend's basement. Countless water leaks over the years damaged the note cards containing the transcriptions and research notes. Some cards were merely soiled; others were fused together in large blocks. After the University of Connecticut purchased the Olson papers in 1973, the note cards were stored separately while awaiting appropriate preservation measures.

That's right -- we can piece together Melville's library from soggy, seventy-five-year-old index cards.

- Tim
Posted February 25, 2009 at 1:44 | Comments (0) | Permasnark
File under: Books, Writing & Such, Learnin', New Liberal Arts, Recommended, Science

February 23, 2009

The Logic of Oscar Predictions

Nate Silver, the web's Statistician Laureate*, created a statistical model to predict the winners of the six major Oscar categories. He got four out of six right, missing Penélope Cruz for Best Supporting Actress and Sean Penn for Best Actor. In his postmortem, Silver notes that Kate Winslet's flitting in and out of the category threw off his model, but also offers a broader defense of his approach:

Ultimately, this is not about humans versus computers. The computer I used to forecast the Oscars didn't think for itself -- it merely followed a set of instructions that I provided to it. Rather, it is a question of heuristics: when and whether subjective (but flexible) judgments, such as those a film critic might make, are better than objective (but inflexible) rulesets.

The advantage in making a subjective judgment is that you may be able to account for information that is hard to quantify -- for example, Rourke's behavioral problems or the politics of Sean Penn playing a gay icon in a year where Hollywood felt very guilty about the passage of Proposition 8. The disadvantage is that human beings have all sorts of cognitive biases, and it's easy to allow these biases to color one's thinking. I would guess, for instance, that most critics would have trouble decoupling the question of who they thought should win the Oscars -- those performances they liked the best personally -- from who they thought actually would win them.
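
Silver doesn't spell out his model in the post, but the flavor of an "objective (but inflexible) ruleset" is easy to sketch: score each nominee by weighted wins at precursor awards and take the top scorer. The features, weights, and data here are my own illustration, not Silver's actual model:

    # A toy rule-based Oscar forecast. Weights and features are invented.
    WEIGHTS = {"guild": 3.0, "golden_globe": 2.0, "bafta": 1.5}

    def score(nominee):
        return sum(w for award, w in WEIGHTS.items() if award in nominee["wins"])

    best_actor = [
        {"name": "Mickey Rourke", "wins": {"golden_globe", "bafta"}},
        {"name": "Sean Penn",     "wins": {"guild"}},
    ]
    print(max(best_actor, key=score)["name"])   # the ruleset's pick

A ruleset like this makes exactly the miss Silver describes: it takes Rourke's precursor wins at face value and has no slot for the Hollywood politics that favored Penn.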

... Read more ....
- Tim
Posted February 23, 2009 at 6:51 | Comments (0) | Permasnark
File under: Movies, Science

January 3, 2009

The Happening

Matt says,

This really is Lovecraftian:

In an alarming yet little-noticed series of recent studies, scientists have concluded that Canada's precious forests, stressed from damage caused by global warming, insect infestations and persistent fires, have crossed an ominous line and are now pumping out more climate-changing carbon dioxide than they are sequestering.

This fact might be the best illustration I've seen of the unexpected consequences of climate change. "Inexorably rising temperatures are slowly drying out forest lands, leaving trees more susceptible to fires, which release huge amounts of carbon into the atmosphere." What a catastrophic chain of events. How frightening to imagine that global warming is powerful and sinister enough to co-opt the very forces that ordinarily keep it in check.

Comments (0) | Permasnark | Posted: 10:44 AM

December 20, 2008

Antikythera for Christmas

Matt can keep his Kindle -- I'll take one of these:

I seriously want to know more about the early history of astronomy. Less the sociology than the psychology of it - what was it that led humans to devote themselves to such long-term, precise observations? A belief in the power of distant gods? Boredom? The urge to find certainty somewhere, anywhere in the cosmos?

Via HNN/Ralph Luker.

- Tim
Posted December 20, 2008 at 3:18 | Comments (2) | Permasnark
File under: Learnin', Science

July 31, 2008

Life: Rich with Metaphor

Matt says,

Via Reddit:

Some anglerfishes of the superfamily Ceratiidae employ an unusual mating method. Because individuals are presumably locally rare and encounters doubly so, finding a mate is problematic. When scientists first started capturing ceratioid anglerfish, they noticed that all of the specimens were females. These individuals were a few inches in size and almost all of them had what appeared to be parasites attached to them. It turned out that these "parasites" were the remains of male ceratioids.

At birth, male ceratioids are already equipped with extremely well developed olfactory organs that detect scents in the water. When it is mature, the male's digestive system degenerates, making him incapable of feeding independently, which necessitates his quickly finding a female anglerfish to prevent his death. The sensitive olfactory organs help the male to detect the pheromones that signal the proximity of a female anglerfish. When he finds a female, he bites into her skin, and releases an enzyme that digests the skin of his mouth and her body, fusing the pair down to the blood-vessel level. The male then atrophies into nothing more than a pair of gonads, which release sperm in response to hormones in the female's bloodstream indicating egg release. This extreme sexual dimorphism ensures that, when the female is ready to spawn, she has a mate immediately available.

Comments (7) | Permasnark | Posted: 2:08 PM

June 23, 2008

Large Hadron Countdown

Matt says,

Taylor points to the Large Hadron Collider countdown clock, ticking off the seconds until Earth is destroyed by a black hole colliding with a strangelet or whatever.

Comments (1) | Permasnark | Posted: 7:58 AM

June 5, 2008

Trippy

Matt says,

"What are some mindblowing scientific concepts (proven or hypothetical)?" asks a poster on Ask MeFi. Culled from among these answers is this gem, a quotation from Donna Haraway's When Species Meet:

"I love the fact that human genomes can be found in only about 10 percent of all the cells that occupy the mundane space I call my body; the other 90 percent of the cells are filled with the genomes of bacteria, fungi, protists, and such, some of which play in a symphony necessary to my being alive at all, and some of which are hitching a ride and doing the rest of me,of us,no harm. I am vastly outnumbered by my tiny companions; better put, I become an adult human being in company with these tiny messmates. To be one is always to be one with many. Some of these personal microscopic biota are dangerous to the me who is writing this sentence; they are held in check for now by the measures of the coordinated symphony of all the others, human cells and not, that make the conscious me possible. I love that when “I” die, all these benign and dangerous symbionts will take over and use whatever is left of “my” body, if only for a while, since“we” are necessary to one another in real time."

Comments (0) | Permasnark | Posted: 6:09 PM

February 2, 2008

Natural Magic

Robin says,

Reading "The Revolution in Science 1500-1750" by A. Rupert Hall and absolutely loved this line:

Quite how the authentic philosophy of Plato [...] became the father of natural magic -- magical operations without the aid of demons -- seems to be somewhat obscure.

"Magical operations without the aid of demons"! So awesome! "Hey, uh, listen, so if you want to do some magic... but you don't have any demons... try science!"

I'm enjoying the tone of the book. Hall isn't afraid to make positive value-judgments about the scientific worldview (because, he says, that view actually does provide more useful, more complete theories about the world) but at the same time, he doesn't fail to detail all the weird, religious, dogmatic, and/or occult motivations of many early scientists: Vesalius! Mondino! Paracelsus!

Comments (4) | Permasnark | Posted: 11:15 PM

January 3, 2008

Save the Earth, Read a Paper

Matt says,

Chris Anderson does a back-of-the-envelope carbon footprint calculation for an issue of Wired vs. the same issue online. The results surprised me. (Of course, it being Chris Anderson, it's certainly not as back-of-the-envelope as it comes off; he drops some mad knowledge in the commentz.)

Comments (0) | Permasnark | Posted: 6:17 PM

September 12, 2007

Universe-Hunting

Robin says,

I admit it: I pre-ordered Stephen Wolfram's A New Kind of Science on Amazon.com back in the day... got it the day it came out... and was totally bewildered. I ended up selling it to a used book store.

But I still like the core ideas, to the extent I understand them, which is not much. The crude version is: Stephen Wolfram likes cellular automata, or simple rulesets that, when run recursively, produce interesting and surprisingly complex results, especially when you take them to two, three, or more dimensions. In fact he thinks all of math and science (!) has fallen too deeply in the thrall of the equation -- not necessarily a very "natural" thing -- and has completely missed the potential analytic and explanatory power of cellular automata.
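
If you've never seen one, here's how little machinery a cellular automaton actually needs - a minimal sketch (mine, not Wolfram's code) of the one-dimensional "elementary" kind, where a rule is nothing but an 8-entry lookup table applied over and over:

    # Rule 110: a cell's next state depends only on (left, self, right).
    # The 8 bits of the number 110 ARE the rule's lookup table.
    RULE = 110

    def step(cells):
        n = len(cells)
        return [(RULE >> (4 * cells[(i - 1) % n]
                          + 2 * cells[i]
                          + cells[(i + 1) % n])) & 1
                for i in range(n)]

    row = [0] * 40 + [1] + [0] * 40   # start from a single live cell
    for _ in range(20):
        print("".join(".#"[c] for c in row))
        row = step(row)

Twenty rows of output already show structure blooming out of a single cell, and Rule 110 in particular never settles down - it has even been proved Turing-complete.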

Anyway, the point is, it's provocative even if I don't really get it, and so is his latest blog post:

Of course, as early theologians pointed out, the universe clearly has some order, some "design". It could be that every particle in the universe has its own separate rule, but in reality things are much simpler than that.

But just how simple? A thousand lines of Mathematica code? A million lines? Or, say, three lines?

If it's small enough, we really should be able to find it just by searching. And I think it'd be embarrassing if our universe is out there, findable by today's technology, and we didn't even try.

Of course, that's not at all how most of today's physicists like to think. They like to imagine that by pure thought they can somehow construct the laws for the universe--like universe engineers.

So it's basically theory via Google: Instead of deducing the laws of the universe, you arrive at them via computational brute force. Just try every combination of simple rules you can think of 'til you get something that looks like physics! Easy!
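
And the joke is, for toy universes the brute-force search is almost runnable as-is. A sketch, where my stand-in for "looks like physics" is just "the pattern never dies out or falls into a repeating cycle" (obviously nothing like a real physical test):

    # Brute-force "universe hunting": run all 256 elementary CA rules from a
    # single live cell and keep the ones that never revisit a state, as a toy
    # stand-in for "looks like physics."
    def step(cells, rule):
        n = len(cells)
        return [
            (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
            for i in range(n)
        ]

    def interesting(rule, width=64, steps=200):
        row = [0] * width
        row[width // 2] = 1
        seen = set()
        for _ in range(steps):
            key = tuple(row)
            if key in seen:   # static or periodic: boring
                return False
            seen.add(key)
            row = step(row, rule)
        return True           # never repeated itself; worth a closer look

    print([rule for rule in range(256) if interesting(rule)])

Swap in a less laughable test and a less laughable space of rules, and you have Wolfram's research program, more or less.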

Great images in the post, too, as always. Wolfram famously self-published his book (actually, it's even better: He founded a new company to publish it) because he couldn't find any existing publishers willing or able to reproduce his illustrations at the resolution he demanded. Awesome.

Comments (3) | Permasnark | Posted: 10:13 PM

September 7, 2007

Disease Resistance for the Weekend

Robin says,

We mostly think of individuals as the units of natural selection. When it comes to disease resistance, the unit might actually be the family. That's cool.

Comments (0) | Permasnark | Posted: 4:33 PM

July 2, 2007

One Big Species

Robin says,

Over in the New York Review of Books, and apropos of nothing, Freeman Dyson talks up our biotech future. It gets pretty utopian towards the end, but it's a scintillating read all the same.

This bit is near the beginning:

[Carl Woese] is postulating a golden age of pre-Darwinian life, when horizontal gene transfer was universal and separate species did not yet exist. Life was then a community of cells of various kinds, sharing their genetic information so that clever chemical tricks and catalytic processes invented by one creature could be inherited by all of them. Evolution was a communal affair, the whole community advancing in metabolic and reproductive efficiency as the genes of the most efficient cells were shared. Evolution could be rapid, as new chemical devices could be evolved simultaneously by cells of different kinds working in parallel and then reassembled in a single cell by horizontal gene transfer.

Just a theory... but wow, what a theory!

Don't forget: previous biotech madness.

Comments (0) | Permasnark | Posted: 7:33 PM

May 17, 2007

Brain-Hurt of the Day

Matt says,

From Elizabeth Kolbert's lovely article about CERN's Large Hadron Collider:

It is one of the paradoxes of particle physics that fundamental particles, though pointlike and indivisible, are also generally unstable. In fact, the heavier particles are so short-lived that even to speak of their having an existence seems faintly ludicrous; a top quark, for example, is estimated to last no more than 1 × 10⁻²⁴ seconds. (For comparison’s sake, 1 × 10⁻²⁴ centuries comes to three millionths of a billionth of a second.)
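
(If you want to check that comparison: 10⁻²⁴ centuries is 10⁻²² years, and a year is about 3.15 × 10⁷ seconds, so you get roughly 3 × 10⁻¹⁵ seconds, i.e. three millionths of a billionth of a second. It checks out.)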
Och, there goes my head, hurting again.

Comments (0) | Permasnark | Posted: 6:47 PM

April 18, 2007

Print It Out, Fold It Up (But Only in Three Dimensions, Please)

Robin says,

Ooh: SEED has a PDF cribsheet on string theory. I didn't know I needed that 'til just now.

But wait: There are loads of these cribsheets!

Comments (0) | Permasnark | Posted: 10:34 AM

February 5, 2007

Old Man Minsky

Robin says,

An interesting (and short) interview with Marvin Minsky over in Discover this month. This passage from page two is provocative:

[Your new book] "The Emotion Machine" reads like a book about understanding the human mind, but isn't your real intent to fabricate it?

The book is actually a plan for how to build a machine. I'd like to be able to hire a team of programmers to create the Emotion Machine architecture that's described in the book -- a machine that can switch between all the different kinds of thinking I discuss. Nobody's ever built a system that either has or acquires knowledge about thinking itself, so that it can get better at problem solving over time. If I could get five good programmers, I think I could build it in three to five years.

From a little later on:

Has science fiction influenced your work?

It's about the only thing I read. General fiction is pretty much about ways that people get into problems and screw their lives up. Science fiction is about everything else.

Also, Minsky says wistfully of the old Bell Labs: "I worked there one summer, and they said they wouldn't work on anything that would take less than 40 years to execute."

Comments (0) | Permasnark | Posted: 2:14 AM

January 28, 2007

Unhappy Meals

Matt says,

Michael Pollan, whose Omnivore's Dilemma may have been my favorite book of last year, has an excellent essay in today's New York Times Magazine.

Comments (3) | Permasnark | Posted: 8:45 AM

January 8, 2007

Climate Is a Mental Construct

Matt says,

Clive Thompson asks whether the U.S. is geographically unable to perceive global warming. Of course, I'm in Minnesota in January and my lake is still liquid, which suggests the answer is "No."

Liquid!! There are still ducks on it!!

Comments (5) | Permasnark | Posted: 8:29 PM

January 3, 2007

Pharmacy Times's thoughts: I never heard of such a thing before: having to avoid certain foods if you're allergic to latex. ... >>

Soy, Yo

After being cautioned by my mom and sister over Christmas break about growing reports of the perils of soy milk, I undertook some casual Web research to assess these warnings for myself. I was dubious, of course. It's soy! It's ancient! Beloved by healthy Easterners for centuries! I defiantly munched my cheddar-flavored soy crispettes and started perusing Google.

Finding the controversy was easy enough. But further Googling ensnared me in a super-techy recursive loop of a conversation between Bill Sardi and his soy-bashing antagonists, Sally Fallon and Mary Enig. Here, at last, my techno-triumphalist, age-of-mass-culture-is-dead self started scrambling for an "authoritative" source to lead me from this thicket.

And they totally failed me. Snopes said the jury's out. The frickin' SF Chronicle delivered a novel-length shrug disguised as a news report.

Best as I can tell, largely on the strength of this 2000 FDA Consumer report, much of the controversy derives from the fact that we currently like to consume not only soy -- the protein, the marvelous whole food that makes the peanut seem one-dimensional -- but also a number of soy derivatives in pill form, as dietary supplements. These pills or powders are made from the individual components of soy (isoflavones), and holistic health sources like to bottle 'em up and sell 'em to consumers as miracle drugs. But there's no proof these components bring any health benefits on their own, and there's reasonable evidence they might bring some risks.

So the pills, not the protein, are the problem. I think. As far as I know, the FDA still allows foods that meet certain criteria to bear a label saying, "Diets low in saturated fat and cholesterol that include 25 grams of soy protein a day may reduce the risk of heart disease."

I have no particular point in sharing any of this; I just think it raises a few interesting questions. Sorry to those of you who read looking for a boffo insight at the end. For the record, I just finished a delicious bowl of Multigrain Cheerios, drowned in Silk.

mthompson-sig.gif
Posted January 3, 2007 at 6:31 | Comments (11) | Permasnark
File under: Science

January 2, 2007

Animals Dream

Matt says,

I mean, it makes sense, but I'd never really given it much thought. I remember seeing my dogs twitch in their sleep and saying, "Aww, they must be dreaming." But I guess I didn't really believe it, or I didn't really follow the thought through to its conclusion. But I find the image of a dreaming rat retracing its steps through a maze to be a little sad and, er, poignant. Am I a sap?

Comments (7) | Permasnark | Posted: 7:58 PM