Universe-Hunting

I admit it: I pre-ordered Stephen Wolfram’s A New Kind of Science on Amazon.com back in the day… got it the day it came out… and was totally bewildered. I ended up selling it to a used book store.

But I still like the core ideas, to the extent I understand them, which is not much. The crude version is: Stephen Wolfram likes cellular automata, or simple rulesets that, when run recursively, produce interesting and surprisingly complex results, especially when you extend them into two, three, or more dimensions. In fact, he thinks all of math and science (!) has fallen too deeply in thrall to the equation — not necessarily a very “natural” thing — and has completely missed the potential analytic and explanatory power of cellular automata.
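
To make “simple rules, run recursively” a little more concrete, here’s a minimal sketch of a one-dimensional cellular automaton in Python (Rule 110, one of the classic elementary rules). The grid width, number of steps, and the text rendering are arbitrary choices for illustration, not anything taken from the book:

```python
# Minimal elementary cellular automaton sketch (Rule 110).
# Width, step count, and rendering are arbitrary illustration choices.

RULE = 110
WIDTH = 64
STEPS = 32

def step(cells, rule=RULE):
    """Apply an elementary CA rule to one row of 0/1 cells."""
    new = []
    for i in range(len(cells)):
        left = cells[i - 1]                # wraps around at the left edge
        center = cells[i]
        right = cells[(i + 1) % len(cells)]  # wraps around at the right edge
        index = (left << 2) | (center << 1) | right
        new.append((rule >> index) & 1)
    return new

row = [0] * WIDTH
row[WIDTH // 2] = 1                        # a single live cell in the middle
for _ in range(STEPS):
    print("".join("#" if c else "." for c in row))
    row = step(row)
```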

Anyway, the point is, it’s provocative even if I don’t really get it, and so is his latest blog post:

Of course, as early theologians pointed out, the universe clearly has some order, some “design”. It could be that every particle in the universe has its own separate rule, but in reality things are much simpler than that.

But just how simple? A thousand lines of Mathematica code? A million lines? Or, say, three lines?

If it’s small enough, we really should be able to find it just by searching. And I think it’d be embarrassing if our universe is out there, findable by today’s technology, and we didn’t even try.

Of course, that’s not at all how most of today’s physicists like to think. They like to imagine that by pure thought they can somehow construct the laws for the universe–like universe engineers.

So it’s basically theory via Google: Instead of deducing the laws of the universe, you arrive at them via computational brute force. Just try every combination of simple rules you can think of ’til you get something that looks like physics! Easy!
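
(To be clear, what follows isn’t Wolfram’s actual procedure, just a toy Python version of the “search the rule space” idea: try all 256 elementary cellular automaton rules and keep the ones that don’t do something trivially boring. The “interesting” filter is invented purely for illustration; the real hunt would need a test for something that looks like physics.)

```python
# Toy brute-force search over the 256 elementary CA rules.
# The "interesting" filter below is invented purely for illustration.

def evolve(rule, width=31, steps=20):
    """Run an elementary CA from a single live cell; return every row."""
    row = [0] * width
    row[width // 2] = 1
    history = [row]
    for _ in range(steps):
        row = [
            (rule >> ((row[i - 1] << 2) | (row[i] << 1) | row[(i + 1) % width])) & 1
            for i in range(width)
        ]
        history.append(row)
    return history

def looks_interesting(history):
    """Crude filter: the pattern neither dies out nor freezes in place."""
    died_out = sum(history[-1]) == 0
    froze = history[-1] == history[-2]
    return not died_out and not froze

survivors = [rule for rule in range(256) if looks_interesting(evolve(rule))]
print(f"{len(survivors)} of 256 rules pass this (very crude) filter")
```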

Great images in the post, too, as always. Wolfram famously self-published his book (actually, it’s even better: He founded a new company to publish it) because he couldn’t find any existing publishers willing or able to reproduce his illustrations at the resolution he demanded. Awesome.

3 comments

I can do you one better, Robin. I still have my copy. I think I made it through a chapter or so, and then looked at the pictures—which weren’t quite so spectacular as I was anticipating.

Does anyone else find something jarring or surprising in Wolfram taking a swing at conventional physicists? Maybe if he were limiting his critique to pure theoreticians I would have less of a problem, but how can a guy who uses computers to simulate how a world might possibly work, based on rules he has made up, claim that even empiricist experimentalists rely on “pure thought”?

Isn’t Wolfram just as guilty of relying on pure thought? He makes up simple rules and sees what their consequences are. How is that better than a theorist who designs a model and tries to see if its implications meet reality?

Still, there is something interesting in the distinction that Wolfram is drawing. It is similar to debates in the 1960s and ’70s over the best way to deal with ecological data as part of the US effort in the International Biological Program. The two main sides were split over how to model ecosystems.

One side privileged accuracy and precision. They wanted the model to behave like the world, or at least to spit out numbers precise enough to match measurements made in the field. Their method relied on massive data sets, from which they derived statistical models involving many undefined (and undefinable) variables.

The problem with this approach, according to its detractors, was that it looked like the world but was in no way illuminating. It was just as complex and opaque as the world was without scientific intervention. They preferred to sacrifice a bit of accuracy in favor of building a model based on a limited number of physically defined variables.

So they determined the factors that would most affect ecosystems and tried to model them, tweaking the model until it looked a bit like reality.

Which was better? The model that achieved greater accuracy, or the model that could help explain which variables were most significant and why?

That’s a really good point, although I think Wolfram would say that the simple, focused models can come from the ‘brute force’ method too.

In fact, as I think about it here, he’s advocating sort of the OPPOSITE of the statistical approach you describe: Rather than start with complexity and try to work his way back to simplicity, he starts with simplicity and tries to work up to (familiar-looking) complexity. And he does that many many times.

I mean, more than anything else it sounds like the million-monkeys/million-typewriters approach to Shakespeare, right? (Has anybody ever actually tried that? I mean, not with monkeys, but with computer programs. I feel like it would be totally do-able today.)
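
(For what it’s worth, the arithmetic on the monkeys is brutal, which is sort of Wolfram’s point: blind search only pays off when the target is genuinely small. A quick back-of-the-envelope sketch, with an arbitrary 27-character alphabet and a short phrase chosen just for illustration:)

```python
# Rough math on the monkeys-and-typewriters version of brute force.
# Alphabet size and phrase are arbitrary choices for illustration.
alphabet = 27                              # 26 letters plus a space
phrase = "to be or not to be"              # 18 characters

possibilities = alphabet ** len(phrase)    # equally likely strings of that length
print(f"~{possibilities:.1e} random strings of length {len(phrase)}")
# Prints roughly 5.8e+25: hopeless for Shakespeare, which is why the bet
# only works if the thing you're hunting for is small.
```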

And I don’t know, there’s something appealing about it: In the same way that a lot of recent web thinking, e.g. Google’s, has revolved around this realization that processing speed and disk space — long the fundamental constraints of computer science! — are now ridiculously cheap & abundant, it seems to me that Wolfram is assessing a radically new landscape of tools and saying hey, maybe this ought to change the way we do this work.

(Wow, that was a convoluted sentence.)
