Tuesday, March 31, 2009

Phases for Earthlings

This is fake. A bit. It's a composite of two objects taken on two different days using two different telescopes. The moon shot I took this evening. The tiny crescent is Venus, which I photographed on March 17. Venus is below the horizon now, so if you're looking to replicate that shot you'll have to wait a few weeks and then catch it on the upswing just before sunrise. I put them together because they're both in crescent phase, which tells us something.

Until I got into amateur astronomy about a year and a half ago I never gave the moon a second thought. Occasionally I noted that it was out in the daytime; like most cosmic provincials I did not realize that the moon is "out" during the day every bit as much as it is at night. I also did not understand that Venus is both the morning and the evening star, just not both at once. These things are obvious, though, if you think about them for even a few minutes.

When the moon is full it is opposite the sun in the sky, which means it rises when the sun sets and sets when the sun rises. Conversely, the new moon is "new" because it is between us and the sun and therefore the side facing us is lit only faintly by Earthshine, which is not nearly enough light to make it show up next to the sun. Occasionally the new moon gets precisely between the Earth and the sun and we get a solar eclipse, and occasionally the Earth gets precisely between the sun and the full moon and we have a lunar eclipse. Lunar eclipses may start at any time of day or night but they can only occur when the moon is full, so you can only see one at night.
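The rise-and-set relationship is easy to put numbers on. In a toy model with circular orbits (ignoring the tilt of the moon's orbit and every other real-world wobble), the moon's lag behind the sun grows uniformly through the synodic month. Here's a quick sketch in Python--the function name and the round numbers are mine, purely illustrative:

```python
SYNODIC_MONTH = 29.53  # days from one new moon to the next

def moonrise_lag_hours(days_since_new):
    """Hours by which moonrise trails sunrise, in a circular-orbit toy model."""
    return (days_since_new / SYNODIC_MONTH) * 24.0

for label, age in [("new", 0.0), ("first quarter", SYNODIC_MONTH / 4),
                   ("full", SYNODIC_MONTH / 2), ("last quarter", 3 * SYNODIC_MONTH / 4)]:
    print(f"{label:>13} moon rises ~{moonrise_lag_hours(age):.0f} hours after the sun")
```

The full moon trails the sun by twelve hours--rising at sunset and setting at sunrise, exactly as the geometry demands--and the first-quarter moon by six, which is why it is already high in the sky when the sun goes down.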

In between new and full moons the moon is either ahead of the sun or behind the sun in the sky (from our geocentric point of view), and by the end of one cycle the time the moon has spent in the sky during our days will equal the amount of time it has spent in the sky during our nights. If you don't believe me, go outside and look, and report back in a month.
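If you'd rather not wait a month, the bookkeeping can be simulated. This is my own toy model--12-hour days everywhere, the moon above the horizon for 12 hours at a stretch, moonrise drifting uniformly around the clock over one cycle--so don't mistake it for real ephemeris math, but it shows the balance:

```python
SAMPLES = 2000  # moments spread evenly across one synodic month

day_hours = night_hours = 0
for i in range(SAMPLES):
    frac = i / SAMPLES                      # fraction of the cycle elapsed
    moonrise = (6.0 + 24.0 * frac) % 24.0   # sun rises at 6:00 in this model
    # the moon is up for the next 12 hours; tally each one as day or night
    for h in range(12):
        t = (moonrise + h) % 24.0
        if 6.0 <= t < 18.0:                 # "daytime" is 6:00 to 18:00
            day_hours += 1
        else:
            night_hours += 1

print(day_hours, night_hours)  # very nearly equal
```

Over a full cycle the two tallies come out essentially equal, just as a month of actual looking would show.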

If you see a thin crescent moon in a dark sky it will always be fairly close to the sun, either at sunset (waxing) or sunrise (waning), and the horns of the crescent will face up into the sky and not down toward the horizon. Again, this is obvious after a moment's reflection: if the sky is dark, the sun is below the horizon, and if the moon is above the horizon then the side that is lit is the side that is "down" (horizon-wards) to an Earth-bound observer, so the horns of the crescent have to point up. It can't work any other way.

It should also be obvious that you can only see a crescent phase if an object is closer than the Earth to the sun. Think about Mars, the next planet out. Mars looks completely "full" when it is behind the Sun as seen from Earth (or would, if we could see it in the glare), and when it is opposite the sun in our sky (i.e., at the two planets' closest approach). When it is a quarter-orbit ahead or behind, it looks gibbous, but we will never see a "quarter Mars" or a "crescent Mars" as long as we are on Earth. The only objects we will ever see in those phases are Mercury, Venus, and our own moon. It should have occurred to me sooner, back at the beginning of all of this, that it is impossible to see a crescent Jupiter from Earth; such a view can only be had from a vantage point that is farther from the sun than is Jupiter!
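The no-crescent-Mars rule falls out of a little trigonometry. Assuming circular orbits, the law of cosines gives the Earth-planet distance and then the sun-planet-Earth "phase angle," and the lit fraction of the disk is (1 + cos(phase angle))/2. A hedged sketch--the function name and the round orbital radii are my own choices:

```python
import math

def illuminated_fraction(r_planet, theta_deg, r_earth=1.0):
    """Lit fraction of a planet's disk as seen from Earth (circular orbits).

    r_planet  -- planet's distance from the sun, in AU
    theta_deg -- sun-centered angle between Earth and the planet, in degrees
    """
    theta = math.radians(theta_deg)
    # Earth-planet distance, by the law of cosines
    d = math.sqrt(r_earth**2 + r_planet**2 - 2 * r_earth * r_planet * math.cos(theta))
    # Phase angle: the sun-planet-Earth angle, law of cosines again
    cos_alpha = (r_planet**2 + d**2 - r_earth**2) / (2 * r_planet * d)
    return (1 + cos_alpha) / 2

# Sweep each orbit for the planet's minimum ("most crescent") phase
for name, r in [("Venus", 0.723), ("Mars", 1.524), ("Jupiter", 5.203)]:
    k_min = min(illuminated_fraction(r, t) for t in range(360))
    print(f"{name}: minimum lit fraction = {k_min:.2f}")
```

Venus runs the whole range down to "new," but Mars bottoms out around 88 percent lit and Jupiter around 99 percent: gibbous at worst, never a crescent, exactly as the geometry above says.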

If you like, you should be able to prove all of this to yourself by drawing some orbits as concentric rings, drawing in some planets, and thinking about what phases you would see from various vantage points in different orbits. Better yet, you can go outside, look up, and watch it happen. Best of all is when you can look up and understand what you see, because you've made a little orrery out of paper or brass or pixels or in the theater of your imagination. Now you have a little piece of the cosmos whirling around in your head, and though you may forget some details or have to look up to check the phase of the moon, you will never be entirely lost in the sky again. (You may get very irritated with movies and books that include the moon for ambience, because they almost always get it wrong, and in doing so violate not just physics but also geometry.)

Why this, why now? It's the International Year of Astronomy, and all over the world stargazers are gearing up for 100 Hours of Astronomy. One of the goals is to get more people looking up than ever have in the history of our species--and, more ambitiously, to help them understand what they see, and where they are in the cosmos. It's all in honor of the 400th anniversary of Galileo turning his telescope to the heavens, which brings us back to the crescent Venus, seen at top through the roiling chemical stew of the LA basin.

Galileo was the first to see that Venus goes through a full set of phases like the moon. From this he deduced that Venus must circle the sun (right); in a geocentric cosmos it could only ever be "new" or a crescent (left); to be full in such a cosmos it would have to appear opposite the sun in the sky, like the full moon, which it never does for reasons that the diagram on the right makes clear. Galileo's discovery that four little points of light near Jupiter are in fact its moons and circle Jupiter rather than the Earth or sun often gets more press (immortalized in the name of the biggest Jupiter mission ever, for example), but the phases of Venus are easier to see and more immediately understood, and were arguably more important in sealing the coffin of the geocentric cosmos.

I often think of Galileo, squinting through his telescopes, which were atrocious by modern standards--"plagued by every aberration known to optics"--but nevertheless the first window into the real workings of the cosmos that science ever had, and I feel very humbled, and a little lost. Humbled because I am fantastically spoiled--the first telescope I ever owned is better by far than any telescope built in the first 200 years that telescopes existed--and I doubt if I would have had Galileo's perseverance, to just keep looking. Lost because I grew up in America in the late 20th century, with more information at my disposal than all previous generations of humans combined, and it still took me more than three decades to stop, look up, and notice the very basic cycles--phases of the moon, for example--that govern the rhythms of the living world, that have informed the calendars of every human society as long as societies have existed, and that formed the first really solid step in our species' long climb into enlightenment.

So, stop. Just stop. Look up. Keep track of the moon for a month. Draw some orbits and think about what phases mean. Discover, or rediscover, the universe.


Tuesday, March 24, 2009

You have homework

First, read this excellent post about the behavior of the editors of the Journal of the American Medical Association (JAMA) when someone blew the whistle about a very obvious conflict of interest in one of their published studies.

In particular, pay attention to this comment, on the motivations of journal editors with regard to 'incidents' in their journals. Money quote:
Journals are motivated to downgrade the ultimate resolution as best they can, to avoid doing anything if possible, to make a correction when it should be a retraction, etc. And above all else, even when there is a retraction, to avoid anything that suggests identifying fault.
Now read that again, substituting "scientific societies" for "journals". Did it sound eerily familiar? If not, read this. Then this. Finally this.



Sunday, March 22, 2009

What do we want our universities to be?

PZ Myers on universities staying afloat by jettisoning whole departments:
My own discipline of biology is dead without mathematics, chemistry, and physics, and yes, geology is part of the environment we want our students to know. Now it's true that if all we aimed to do was churn out pre-meds, we could dispense with geology; heck, we could toss out all those ecologists, too, and hone ourselves down to nothing but a service department for instruction in physiology and anatomy.

But we wouldn't be a university anymore. We'd be a trade school.

Go read the whole thing.

In my admittedly limited experience, many biology departments are morphing into trade schools already, for pre-meds and moleculoids. Administrations love the publicity that comes from having productive organismal biologists and paleontologists, but we're usually not writing NIH grants for zillion dollar ion reflux pronabulators, which means we're not propping up universities with megabucks of institutional overhead from those grants, which means that when it comes to startup, lab space, getting tenure lines renewed, etc., we often get the short end of the stick.

Fortunately for me, lots of med schools have learned the hard way that moving from cadaver dissection to online virtual anatomy (OMG!!1!!!111!!!!) is a good way to have your students' board scores go in the toilet, so there is at least one place where anatomy will continue to be valued (if not as much as NIH grants) for the near future at least.

But then, I work at a trade school. It's okay for Western to work that way; it's not okay for the University of Florida.

Depressed Academic is depressed.


Friday, March 20, 2009

Hot volcano-on-volcano action

Gotta interrupt the profundity to bring you a link to some BADASS photos of the new volcanoes being born out of the sea near Tonga. Science here and here, volcano porn here (photos) and here (video--hell yeah!).


Saturday, March 14, 2009

Blundering toward productivity, Part 4: cranks, evolution, and humility

If you're new to this series, you might like to read the previous installments first: here they are.

It's pretty common for internet cranks in general, and absolutely pandemic for dinosaur cranks in particular, to argue that Ivory Tower so-called experts are all blinkered by orthodoxy and that outsiders with no technical training are better suited to having the big ideas because they are unshackled by the weight of knowing all that has gone before. These people are almost always wrong, because they keep reinventing the wheel, and the wheels they reinvent are often square. That's why I was careful to specify in Part 3 that you hang out with people who are not afraid to look stupid (check) but also know enough to make useful suggestions.

The advantage of collaborating with friends is that neither of you minds the occasional stupid comment on either side; laughing those off and going on is worth it for all the good ideas that you'll have that you wouldn't have had otherwise. The disadvantages of listening to cranks are that the ratio of good ideas to stupid comments is very low, that cranks almost always mistake the latter for the former, and that almost by definition cranks are immune to being corrected (if they were willing to accept logic, reason, and the weight of evidence, they wouldn't be cranks). Even if you could somehow engineer a polite crank, who would immediately and humbly accept being corrected when he was wrong, you'd still waste most of your time explaining entry-level stuff and never get to the really good questions.

In retrospect, Mike and I did a lot of this when we were first friends: he didn't know any biology to speak of but had a decent command of math and logic, and I knew a little biology but had never really been schooled in how to think clearly, and we both kind of helped each other up (i.e., took turns smacking each other down) until the conversations we were having anyway started to be applicable to exciting and tractable problems. So if you are an expert, you shouldn't waste any time on a crank unless that crank is both potentially remediable and also your friend; and if you're a crank (or any other variety of n00b), learn how to swallow your pride, find a friend who can do likewise, and start climbing together. Maybe that is the definition of a crank: a n00b who mistakes himself for an expert.

It's not that cranks don't know a lot of facts. Usually, they know too many facts; they are blinded by their own command of the esoterica of the field in which they are cranks. The problem is that their command of that esoterica does not automatically mean that they are capable of thinking clearly about it; usually the opposite is true. After several bitter years of realizing how muddled is my own thinking, I now think that everyone, without exception, could stand to improve the clarity of their thought, and that the surest sign of this is thinking that you already think well enough.

This is like Richard Dawkins's definition of evolution as "the one subject that everyone thinks they understand." If you think you understand evolution, you don't understand it, and the more certain you are, the more grave your misunderstanding. Nor is evolution special: I suspect that this is true of any sufficiently rich field. That doesn't mean that there aren't lots of things that we know about evolution, like the fact that it has happened and continues to happen. It just means that we haven't solved it, in terms of reducing the whole field to anything that can be grasped in a blog post, or a lecture, or a documentary series, or a book, or even a career. Saying you understand evolution is not like saying you understand orbital motion; it's more like saying you understand physics. All of it. The term 'evolutionary biology' is a misnomer: evolution isn't an aspect of a more inclusive phenomenon called biology, biology is an instance of a more inclusive phenomenon called evolution.

The partner of useful stupidity is humility. In Part 2 I mentioned two aspects of humility: letting your guard down enough to let out the good ideas will probably let out some dumb ideas at the same time, and sometimes your dumb ideas will trigger someone else's good ideas. Here I am talking about another, deeper level of humility. Not humility in front of another person, but humility before the universe itself. Recognizing that any commonplace object or idea that you take for granted probably stands at the end of an almost impossibly long series of unlikelihoods, most of which have never been explored. Stephen Jay Gould seemed to have a knack for asking questions that most of his colleagues could not even have formulated; he was really good at not just seeing the box and then thinking outside of it, but wondering what it was doing there in the first place. I wonder if he was a biological evolutionist rather than an evolutionary biologist; I suppose Dawkins must be. Too bad they were always at odds (when Gould was still alive); I'll bet they would have had some killer ideas if they could have ever let their guards down around each other.

In any case, this is another way to quickly separate serious cranks from potentially remediable n00bs: cranks lack humility. Not just humility toward other people, but toward the universe. The true crank is beset by the dual delusions that the answers are all straightforward, and that he has them and no one else does (except maybe one or two of his fellow cranks; sometimes they run in packs). Look around for someone who doubts if we're even asking the right questions, and chat that person up instead. Not just instead of talking to the crank--instead of doing whatever it was you had planned for the rest of the day.

Where am I going with all of this? I have no clue. I just intended to write a little bit about e-mail and the value of conversation. And I'm not going to find out tonight, because right now my need for sleep is greater than my curiosity to see what comes next. Stay tuned, though. I'm probably stupid enough to bring this to a satisfactory close, but it remains to be seen if I have the humility.


Blundering toward productivity, Part 3: smart enough to feel stupid

Part 1 is about goofing off as the spawning ground of good ideas. Part 2 is ostensibly about whether the goofing off part can be circumvented, but really about the value of working with smart people who aren't afraid to look stupid. In this post, I answer the second question from Part 1: how can the process of turning undirected play into good ideas be accelerated?

The obvious answer, which I intended to write about: have a workshop, get a bunch of smart people from different but interacting disciplines together, and give them time to educate each other AND time to freewheel. I got to experience this for real at the sauropod workshop in Germany last November (see here and here). The importance of this is not to be underestimated. This is why we talk about particular institutions having a "critical mass" of workers in a field, and it's why Berkeley was such a fun and inspiring place to be a grad student.

Another answer, which I discovered in the course of writing the previous post: hang out with smart people who you are not afraid to look stupid in front of, and who are not afraid to look stupid in front of you. This is harder than it sounds, because people who know enough to make worthwhile suggestions are prone to being at least a little bit insecure or defensive about their knowledge, especially compared to others. I would be a horrible collaborator with many people in my field because I would never let my guard down; it would kill me if they found out how stupid I am capable of being.

Given that, a further suggestion would be to consider collaborating with your best friends regardless of what they work on; by being vulnerably, stupidly open with each other, you might have enough good ideas fast enough to either find something midway between your specialties, or for one or the other of you to fall in love with a problem in the other person's field. Hence the project on rabbit heads with Brian. I wasn't particularly interested in rabbit heads, but I am interested in pneumaticity, and we figured the project would be about rabbit sinuses. It turns out we're going to do something completely different and much more interesting, something that wasn't on the radar for either of us because neither of us had made the necessary discoveries (you would call them observations, but for us they were discoveries). The idea has roots in some papers we read back at Berkeley, but it really germinated in the soil of undirected conversation.

And this accounts for the feeling that I had when I started this essay, that time spent chatting on e-mail is not always a waste of time. Sometimes it's not just productive time, it's the most productive time it's possible to have. Because it is the spawning ground of new ideas.

Chatting on e-mail with distant colleagues is better than exchanging snail-mail letters or not talking, but it's still vastly inferior to meeting in person. In the past year I have spent just over a month with Mike (one evening in LA last summer, two weeks at his house in August, two weeks in Germany in November, one day at the AMNH last month), but out of that month we've gotten two manuscripts mostly written and plans made for at least half a dozen more. A couple we had sketched out on e-mail, but most of them wouldn't exist even in concept if we hadn't had some time to just hang out with fossils. I suppose that is another potential idea-accelerator to add to the list:

(1) have lots of wide-ranging conversations
(2) don't be afraid to look stupid
(3) collaborate with your best friends
(4) in person as often as possible
(5) with the objects of your investigations at hand

If you're an astronomer and there is a physicist you'd like to work with, meet up at the observatory or the cyclotron or more likely the computer lab where you play with your data. If you're a paleontologist or zoologist, go on field trips and museum visits with your collaborators. Happily, that's probably something you were going to do anyway. But now you know it's not just a convenience, it's a necessity. And having a few beers together at the end of the day is not a waste of time, it's an investment in your joint idea bank.

The other implication of this last one is that if you are on your own and you've got some time to kill, you should probably go to where your potential data is and just let your mind and body wander. There is a great bit in one of David Quammen's essays in which Quammen is roaming the Montana State University library and he comes across Jack Horner sitting on the floor between two rows of shelves with journals spread out all around him. Quammen says, "Hey, Jack, what are you doing here?" Horner looks up and says, "Having ideas." The best part is that the journals weren't even paleo journals, they were ornithology journals.

That's yet another important point: it's good to have at least a nodding acquaintance with every field that bears on yours. Which, depending on how broadly you think, might be all of them. And what's more, you should get more than a nodding acquaintance with the ones that are likely to be most important. Birds are living dinosaurs, so if you are looking for new ideas to test about dinosaur biology, it makes sense to camp out in the ornithology section. Crocs are also relevant and elephants are not completely irrelevant, but people have been thinking about dinosaurs as big crocs and slow elephants for a long time. The MSU library run-in happened in the early 90s, when the idea of dinosaurs as stem birds had not yet penetrated paleobiological thought (it still hasn't, fully). Even now, if you were looking to really push things, it would be a good idea.

The downside of jumping into a new field instead of just soaking your toes at the shallow end is that it will make you feel stupid. It doesn't matter what line of work you're in, whether it's paleontology or programming or construction: there is something that you are an expert on now that you weren't when you started, whether it is taphonomy or recursive subroutines or knocking down walls. When you started, you probably spent a lot of time feeling stupid. But you learned quickly, partly because you were anxious to get past feeling stupid, and partly because trying dumb stuff is a good way to learn what works and what doesn't.

I am starting to think that becoming an expert can be dangerous, because feeling smart feels better than feeling stupid, and the risk inherent in expertise is that you stay put and never push the field as much as you might by taking a risk, feeling stupid for a while, and mastering another body of knowledge.

I almost hesitate to say that here on teh intert00bz, where it is often alleged that becoming an expert is dangerous for another reason, which leads to the hopefully non-trivial discussion of cranks in the next post.


Blundering toward productivity, Part 2: is there a better way to have good ideas?

In Part 1 I discussed the fact that, in my experience at least, good ideas almost always arise from undirected conversations with friends and colleagues (i.e., goofing off). I ended the post by wondering whether this process can be circumvented or accelerated. This post is about the first of those alternatives: is there a way to have good ideas other than gabbing with informed friends?

I doubt it. Mike and I have both noticed that on museum research visits we get a lot more done if we're working with someone else than if we're working alone. And usually that is not because we are sharing the load, like one person taking measurements and the other writing them down. It's because we just notice more and ask more questions.

The corollary is to make sure you work with the right person. A good collaborator is curious, open-minded, and not afraid to look stupid. When I'm really in the zone on a collaborative project, I say all kinds of stupid things, and frequently have to be reminded of the obvious. And I am lenient when my collaborators say dumb things, because there is more than one kind of dumb statement. We're so used to dumb statements that indicate that someone is not thinking at all that it is hard to realize that sometimes dumb statements mean that someone is thinking very hard. So hard that they can't be bothered to remember little details like gravity or the fact that necks must necessarily have a head at one end and a body at the other.

Well, why not just guard your inner monologue and not let the stupid stuff out? Because that's what you do for the rest of your time, and because that filter that you set up to catch the stupid stuff might also catch the really brilliant stuff. At the very least, it will slow down the flow. There is some necessary humility here. Not just the humility to not be afraid to look stupid, but also the humility to realize that you are not going to get to all the good ideas yourself, and even your wrong or dumb ideas might jar something loose in your collaborator's head.

Hmm. I have read that you write reports when you have found answers that need sharing, and you write essays to find answers in the first place, and sometimes to questions you didn't know you were asking. I started this section thinking that I was just going to write glowingly about the value of batting ideas around with someone else. But I think I am coming to the view that openness, verging on stupidity, is not only necessary, but also the answer to the next question.

For another recent defense of stupidity as a crucial part of science, go here, and please note that the linked paper is free.


Blundering toward productivity, Part 1: e-mail and goofing off

Another post inspired by someone else's post. Scott Aaronson recently crossed the E-mail Event Horizon, and sent a report from inside.

I have not crossed the EEH, but I have been through brief e-mail storms, during which I have spent an entire working day doing nothing but answering e-mails that can't be put off. Now, when I was a grad student I sometimes blew a whole day goofing off on e-mail, but that was different. Mostly, though, I am able to plow through necessary e-mail in about an hour and get down to the day's work. It is still kinda shocking, though, that on average I spend an eighth of my workday checking and answering e-mail. On the other hand, there is no question that e-mail is a huge productivity booster overall, at least for me, because it circumvents so many meetings and vastly accelerates the pace of collaborative research and writing.

It is easy to get carried away and let a focused e-mail exchange with a colleague metastasize into a rambling conversation with a friend. Sometimes that eats up whole mornings. That used to bother me, but not so much these days. At least part of that 'virtual community' BS is true. In a traditional office I would be having water-cooler conversations with whatever chumps I happened to be stuck with. Thanks to e-mail, I can have those conversations with a distributed network of my favorite people, many of whom are not in the same town, or the same state, or even the same continent.

I had a minor epiphany on my recent research trip to the AMNH. I was hanging out with Brian Kraatz and we were kicking around ideas for a project on rabbit skulls (yes, really). To an outside observer, it would have looked like the Real Work/Goofing Off split was about 20/80. But the ideas that we ended up chasing all came out of what would have looked like goofing off.

I had a similar experience when I was visiting Mike in England in August. We'd usually end each day at the dining room table, playing games and just freewheeling. One night we got to talking about sauropod vertebrae (gasp!) and some things that do not make sense, and after we'd scribbled up two or three pages of scrap paper with notes and diagrams we looked at each other and thought, "Hey, this could be a paper." It's in review right now; I'll let you know if it ever sees print.

The epiphany I had in New York is not that good research ideas sometimes emerge from the most apparently random conversations. It's that they almost always do. This is nothing new--when I look back, the guts of most of my papers started out as a few motes of inspiration distilled from undirected yakk sessions with friends and colleagues. When I was just starting out with undergrad research, I'd meet with Rich Cifelli in his office and we'd just bat ideas back and forth. This turns out to be a good way not just to have new ideas but to make new observations. Rich and I figured out a lot of sauropod morphology because one of us would point at some feature and say, "That's weird. Is it always like that?" and then we'd go check. Same thing with Brian and the bunny heads in New York.

So far this is all pedestrian. What I'm really curious about is, can this process be (1) circumvented, or (2) accelerated?


Monday, March 09, 2009

Another dead snapper tale

First, if you haven't already read Darren's awesome post on turning dead animals into skeletons, do so now. Look out for the amazing line, "Stig and I once microwaved a dead cat and the results were outstanding."

That reminded me that I have told the story of one of my dead snappers, but not the other. As far as I can tell, anyway. So here goes.

I was working at the Oklahoma Museum of Natural History as a grad student, and I had put the word out that I was looking for a big dead snapping turtle. A few weeks later, I got a hit. One of the other grad students had been on a hike at the local lake and seen a dead snapper, so he'd pushed it up into a metal culvert to hide it from scavengers, both human and otherwise. A few days later he told my friend and partner-in-crime, Julian (same Julian as in the other snapper story linked above), a few days after that Julian told me, and a few days after that we finally hopped into Julian's truck and went out to get that thing.

Keep in mind that this was May in Oklahoma, when the temperature and the humidity were both hovering in the high 90s. And that the snapper had been up in that culvert for a week and a half by the time Jules and I went after it, and dead for an unknown additional period.

I waded into the ankle-deep water and dragged the thing out of the culvert by the shell. It was huge, with a carapace 15 inches long and a head three inches wide. Weighed upward of 20 pounds. It was also to the "bratwurst" stage of decomposition, in which the head, tail, and all four limbs were extended and swollen up like unholy sausages (the putative existence of holy sausages is a topic for another post). I didn't want to touch the flesh, which had the texture of gelatin and the rich aroma of rotting horse ass. So I tried to gingerly pick it up by the edge of the shell using only the fingertips of my right hand. Bad idea--as I was turning it over, the entire weight of the animal came down on my right thumbnail, cracked it in half, and bent it back at a 90 degree angle from the quick. I howled, dropped the snapper back in the drink, and ran to shore where I gritted my teeth and snapped the broken nail back down over the bleeding quick where it belonged. Only then did I realize that in my haste I had run smack into a little stand of poison ivy, to which I am seriously allergic.

Somehow we got the dead snapper into a couple of trash bags and into the bed of Julian's truck. Then we went back to my place, put it on the back porch, and took turns showering with Tecnu to get the poison ivy oil off. I also bandaged my thumb, but ended up losing most of the nail anyway. Not fun.

I wasn't sure what to do with the snapper. Our duplex backed up on a big wild plot at the edge of town, and I was tempted to use ants, but I didn't want to expose the thing to scavengers, which were both diverse (raccoons, opossums, coyotes, dogs, etc.) and abundant. I had used maceration for the mummified snapper but the results were awesomely greasy. I was interested in burying it but had no experience with prepping carcasses that way.

The upshot is that I didn't do anything with it for several days, during which it was sitting on my back porch inside two trash bags in the 90-degree heat. Jules and I had gotten it on a Saturday.

The following Thursday night Vicki and I were on an evening stroll about the neighborhood, about two blocks from home, and the wind changed just right and we could both smell that snapper rotting. Vicki looked at me and sternly said, "You are going to get up tomorrow morning and bury that thing."

I did. It was simply horrific. I opened the trash bags, grabbed the bottom ends, and pulled up. The snapper slid out on its back. Or rather its remains did. All that was left was a greasy articulated skeleton, a couple of gallons of really evil greenish-black fluid, and about a million grains of white rice. Only they weren't grains of rice, they were maggots. The stench hit me like the proverbial freight train.

I dug a hole about a foot deep in the yard, lined the bottom with a plastic trash bag, slid the snapper in with the shovel, buried it, and covered the spot with a few logs from the woodpile. I say it like I just did all that stuff. In fact it took most of an hour, because, holding my breath, I could only work for about 30 seconds at a time before I had to go to the upwind corner of the yard and just breathe. The stench was beyond anything I have ever experienced before or since. I didn't know that a scent could be that powerful. I hosed down the porch for a long time, too.

All that summer I watered the logs over the burial plot daily. This kept them moist during the long hot summer, when temperatures got over 110 F for a solid month, and hopefully promoted lots of biological activity in the soil below. I flipped the logs daily to collect roly-polies (or pill bugs, if you insist) for my baby box turtles. In August I moved the logs and carefully dug up the turtle. Amazingly, the bones were entirely defleshed and degreased. I cleaned them up with soap and water and they came out shiny white, with no bleach or peroxide. I still have the skull, which is beautiful and impressive, and if I weren't so lazy I would have included a photo of it with this post. Maybe next time.

Anyway, I've been ardently pro-burial for carcass preparation ever since. Give it a shot, it's a great experience.


Saturday, March 07, 2009

Dr Vector spoils Watchmen

I just got back from the show. I loved it.

It's basically a shot-for-shot remake of the book. In fact, it's astonishing how many of the neat little details from the book survive. The whole pirate comic book is out, but I didn't miss it (and to be honest, I found it a little tiring the last time I read the book). I was also pleasantly surprised to find myself moved by events I already knew were going to happen--to laughter, to excitement, to disgust. Not to tears, but there weren't any real tear-jerker moments in the book, either. Many critics have taken the film to task for being too faithful to the book. I'm not sure how that works, but then I don't understand why most of those dumbasses have jobs in the first place. I didn't think the movie was the "embalmed" version of the book--quite the contrary. It may be a cliche, but Zack Snyder took the book and made it live. I think we may have underestimated him after 300. If it was a violent, homophobic, fascist, historically inaccurate movie, it's because it was a strikingly faithful adaptation of a violent, homophobic, fascist, historically inaccurate comic. Snyder may be a little too good at what he does.

The casting was, frankly, unbelievable. Patrick Wilson couldn't be a better Dan Dreiberg if Dave Gibbons had inked and colored him. But the show is really stolen by Jeffrey Dean Morgan as the Comedian and Jackie Earle Haley as Rorschach. When I read the book from now on, the characters will speak with these voices. Especially Rorschach's. It's so perfect, it's a little scary.

The only major departure from the book is at the end. The calamity that unites humanity and averts nuclear war is not the teleported-alien-plus-millions-of-psychically-murdered in NYC. It's simultaneous A-bomb-level blasts in NYC, Moscow, Hong Kong, and at least one other city (Paris, maybe?). These are engineered by Veidt to look like the work of Dr. Manhattan--they are set off by a reactor that Dr. Manhattan has been helping Veidt build, to provide free energy using whatever it is that powers Dr. Manhattan. The Earth unites not against a phantom teleporting alien threat but against Dr. Manhattan, who leaves the planet anyway for the same reasons he does in the book.

I actually thought this was rather neat. Usually when a film adaptation messes with the source material, it degrades the story by introducing Hollywood BS that is against the very spirit of the story. Witness Faramir's "story arc" in The Two Towers and the giant pyrotechnic ending tacked onto Stardust. In the case of Watchmen, the revision actually ties the story back to itself. Dr. Manhattan is a better patsy for the calamity than aliens. He's more believable, I think, in in-universe terms. He's harder to falsify, because he's known to exist, whereas I suppose there is an outside possibility that in the book universe some aspect of the creature or its appearance might break down and betray its origins in the inevitable multinational investigation. It's a better play for Veidt. If Dr. Manhattan wasn't already planning to leave the planet, having the planet united against him would give him another reason. And it's a more believable play for Veidt; instead of having to invent several totally new technologies in secret (gigaton monsters, psychic blast waves, teleportation), all he has to do is harness an existing phenomenon, and he even gets the phenomenon in question to help out. Finally, when Laurie gets pissed at Dr. Manhattan for splitting himself to do research while also making love to her, it's not just some random physics experiment he's working on--it's the very reactor that Veidt will use to trigger the explosions and frame him. So it's a neat bit of storytelling all around; it actually makes the story more coherent instead of less, which I would not have thought possible.

I know I may have to turn in my geek badge and fanboy secret decoder ring, but for that alone I think the filmic Watchmen may be just slightly superior to the book. I expect the mob with torches and pitchforks any minute. Have a nice day.
