random thoughts on the end of the decade

Hmm, it's been an interesting 10 years. In just about 6 months, I turn 50, and it seems to be making me a little philosophical in my old age. The last 10 years have been, in comparison to, say, my 30s, really good personally, despite some things most people would call tragedies but that I've come to see as either life stages or just ordinary events. I think I've grown and changed more in roughly the last decade than I did in my first 40 years, with the possible exception of childhood, when pretty much every human being grows and changes exponentially. It's not that I've gained so much more knowledge (though I hope I never stop learning new things), but that I've figured out what to do with what I already know, emotionally and otherwise.

Continue reading "random thoughts on the end of the decade" »

poetry month!

It's Poetry Month, peeps, and somehow, I screwed my courage to the sticking place and signed up to write a poem a day, from prompts, over at Writer's Digest's blog Poetic Asides with Robert Lee Brewer. Tonight I'm frantically composing at the last minute because I had a long day teaching and grading papers. There will be an instant replay tomorrow night, probably, but here's the first one, anyway. It's an origin poem, as per the prompt. I thought, what the hell? Why not go for the ultimate origin? So I've committed science poetry. Be merciful; it's a first draft.

Start Here

It always starts with light
real and metaphor:
a minuscule point
in the deeps,
one moment quiescent,
the next—
the universe
cracks open.
Fractions later, the shrapnel flies
at the speed limit of sight,
us and anti-us,
bangs around like bumblebees in a bottle
(those will come much later)
smashing itself
back to nothing first, then
smaller, hotter, faster, fortunately
more us than anti.
The universe
shimmers into being,
condensing like raindrops
(again, much later).
A chill sets in, the particles dance
for warmth, and couple
the way everything does
in long, cold nights.
Hadrons and leptons snuggle;
deuterium is born,
grows up to be hydrogen.
Soon there’s a periodic family
at the table.

In the space of
a hundred breaths:
light and matter, and
all that matters.

© Lee Kottner, 2009

This poem brought to you courtesy of Chris LaRocco's and Blair Rothstein's Big Bang Page over at U of M. Meaning that's where I got my quick and dirty summary of the aforementioned events.

Damned if you do, damned if you don't

In the You Can't Fucking Win Department, this just in:

A new study in Psychology of Women Quarterly finds that women who present themselves as confident and ambitious in job interviews are viewed as highly competent but also lacking social skills. Women who present themselves as modest and cooperative, while well liked, are perceived as low on competence. By contrast, confident and ambitious male candidates are viewed as both competent and likable and therefore are more likely to be hired as a manager than either confident or modest women. . . .

Results show how disparate hiring criteria further discriminate against ambitious, competent women. When judging the ambitious women's hirability, a perceived lack of social skills formed the basis of the hiring decision, and the women's high competence was relatively neglected. For ambitious men, however, perceived competence and interpersonal skills were weighed equally in the hiring decision. Women were doubly disadvantaged because even when female applicants adhered to stereotypic expectations by presenting themselves as modest, they were unlikely to be hired because evaluators emphasized their relatively low competence and discounted their (high) social skills.

The double standard is alive and well: "He's ambitious; she's a bitch." Men are still not expected to have social skills; women are still expected to fill that role in society. Ambition and competence conflict with social skills (where did that one come from?). Women should be modest, not toot their own horns, not have goals and dreams and desires that might conflict with men's. Women who present themselves as confident and ambitious are still seen as dangerous aggressors who threaten the social order and the

Continue reading "Damned if you do, damned if you don't" »

Attention Science Geeks!

My pals, mutual spousal units Sean Carroll (Caltech physicist, Cosmic Variance) and Jennifer Ouellette (science writer, Cocktail Party Physics) talk about the Large Hadron Collider (LHC) at CERN and what it hopes to find (not, of course, mini black holes that will devour the earth, you idiots!) in language we can all understand. And they're so cute and smart!

Watch and learn:

How to Keep Us Down

It's not just religion, obviously, that's misogynistic, but it's always been interesting to me that this is one of the characteristics that religion and science, often so antithetical to each other, share, and for so many of the same reasons. Of course, this is because both spring out of the society around them and are carried out and structured by the people in that society who have the power to make the structure. So if men decide women are too inferior, in whatever way, to have a personal relationship with God, either through study of the texts or through participating in the mysteries (Milton's "He for God only, she for God in him"), it's little wonder scientists should think the same way about what many saw (and still see) as a new, improved replacement activity.

The reasoning, though, is strikingly similar, and you'd think scientists would pay more attention to that. Of course, it's to their advantage not to. It's convenient for them to claim that women's brains are not made for math (an old saw rapidly being dulled) or that we don't do science the way it "should be done," i.e., the way men do it. Probably true, but not necessarily bad or wrong. Just different. I'm not talking about the scientific method here, but about the culture of science and the way men and women approach problem-solving.

And of course, there are social and cultural pressures on women now that men don't have to deal with, as a report by the American Physical Society I recently helped edit shows quite admirably (it's still in production so I can't link to it, but APS has a great reading list). The same kind of context is just as often conveniently forgotten in the interpretations of key scriptures that seem to ban women from positions of authority in the church, while the scriptures that show women in those positions are just as conveniently ignored.

There are also some striking similarities between the two areas in their jealous guarding of knowledge. In both cases, men are frequently the gatekeepers of the more esoteric aspects of knowledge (see: physicists), intentionally or unintentionally. Personally, I think this is because guys like secret societies and all that. They're forever making exclusionary clubs, from the Royal Society to the Benevolent and Protective Order of the Elks. But religion and science are public endeavors, affecting all of us. (Just look at the Evangelical Right's influence on elections in the U.S., if you don't believe me.) Faith that asks no questions is merely blind, stupid obedience; science that allows no free sharing of knowledge is not just bad science, but dangerously blind itself. In both cases the idea that "it's too complicated for you to understand" is used to keep the general public from asking uncomfortable questions: "Why is Junia, a woman, called an apostle?" (see sidebar) or, "Wait, why should we give you taxpayer money for that science project?"

All this is by way of saying that Richard Dawkins's selection of writers for the new Oxford Book of Modern Science Writing is damned odd. For one thing, there's nary a mere science writer among them; they're almost all scientists, even Rachel Carson, who started her career as a biologist. This is one example of the "father knows best" attitude so many scientists have toward the public: only scientists can truly communicate the beauty and wonder and complexity of science to the rest of you ninnies. This is far from the truth. It is, in fact, a hell of a lot easier to teach good writers about science than it is to teach most scientists to write well, particularly for the public. Most of them have a tendency to include too many advanced details that chase people away, rather than broad interesting ideas that draw them in. My science writer pal Jen waxes eloquent about this frequently in our conversations. The advanced details are important, but you don't start out with those for people with little or no background in the subject; if you're not a scientist, getting the concepts is far more important than understanding the technical details right away. Scientists often have a bad case of "can't see the forest for the trees" when it comes to writing for the public, particularly in their own subject.

And, of course, there are too few women (three, to be precise): biologist Rachel Carson; Helena Cronin, a philosopher who works on sex selection (and who happens to think there are more smart men than smart women; to be fair, she also thinks there are more dumb men than dumb women); and Barbara Gamow, not a scientist, but wife of physicist George Gamow, who is included because of the poem she wrote in response to one of George's lectures. How cute. I say this not to denigrate Barbara Gamow, who was, like many women married to male scientists, extremely supportive of her husband's work and no doubt a sounding board for it, but to illustrate the attitude prevalent about women's role in science: supportive; observer, not participant; muse, not partner.

Rachel Carson got in, I suspect, because she's hard to ignore; she was so prolific (and a fellow alumna of my alma mater!) and so pivotal in the early days of the ecology movement. But where's biologist Lynn Margulis, who, with James Lovelock, developed the Gaia theory? She's a wonderful writer. Where is primatologist Dian Fossey? Hello? Gorillas in the Mist, anyone? Child psychologist Anna Freud? Primatologist/ethologist/anthropologist Jane Goodall, who, like Fossey, wrote extensively for the public? For that matter, where's Margaret Mead? I see physician Lewis Thomas on the list (one of my favorite writers, though he wrote as much about life as about science) but not doctors Perri Klass or Michelle Harrison. Where's oceanographer Sylvia Earle? Or forensic anthropologist Emily Craig? And those are just the ones I can think of off the top of my head.

And we haven't even gotten to the non-scientist women science writers: Natalie Angier, Dava Sobel, Heather Pringle, or Mary Roach, to name a few.

Dawkins's selection is pretty heavy on evolution (no surprise, given that he's an evolutionary biologist), genetics (again, no surprise), physics, neuroscience, and biological systems. There's not much chemistry, straight-up biology, or medicine, and no ocean science or any of the so-called soft sciences like sociology or anthropology. If what he was aiming for was a balanced picture of the wonders of modern science, this book is hardly that; it's not even a balanced picture of the best science writing. Like the hard sciences, it's very male-dominated (and white males at that). Enough with Peter Medawar already. He's not that brilliant. He's taking up space with his multiple selections that could easily have been given to a woman or two, scientist or not.

Dawkins could have done much for women scientists everywhere by recognizing their work in this volume. Instead, he dragged out a lot of the old war horses: Eiseley, Watson & Crick, Gould, Thomas, Hoyle, Haldane, Snow. That's fine in an anthology like this. You need to include the classics and the big guns like Hawking and Einstein. But if you're going to include the likes of Steven Pinker, Oliver Sacks, Brian Greene, Lee Smolin and Kenneth Ford (whom I used to work for), then you need to include some contemporary women scientists too, dammit.

Why make a fuss over this? Because this is how women are systematically pushed out of history, in exactly the same way we were pushed out of recognition of our rightful place in the early church. Simply by excluding us from memory. By being ignored by the big-shot males. That's all it takes.

Archaeology, Science, Beer

Jen and I had a great conversation recently about the pervasiveness of science in our lives. It really is everywhere: your furniture (engineering in the milling of the pieces and the metal that connects it); the obvious places like your computer and media; textiles (weaving and spinning were some of the earliest technologies); the paint on your walls (chemistry); your transportation (engineering and physics). Most of our jobs involve some kind of science, even if we're only pushing electronic paper (computer science). Even agriculture is a science: fertilizers, crop rotation, planting and harvesting technologies.

Then there's beer.

Ben Franklin's assertion that "beer is the proof that God loves us and wants us to be happy" goes farther than any number of scriptures in proving His existence, to my mind (even though the quote itself may be a fake). And the quest for substances to "make us happy" has led to a lot of scientific advancements, not the least of which is basic chemistry. (One of my favorite breweries, Magic Hat, actually has a brew called Chaotic Chemistry.) Beer is based on the chemical transformation of starch and sugars into alcohol through the use of biological agents (yeast). Fermentation is still one of humanity's greatest inventions, right up there with fire and the wheel, in my personal opinion.
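For the curious, the heart of that transformation fits on one line. This is just the chemistry-textbook summary of ethanol fermentation, nothing from this post's sources, and real brewing is messier, since the grain's starches first have to be broken down into sugars in the mash:

$$\mathrm{C_6H_{12}O_6 \;\longrightarrow\; 2\,C_2H_5OH + 2\,CO_2}$$

One molecule of glucose becomes two molecules of ethanol and two of carbon dioxide, which is where both the buzz and the bubbles come from.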

There are scholars who actually spend time studying the history of beer and brewing (why didn't I know these people in college? More importantly, why didn't I grow up to be one of them?). Irishmen Declan Moore and Billy Quinn are two of them, and they set out to discover how Bronze Age Irishmen might have brewed up their IPAs. "This quest," they say in their very important article, "took us to Barcelona to the Congres Cerveza Prehistorica, [this sounds even better than the Medievalists' bash in Kalamazoo, which is always a big party, and how did I miss this on my trip to Barcelona?] and later one evening in Las Ramblas in the company of, among others, an international beer author, an award winning short story writer, a world renowned beer academic ["Beer academic"?!? You mean that's a job description? Not a foible? Damn. . . .] and a Canadian Classical scholar - all of whom shared our passion for the early history of beer." Here's Dec and Billy's demo and tasting party, complete with grilled dead pig. Sláinte! And happy Fourth to all you Budweiser-swilling, grilling patriots, carrying on the long tradition of beer and pig roast.

[Thanks to North Atlantic Skyline for the tip]


Jen's hubby Sean the Cosmologist has started an interesting discussion over on Cosmic Variance about "why so many academics are hostile to some religions rather than others." For me, this is a very interesting twist on the opposite question: why so many (particularly American) religions are hostile to learning and education. According to a recent study (PDF) by the Institute for Jewish and Community Research, "Faculty feel most unfavorably about Evangelical Christians." Big surprise. Having grown up in a religion that considered going to college about equivalent to choosing to live in a combination brothel and crack house, I find the question of why academics are more hostile to evangelicals not at all puzzling. It's a mutual hostility club caused not just by opposing worldviews, but by opposing value systems.

Some of the things that academics value most are freedom of inquiry and freedom of speech. By contrast, evangelicals value unquestioning faith. Each intellectual challenge to that faith is seen as a test of loyalty and of one's ability to bear the burden of ridicule for the sake of one's faith. The dogma of faith is unchanging—except when revealed by God—while, thanks to the spirit of inquiry, secular knowledge, with the exception of basic laws of nature, changes all the time. Even those basic laws are often refined, the way Newtonian physics was refined (or surpassed) by quantum mechanics. Evangelicals often view the effort to understand the wonders of our universe, both macro and micro, as a quest for forbidden knowledge. There are some things that we were just not meant to know, they often assert, usually in stentorian voices with much Bible thumping.

I've never understood that, though I do often despair of the way in which the knowledge we gain is used, e.g., splitting the atom. I think this is one reason science needs the counterbalance of some kind of spirituality. But not one that puts actual restrictions on what we're "supposed" to know. If you believe in some kind of creative deity, why would that deity not just freeze the brain power of its creation, instead of giving it the capacity to become more intelligent and understand more of the universe? Deities can do that, right?

No, it's a test, the faithful say. But it's one the intelligent are going to fail. Intelligent people by nature can't stop questioning without real effort. And making that effort kills a part of them, their essential nature. That's some sacrifice.

What this claim of mystery usually means, unfortunately, is that you, the little people, are not supposed to know these things. It's okay for the priesthood (literal or political) to know them, but not you. Because knowledge is power. That's one of the reasons that early education should be compulsory and advanced education should be free, for as far as you want to go. Otherwise, you are crippling your populace and leaving them open to the manipulation of superstitious or just plain power-hungry nutcases. Jim Jones, anyone? Of course, it's far easier to control people who aren't that well-informed. Marking off certain areas as forbidden knowledge is one way to cement that control. The real problem with this, of course, is that if you don't understand your world, you can't make smart decisions about how to live your life. And if only a certain group understands the world, they get to make the decisions. As a rule, academics are in the business of spreading knowledge around to anyone who wants it. That can be a subversive activity in some cases.

It's no wonder academics are hostile right back to people who are hostile to their entire reason for being.

Like so many other prejudices, anti-intellectualism has its origin in fear: mostly of having your entire worldview dismantled, but also the more petty but no less real fear of being made to look foolish. I can attest to the fact that it's a little scary not to have any sense of sureness about what the future will bring, either while you're alive or after you're dead. It was a relief to "know" we'd never have an all-out nuclear war because God would never let us totally destroy the earth. On the other hand, it's a little exhilarating, too, a bit like skydiving, I suspect.

But that fear is very real. My mother, not an ignorant or anti-intellectual woman by any means, found the idea of alternate dimensions really frightening. The idea that there might be someone else just like her somewhere else who had made different choices than she had was, I think, what she found so scary. Somehow, that would invalidate her life in her mind, though it did no such thing. The concept of alternate universes is a little more complex than that, but it does raise interesting "road not taken" possibilities. By contrast, I love the idea that our lives fork and branch at every moment, at every choice we make, perhaps at every breath, not just for us but for every event. The number of universes is mind-boggling, but that may only attest to our lack of brain capacity to comprehend it. It's not by any means fully accepted in the physics community, but it raises some very interesting questions.
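Just to gesture at how mind-boggling (my own back-of-the-envelope arithmetic, not anything from the physicists): a person takes roughly 20,000 breaths a day, so a single two-way fork at every breath would already give

$$2^{20000} \approx 10^{6020}$$

branches in one day, from one person, compared with the mere $10^{80}$ or so atoms in the entire observable universe.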

And that's what it's all about, isn't it: the questions.

Teach the Controversy

I mentioned on my other blog that I've gotten back into teaching after a 10-year hiatus, and I'm loving every minute of it. At the moment, I'm teaching a class on journal writing at the College of New Rochelle's South Bronx campus, and though I haven't taught this class before in any shape or form (which makes it a lot of prep work), I'm having a great time with it. I haven't had a group of people in a class that I've enjoyed so much since I taught honors science writing at MSU, one of my alma maters. My students absolutely rock; they're bright, motivated, funny, not afraid to talk back and challenge me. And they are so eager that they teach each other (and me) as much as I teach them. I'm high after every class, just from their energy.

But I digress. In the years since I've been away, especially from teaching science writing, the Creationists have started using a new tactic, which they call "Teach the Controversy," to get their bogus "science" taught in place of evolution. This is so wrong on so many levels, the main one being that there is no controversy. Evolutionary biology, while termed a theory (which is what scientists, in their caution, call a fully developed and tested set of ideas; and what else would you call that?), is fact as an overarching paradigm. Details are still being worked out, and disagreements about those details break out, but that doesn't mean there's any question about the theory's validity or truth. That's how science works; it's based on argument. There is something like a marketplace of ideas: the more testable facts and the better the argument, the more firmly an idea becomes an accepted part of the body of scientific knowledge. Intelligent Design, which is the latest thinly disguised Christian evangelical conversion tool, does not hold water, not even in the courts.

I'm not entirely opposed to the idea that there is a Creator out there somewhere. "How" S/He made it happen is less important to me than "if." I think evolution is a completely workable tool for developing life. Just because the human metaphor for making things involves factories and exacting, get-it-right-the-first-time craftsmanship doesn't mean it's the only way to accomplish that goal, especially when it comes to life. Evolution may, in fact, be the most efficient way of producing intelligent life. What looks entirely random and without structure to us, from inside the system, may actually just be so extremely complex that we can't, at present, fathom it. It may be one of those things that we have to wait until the post-human era to really grok.

What's all this leading to? T-shirts. I was highly amused to run across Wear Science's Teach the Controversy designs on one of my favorite science blogs, Deep Sea News. I was so amused, in fact, that I bought myself a sun-yellow messenger bag with this design on it in blue. The one above, with a devil burying all those dinosaur bones, refers to the age-of-the-earth problem and those pesky fossils of creatures that no longer exist that keep turning up. Young Earth creationists have been known to claim that God put them there as fossils when he created the earth. As I've said before, I think that's a pretty cruel and petty God to go obfuscating himself like that. Evolution is so much neater. But we all know the sun revolves around the earth. Right?



From the hilarious xkcd: "a webcomic of romance, sarcasm, math and language." Extremely nerdy and geeky yet very funny. One of my faves.

The concept of "purity" is something of an inside joke in science, and I've never quite understood it. Mostly it refers to the difference between fundamental knowledge and research pursued for its own sake, on the one hand, and practical, applicable science that solves problems on the other. Both are necessary, though pure science often gets a bum rap from non-scientists for being "ivory tower" precisely because it doesn't immediately solve a pressing problem. What many non-scientists don't understand is that the fundamental explanations have to be there first; e.g., you can't make a successful lighter-than-air craft until you know what substances are lighter than air.
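That example rests on a single piece of fundamental knowledge, Archimedes' principle (standard physics-textbook stuff, not anything from the original post): the net lift on a gas-filled envelope of volume $V$ is

$$F_{\text{lift}} = (\rho_{\text{air}} - \rho_{\text{gas}})\,V g,$$

which is positive only when the gas inside is less dense than the surrounding air. Until someone has done the pure science of measuring those densities, no amount of applied cleverness gets the craft off the ground.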

But purity also refers to the degree of solid factuality in your data: how big the margins of error are, how elegant the experiment, how tidy the solution, how much it stands aloof from whether or not it fits with your preconceived notions, how much it just is. The idea originally arose from the Enlightenment's desire to free itself from superstition and magic during the nascent development of the scientific method, particularly the separation of chemistry from alchemy. I suspect that what it really springs from is the rational mind's horror of baser human qualities like greed, ambition, and the desire for praise and respect, as well as skepticism about the supernatural. The more rational your data, i.e., the less tainted by the messy unpredictability of human emotions, needs, or desires, the purer the science, supposedly.

Sociology is the poor cousin in the crowd precisely because it studies human behavior and tries to quantify it, with varying levels of success. Sociology's main problem is that it taints the data just through observation; by definition, it's really hard to solve a problem when you are part of the set being studied, i.e., humanity. But even physicists and mathematicians, who deal with the most rational and rarefied of realms, pure mathematical theory, are influenced by their emotions, whether they like it or not. There wouldn't be as much rancor and infighting as there is, otherwise. Sure, desire for money to fund one's projects is a noble motive, but still, the competition can get amazingly cut-throat and sometimes downright nasty. James Watson's The Double Helix, about the race to discover the structure of DNA, is a classic example of how emotions drive science. Not to mention the rampant misogyny in many scientific disciplines. Like it or not, science is a human endeavor and will always be affected by human behavior, however pure the data.

(An amusing aside, apropos of nothing: if you Google "string theory," your browser tab will read "g string theory," with the "g" logo of Google in front of your search subject. And how lovely that Randall Munroe's mathematician in this comic is female.)

Bourne and the Brain


Cross-posted at Cocktail Party Physics.

Lee Kottner here, on assignment from Jennifer to cover at least one of the offerings at the World Science Festival, held in NYC last weekend. Before I get started on the one event I actually got to, let me say how hard it was to narrow it down. So many cool offerings! So little time! It was just like being presented with a really juicy conference program and having to pick between overlapping sessions: a nerd's paradise, with the bonus that there was also a street fair, movies, and art. Definitely more fun than your average conference (unless it's the Kalamazoo Medievalists). This is the World Science Festival's first year, so it's still a little rough around the edges organizationally, but the line-up is absolutely stellar, and the intersections of art and science couldn't have been more intriguing. Theatre, dance, music, and film were all represented, along with the history of science and the fields of math (or maths, for you Brits out there), physics and astronomy, evolutionary biology, environmental science, epidemiology, genetics, botany, computer science, engineering, and neuroscience. The topics ranged from creativity, space-time, longevity, climate change, and astrophysics to the science of sports, of illusions, of green building and of Disney Imagineering, and plays and films about Einstein, Richard Feynman, Hugh Everett (of the parallel worlds theory) and . . . Jason Bourne.

And yes, that's where I come in, shallow fan of action flicks that I am. But it's the neuroscience offerings as a whole that got me excited about the festival. One of my favorite science writers, neurologist Oliver Sacks, had not one but two presentations, the first on visual perception and the brain, at the Metropolitan Museum of Art, and the second on music and the brain, in conjunction with the Abyssinian Baptist Church choir. I've read most of Sacks's books for the general reader, so normally I'd jump at the chance to hear him speak. But I couldn't resist "The Brain and Bourne" (nothing like Pinky and the Brain, I assure you) with producer/director Doug Liman, psychiatrist/neuroscientist Giulio Tononi, and producer/screenwriter (oh, the multitasking!) James Schamus. (Spoiler alert for anyone who hasn't seen the movie. And, like, what's taking you so long? There are two more already!)

The movie opens with an unconscious figure in a wetsuit (Matt Damon) floating face-up in a stormy Mediterranean Sea. Hauled aboard a passing fishing vessel, Wetsuit Man is discovered (1) to be still alive, though (2) shot twice in the back, and (3) to be carrying a stainless steel capsule embedded under the skin of his hip. The capsule contains a laser that projects the number of a blind Swiss bank account. Huh? Wetsuit Man comes to, understandably upset at having objects removed from his body without his consent (or anesthesia), even if they are bullets and weird implants, and discovers he doesn't know who he is. He can walk, talk, play chess, shuffle cards, do pull-ups, tie complicated knots, speak several languages and function on a day-to-day level, but he has no idea who he is or was, or where he's been for the last twenty-some years of his life. For all he knows, he's sprung from the sea like Venus on the half shell. Classic amnesia.

Or at least the Hollywood version. Amnesia of just about any type is actually pretty rare, though you'd never know it from watching soap operas or reading Gothic murder mysteries. But there are several different types of amnesia and a number of causes. The two main types are anterograde amnesia, the inability to learn and remember new information from the time the amnesia began, and the kind our hero experiences: retrograde amnesia, which involves a loss of memory of the past preceding the moment one becomes conscious again. One of the symptoms of dementia is memory problems, but unlike those suffering from, say, Alzheimer's, victims of amnesia retain their cognitive powers and intelligence. They lack only their former memories or, in the case of anterograde amnesia, the ability to make new memories. Guy Pearce's character suffers from this type of amnesia in the 2000 movie Memento, and must constantly write himself notes and take Polaroid pictures to tell himself what he's been doing for the past fifteen or so minutes.

Amnesia can encompass varying stretches of memory—from all of your previous life (global amnesia, usually transient) to just the five minutes before you knocked yourself out in a bike accident—and last for varying periods of time. Its causes include stroke, inflammation (from infections like encephalitis), tumors, oxygen deprivation (from a heart attack or CO poisoning), long-term alcohol abuse, and the classic Hollywood cause: pressure from bleeding between the brain and skull, i.e., a knock on the head. It takes a fairly serious head injury, however, one likely involving a long coma and months of rehab, to induce anything but transient global amnesia.

Wetsuit Man, who eventually decides his name is Jason Bourne on the strength of the evidence he finds in his lockbox at the Swiss bank, also discovers along the way that some of the things he knows how to do are downright scary. In one very subtle scene before Bourne visits his lockbox, he tries to catch some sleep on a park bench but is rousted by the Swiss cops. One pokes him with a nightstick, which Bourne grabs reflexively. If you watch carefully, you'll see him pause, and in that pause is the moment when Bourne says to himself, much like Neo in The Matrix, "Hey! I know jujitsu!" Bourne then handily disarms and disables the cops and runs away, to live to lay movie-fu on other attackers another day. Okay, he doesn't know who he is, but he can take out two trained cops in less than 30 seconds? Wait, it gets weirder!

As the movie progresses, it's clear that Bourne is not just a martial artist with lightning reflexes (and that he fights dirty as hell), but that he knows all about surveillance techniques, weapons, and being followed. Sitting in a truck stop on the way to Paris, he says to his new accomplice, Marie, "I can tell you the license plate numbers of all three cars out front. I can tell you that the waitress is left-handed and the guy at the counter weighs two-hundred and fifteen pounds and knows how to handle himself. I know that the best, first place to look for a gun is the cab of that grey truck outside. I know that at this altitude I can run flat out for half a mile before I lose my edge. I knew that you were my first, best option out of Zurich. How do I know all that? How can I know all that and not know who I am? How is that possible?"

Excellent question, Mr. Bourne. Is this just another example of Hollywood mangling scientific truth? Well, no, for a change it's not, though I wouldn't have known that without going to this talk. James Schamus started it off by asking Dr. Tononi if this kind of amnesia was actually possible. Surprisingly, the answer is yes, but it is more likely if it has a specific cause. In Robert Ludlum's original book, it's the classic blow on the head that gives Bourne his case of amnesia. Liman, in his research before making the movie, discovered this was unlikely to cause the kind of amnesia Bourne suffered from. So Liman twisted the plot a bit and, though Bourne does suffer a break of consciousness after he's shot and falls (is tossed?) overboard with two bullets in his back, his amnesia is purely psychological in nature, arising from an internal conflict.

Psychogenic amnesia, it turns out, acts just like physically induced amnesia in many ways, but without the trauma. Dr. Tononi studies consciousness and its disorders, so this is right up his alley. True to the nature of the talk, he came prepared with a PowerPoint presentation, the first slide of which showed two PET scans of amnesiacs, one caused by encephalitis, one psychogenic. In both, the right temporal lobe (the yellow bit in the diagram at right) is inactive in almost exactly the same areas, though one brain is completely uninjured. Unsurprisingly, this is the part of the brain most closely involved with memory, mostly its storage (the hippocampus is thought to be most closely involved with making memories). Recent studies in brain mapping and neuroscience have shown that our brains generally parcel tasks into regions. Our memories are concentrated in one area; our skills, some of which involve muscle memory and proprioception, in another; our pattern recognition in another, and so on. Knowledge and personal memory are not the same thing, either. Our knowledge about a subject is static and factual, while personal memory tends to encompass a linear sense of time and other sensory impressions. Memory, like dreams and oddly like the movies, as Schamus pointed out, is a liminal state: ambiguous and untrustworthy, as cops and prosecutors well know.

We tend to think of our memories as fixed and visual. The research of Dr. Tononi and others has shown that consciousness is a process, not just a location, and that our memories are not representational but rely more on reconstruction than recall. There's no rewind or replay button in your head, in other words. When we ask ourselves "Who am I?" or "What happened?" we're not going to get a picture, but a narrative, a story. This story includes not just our memories, but who we tell ourselves we are—our interpretation of those memories. If there's a clash between who we think we are and our memory, guess what loses? Then we become our own unreliable narrator.

In the case of Jason Bourne, as the two sequels to this movie show, the internal conflict is between the kind of person Bourne thinks he is (one of the good guys, who doesn't just randomly kill people) and the things his memory tells him he has done (not-so-randomly kill people). What sparks the conflict is a mission to assassinate an exiled dictator, whom Bourne finds on his boat with his children in the same room. Bourne can't bring himself to shoot the man while he's holding his daughter and his other children are asleep in chairs around him. Instead of a blow to the head, guilt is the trauma, and Bourne conveniently forgets what he does for a living when he wakes up. It's too awful to contemplate otherwise.

In effect, Bourne becomes the person he thinks he is. Tononi pointed out that people with dissociative disorders, including multiple personalities, don't share the memories, even on a PET scan, that their "others" have. People in dissociative fugues can suddenly forget who they are (usually because of some emotional trauma) and wander off. But unlike Bourne, they generally don't know they've lost something, and will assume another identity rather than try to find their old one. This separation can also occur in sleep states, as in the infamous case of Kenneth Parks, who killed his mother-in-law and seriously injured his father-in-law while he was sleepwalking, but had no memory of it. Bourne is in the process of writing a new story for himself, reconciling what he did with who he is now, and in doing so, he recovers the memories of who he was. Like Kenneth Parks, until he regains his full memory, Bourne is conscious but not self-conscious.

Originally, this was a big problem for Liman, as a director. Usually, when characters are introduced in a story, the audience is cued on how to relate to them by seeing them in the context of their life: with friends, relatives, their dog, their boss. Bourne has no one and nothing to cue his audience. He's a blank slate. It's only in his journey, in the reconstruction of a new personality, that he becomes interesting and fully aware.

Now, imagine not only having your past be a blank slate, but not being able to imagine a future. Tononi also mentioned the case of Clive Wearing, a British musicologist who developed total amnesia after a viral infection. Although he still knows how to play the piano and conduct music, he has no other personal memories and cannot form new ones, like the character in Memento. Only Wearing's memory is of even shorter duration than that of Guy Pearce's character. Wearing has none. Most of his waking time is spent "rebooting" his consciousness from moment to moment. His diary consists of the consecutive statements "I am alive! I'm awake now. I am alive!" If that's the entirety of one's self-consciousness, is there a self? Bourne, at least, does manage to find or make a new one, as well as recover his past. But not everyone does.

Cue The Who. Oh wait. That's CSI. I forgot.