A Mind Forever Visualizing

I find reading Oliver Sacks to be, well, unnerving.  His detailed accounts of the fragility of the human brain upset my latent Cartesianism; the reduction of all that I hold dear about consciousness to electro-chemical reactions invites harrowing comparison with my own high school laboratory experiences.  In his recent article for The New Yorker, Sacks discusses the capacity of the parts of the brain devoted to visual processing to adapt to blindness, whether it strikes children or adults, suddenly or gradually.
It seems that the brain can, in some cases, retain a visual faculty for processing images without visual input, drawing instead from other senses as well as the imagination.  This, of course, provoked my epistemological paranoia; how would I know if (any of) my senses were subtly but significantly impaired, particularly if they had been so since childhood or earlier?  (For more in this vein, see Sacks’s forthcoming study of this writer, The Man Who Mistook His Life For A Novel.)

The plasticity that allows the brain to adapt to new mental activities, to which Sacks attributes the visual processing of blind people, seems to jibe with (my understanding of) what we know about the development of the brain (as the father of a toddler, I am anxiously aware of Conventional Wisdom regarding how best to stimulate and mold children’s brains).  Though trained as a scientist, Sacks inevitably turns to questions of philosophy when the limits of neurology have been reached.  The result here is a guarded claim for self-actualization: the blind may retain their sight, if they truly want to.

More immediately provocative to me were the implications of one of the examples Sacks gives of varying capacities for visual processing:

I first became conscious that there could be huge variations in visual imagery and visual memory when I was fourteen or so.  My mother was a surgeon and comparative anatomist, and I had brought her a lizard’s skeleton from school.  She gazed at this intently for a minute, turning it around in her hands, then put it down and without looking at it again did a number of drawings of it, rotating it mentally by thirty degrees each time, so that she produced a series, the last drawing exactly the same as the first.  I could not imagine how she had done this, and when she said that she could "see" the skeleton in her mind just as clearly and vividly as if she were looking at it, and that she simply rotated the image through a twelfth of a circle each time, I felt bewildered, and very stupid. I could hardly see anything with my mind’s eye—at most, faint, evanescent images over which I had no control.

Proponents of the infamous theory of female inferiority in spatial reasoning may dismiss Sacks’s mother as a statistical outlier and may cling to evidence of the effects of early hormone exposure upon fetal brain development, but I can’t help wondering if the plasticity demonstrated by blind people who "practice" visualization might not also avail people with predetermined (chromosomally or otherwise) deficiencies in certain mental faculties.  As I consider how best to expose my child to a hyperstimulating world, these questions lose their ponderousness.


SABR Dance

I don’t like baseball, I’m afraid.  My relation to all sports is somewhat checkered, however, so it might be helpful to specify which of baseball’s attributes particularly put me off.  I don’t find it terribly athletic.  I dislike the asymmetrical role played by pitchers, particularly when the pitching rotation makes it difficult to compare a team’s performances against common opponents.  And, of course, it’s as boring as watching crows crap.

What raises this dislike from the level of petulant indifference to caustic denunciation—other than having to publicly fund a for-profit baseball stadium, that is—is the thuggish conflation of baseball mysticism with nostrums of national identity that deters one from applying too much analytical rigor to the history of baseball (and America) while at the same time making one feel accursed for having missed its halcyon days.

It was therefore with grim delight that I discovered the field of sabermetrics, which purports to challenge traditional measures of skill in baseball.  My discovery was occasioned by the recent publication of Moneyball, which studies the application of sabermetrics by the management of the Oakland Athletics.  After reading a critical discussion of Moneyball on Slate, I came to see sabermetrics as nothing so much as how a min-maxing gamer would approach baseball.  Indeed, a New Yorker profile of Bill James hails him as sabermetrics’ "founding nerd."

Detractors of sabermetrics lament that its widespread application would make baseball more technical and (somehow) less exciting.  Walks are boring, stolen bases are fun, goes the argument.  Perhaps.  But if the geeks are right and empiricism comes to crowd out hoary notions of "the fundamentals," baseball fans who prefer their arguments untainted by scientific rigor are invited to follow a sport where no meaningful objective standard for comparing player performance exists and where discussions of player merit remain squarely in the realm of visceral froth.
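For the uninitiated, a sketch of the kind of objective measure the geeks have in mind: batting average ignores walks entirely, while on-base percentage credits them, which is precisely why the walks-are-boring crowd distrusts it.  The figures below are invented purely for illustration, not taken from any real player:

```python
# Sketch of two measures of batting skill. Batting average (the
# "traditional" stat) ignores walks; on-base percentage (a
# sabermetric staple) counts every time the batter reaches base.
# All numbers below are made up for illustration.

def batting_average(hits, at_bats):
    """Traditional measure: hits per official at-bat."""
    return hits / at_bats

def on_base_percentage(hits, walks, hit_by_pitch, at_bats, sac_flies):
    """Sabermetric measure: times on base per plate appearance."""
    return (hits + walks + hit_by_pitch) / (
        at_bats + walks + hit_by_pitch + sac_flies)

# A patient hitter who draws a lot of walks looks merely ordinary
# by batting average but quite valuable by on-base percentage.
ba = batting_average(hits=140, at_bats=520)            # ~.269
obp = on_base_percentage(140, 95, 5, 520, 5)           # .384
print(round(ba, 3), round(obp, 3))
```

The same plate appearances, scored two ways: one stat sees a .269 hitter, the other a .384 on-base machine.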


It’s Like Mensa, But With Less Self-Deprecating Humor

I read about this a few weeks ago, but I had hoped it would, as its proponents expect of themselves, die a quiet, dignified death.  Now that the post-Raines, pre-Keller New York Times has given it play, we’re probably stuck with it for a few more months.  Considerations of clubs-who-would-have-Groucho-Marx-as-a-member aside, it’s curious that people who ostensibly favor non-abbreviated thought would resort to such ham-handed self-branding.

If non-believers truly are such an oppressed minority, they might better serve their cause by sticking with irony, their first and best killer-app.  I’ve seen alternative suggestions for "doomed," "damned," and (as a kind of back-formation from the expropriation of "gay") "gloomy."  This last captures, I think, the pride in stoicism necessary to distinguishing adamant non-believers, but it lacks the pop-culture cachet needed to really speed the meme on its way.  I know!  Let’s call them "Eeyores":

"That’s right," said Eeyore.  "Sing.  A-le-le, a-le-loo.  Here we go gathering Nuts and May.  Enjoy yourself."

"I am," said Pooh.

"Some can," said Eeyore.

"Why, what’s the matter?"

"Is anything the matter?"

"You seem so sad, Eeyore."

"Sad?  Why should I be sad?  It’s Easter Sunday.  The most miraculous day of the year."

"Easter Sunday?" said Pooh in great surprise.

"Of course it is.  Can’t you see?  Look at all the miracles I’ve experienced."  He waved a foot from side to side.  "Look at the Risen Christ.  Salvation and life eternal."

Pooh looked—first to the right and then to the left.

"Miracles?" said Pooh.  "Resurrection?" said Pooh.  "Where?"

"Can’t you see them?"

"No," said Pooh.

"Neither can I," said Eeyore.  "Joke," he explained.  "Ha ha!"

Or something like that.


Genesis 2:19

Anyone who’s ever had to enter names in a database knows how frustrating it is to process "alternative" names.  Such names seem to be (and often are) wholly contrived without regard to historical, cultural, typographical, or sexual convention.  One suspects that the parents who choose such names do so out of transitory vanity and with utter disregard for the child who has to bear the result of their conceit for his or her formative years.  Everyone has a baby-name horror story; mine is a co-worker who named her daughter "Infinitee [sic] Unique."

One is tempted to advocate a policy similar to those of certain European countries which maintain lists of "acceptable" names from which parents must choose.  Such governmental intervention in so personal a matter seems obscene to Americans, but it is part and parcel of societies that lack the American cultural fetishization of identity and contempt for governmental registration.  In such societies, no one regards one’s "official" name as having a significant connection to one’s identity; your school records, passport, driver’s license, and marriage certificate may all read "Jean-Luc," but everyone calls you "Yo-Yo."  Like one’s language, ethnicity, religion, and social class, one’s name is considered part of the inevitable baggage for which one can hardly be accountable.  In exercising control over the naming of children, governments in such societies not only standardize record-keeping but also claim to protect children from the stigma of capricious names.  Despite declining marriage rates, European governments also affect a horror of the stigma of illegitimacy and so require that children take their father’s surname (in the case of unmarried parents, a father wishing to pass on his surname is required to make a formal Declaration of Paternity, both earning certain rights and incurring certain obligations).

In the United States, about the only tradition we still honor is self-re-invention.  Accordingly, we make it relatively easy for anyone 18 years or older to legally change their name to anything composed of ASCII characters.  Similarly, Americans have had little trouble accommodating (socially if not bureaucratically) hyphenated surnames for spouses and their offspring.  Nevertheless, changing one’s name is still most commonly associated with either marriage or an artistic career.  Many professionals (typically women) who are accomplished before marriage feel obliged to keep their maiden surnames while their children take what, absent professional concerns, would be their married surnames.

As both an American and an information worker with a fetish for etiquette, I’m not convinced that the "problems" of absurdly contrived first names or cumbersomely appended surnames require a "solution," least of all one imposed by the government.  What I would like to see is a greater social recognition of the congruence between one’s pre-majority name and one’s unemancipated status as a minor.  That is to say, I’d like it to be more commonplace for 18-year-olds to seriously consider changing their names. I don’t want it to be a requirement, of course, but it should be made as easy as possible, like Motor Voter Registration; on your 18th birthday, you receive a Name Change Form in the mail from your state.  I imagine most people would keep the name their parents gave them, but many would not, resulting in greater acceptance of names as impermanent, and a concomitant accommodation for non-traditional names or children with names different from their parents’.  If it becomes a rite of passage for a child to be "re-christened" when he or she attains majority, it becomes less important what surname the child is given at birth, and therefore there is less consequence to what surname the parents adopt at marriage.

As a father, I am well aware of the social role played by surnames in establishing and enforcing paternity.  I expect my proposal would be criticized by both fathers’ rights advocates and those who believe fathers need greater encouragement to support their families.  When my French wife and I married, she replaced her birth surname (identical to her father’s surname) with my surname (identical to my father’s surname).  This was entirely her choice, but of the reasons she gave for this decision the one I most easily supported was that her birth surname is (by both American and French norms) rather long and difficult to pronounce.  Had she demonstrated any affection for her birth surname at all, I would have seriously considered hyphenating, combining, or exchanging surnames, or simply encouraging her to keep hers.  As it was, she took my surname, and so there was little reason to give our son a different surname.

It would be foolish to pretend that I am completely indifferent to the fact that my son has the same surname as his father, his grandfather, his great-grandfather, and his great-great-grandfather (that’s as far back as I’ve ever researched; before that, they could all be Schickelgrubers as far as I’m concerned).  I would like to think, however, that were he, upon his 18th birthday, to change his name to Raymond Luxury-Yacht, I wouldn’t love him any less.  Indeed, I would be proud of him for participating in a time-honored American tradition.

He’ll have a hell of a time with his French passport, though.


Bringing A Machete To A Gun Fight

This weekend we quite enjoyed 28 Days Later despite a gnawing plot hole, which I discuss (warning: spoilers) here.

Setting the movie in Second-Amendment-less Britain was a smart move; berserk zombies aren’t nearly as frightening when one has access to Half-Life levels of firearms.

I’m always a sucker for Apocalypse movies, but the clincher was director Danny Boyle, he of Trainspotting fame.  More compelling to me, however, is Boyle’s freshman feature Shallow Grave, a more deliciously claustrophobic update of Hitchcock you will never find.  My roommates were lucky to get out when they did.


CRONing It In

Over at the Conspiracy, Sasha Volokh blows the lid off the blogosphere’s own nascent Rick Bragg scandal.


What Conservative Media?

It’s a truism that ideologues from the opposite end of the political spectrum are more conspicuous than those whose beliefs are closer to one’s own, but I don’t really feel the need to second-guess myself when I align with those observers who find risible the notion that liberal bias in the media is all-pervasive and needs to be defied by a plucky band of straight-talking iconoclasts.

Regardless of where one comes down in this "debate," one aspect seems to have gone unnoticed.  Advocates of the liberal bias theory have cited several instances in which the appellation "conservative" has been attached to speakers while their no-less-ideologically-driven opponents have gone unlabeled, giving the impression that the liberal view is "normal" and the conservative view is "extreme."  What I find at best curious and at worst disingenuous about these critics is their failure to recognize that this persistent labeling is the result of conservatives’ enormous success at branding themselves.

As liberals cast about for self-flattering explanations for the dearth of lefty equivalents of Rush Limbaugh and Bill O’Reilly, they would do better to attend to the astonishing discipline of Conservatives™ to stay "on message."  Like all other political movements, conservatives vary in their degree of zealousness and orthodoxy.  Just as some liberals disagree over fundamental issues, thoughtful conservatives differ over tactics, goals, and philosophies.  Yet these conservative schisms rarely appear in national debate (a refreshing exception has been occasioned by the recent Supreme Court decision in Lawrence).

It didn’t use to be this way, but in the last ten years or so (ever since a certain Presidential election), the Conservatives™ have aggressively expanded their brand awareness.  In listening to the same Conservative™ pundits on panel shows over several months, I have been amazed that ostensibly intellectual people could maintain the same degree of passion in defending Conservative™ positions on every issue.  This is not to say that there are no rigidly dogmatic liberals, or even that they are few in number.  They just don’t get booked on talk shows.  Liberal pundits seem to care more about exhaustively qualifying their own positions than focusing on the core of contemporary media advocacy: disparaging the opponent’s position.

In an age of increasing media saturation, producers need reliable product to reach target demographics.  Whether a conservative actually shares the Conservative™ positions in all instances is irrelevant; when he appears on Fox or Clear Channel, he puts on his game face and Exposes More Liberal Folly.  A liberal pundit is more likely to try to demonstrate that he is still the Smartest Kid in the Class.  Chris Kattan’s parody of Paul Begala on Hardball accurately (if hilariously) illustrates the fecklessness of liberals who agree to appear opposite Conservatives™ apparently without understanding how contemporary political discourse is presented.  This trend has become so pronounced and liberals have become so outgunned that for a conservative to break into the Conservative™ big-time these days she has to recognize that her frontal lobes are professional liabilities and to adopt hindbrain positions.

In analyzing the effects of Conservative™ influence upon political media and of political media’s influence upon the electorate, liberal journalists have let themselves become so bullied by charges of "liberal media bias" that they appear to believe that criticism of Conservative™ statements must be muted in the name of maintaining "objectivity."  This fails to recognize that political punditry has fallen into a Prisoner’s Dilemma, and the Conservatives™ have clearly defected.  To paraphrase Justice Robert Jackson, objectivity is not a suicide pact.  No censorship is so oppressive as self-censorship.
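To make the Prisoner’s Dilemma analogy concrete, here is a toy payoff matrix, with payoffs I’ve invented purely for illustration: each side can argue in good faith ("cooperate") or go on message-discipline attack ("defect").  Attacking a good-faith debater wins the news cycle; mutual attack leaves everyone in the visceral froth.

```python
# Toy Prisoner's Dilemma for punditry; the payoff numbers are
# invented for illustration. Higher is better for that player.
# Entries are (row_player_payoff, column_player_payoff).
PAYOFFS = {
    ("cooperate", "cooperate"): (3, 3),  # substantive debate
    ("cooperate", "defect"):    (0, 5),  # the attacker wins the news cycle
    ("defect",    "cooperate"): (5, 0),
    ("defect",    "defect"):    (1, 1),  # visceral froth for everyone
}

def best_response(opponent_move):
    """Whatever the opponent does, defecting pays more: the dilemma."""
    return max(["cooperate", "defect"],
               key=lambda my_move: PAYOFFS[(my_move, opponent_move)][0])

print(best_response("cooperate"))  # defect
print(best_response("defect"))     # defect
```

Against either opponent strategy, defection is the individually rational move, even though both sides would prefer mutual cooperation to mutual defection; hence a pundit who unilaterally stays "objective" simply loses.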

Update: Um, yeah.


Waiting For Herakles

Congratulations and condolences to Seattle’s neighbor, rival, and parallel-universe-twin Vancouver, British Columbia, on the occasion of being awarded the 2010 Winter Olympics.  For me, the chief benefit from this decision will be to further inoculate Seattle against ever succeeding in assuaging its endemic rubophobia by winning an Olympic bid for itself.

The only other international sports event comparable to the Olympics is, of course, the World Cup.  Though the (modern) Olympics are a few decades older than the World Cup, the host selection processes for both events have long since become so shot through with tediously sordid politics and corruption that no one can honestly claim to be shocked by them anymore.  Absent bribes and other tangible (if putatively illegal) criteria, the governing bodies must select hosts based on the often conflicting and certainly orthogonal principles of "infrastructure soundness" (who can afford to spring for the facilities) and "geographic equity" (who hasn’t hosted in a while).  The principle of "athletic tradition" waxes and wanes as new markets are imagined and disproved.  Unlike the World Cup Organizing Committee, which (for reasons having less to do with minimizing the aforementioned problems and more to do with stadium logistics) selects a host country in which several cities participate, the International Olympic Committee bestows its favor upon an individual city (or, in practice, a metropolitan region).  This creates the well-known paradox of Olympic bids: those cities which are best suited to hosting the Olympics are precisely those which least need the publicity and other ancillary benefits promised by Olympic boosters.

Apart from the construction and hospitality industries, I cannot see that a community reaps any lasting economic benefit from hosting the Olympics.  In order to improve their chances, bid cities incur enormous public financial commitments "on spec," and considerations of whether the infrastructure or tax base can support such commitments are glossed over with appeals to "civic pride."  A city with real civic pride would be able to soberly weigh the costs of hosting the Games and, if finding them prohibitive, say "No."  After being lofted by the rise of Microsoft, Nirvana, Starbucks, and Amazon, Seattle somehow mustered the restraint at the peak of the 90s bubble to refuse to sign onto the developers’ Olympic bandwagon (paying for the House That Griffey Bilked might have been a factor, as well).

Of course, some of those 2012 boosters are now claiming that Seattle will derive some secondary tourism increase in February 2010, as if anyone will want to drive three hours to the border and wait almost as long at Blaine before trying to find parking near the Biathlon range.  I think a more certain windfall is going to the wealthy skiers who own condos at Whistler, but hey, a rising glacier lifts all bobsleds, right?  At least Vancouver probably stands a better chance of getting its federal government to chip in than we would with ours.  So, good on you, Vancouver, and better you than us.