Posted by Ed
One of the movies I saw back in March was The Ladykillers, a remake of the Alec Guinness classic by Joel and Ethan Coen. I didn't find the film especially memorable (my own commentary on this blog ended by describing it as "entertaining enough, occasionally quite amusing, but nothing special overall"); I get the sense that my reaction might have been more negative if I were more familiar with the original Ladykillers produced by the Ealing studio.
Tonight I read Terry Teachout's review of the movie in Crisis, one of the most delightfully scathing commentaries on a movie that I can remember reading. The first paragraph is worth quoting in full, and the second is a decent introduction to the review:
Hope springs eternal in the human breast, except when you’re a critic. Sooner or later, there comes a time when you finally decide to give up on artists who've disappointed you repeatedly. I stopped bothering with Woody Allen, for instance, after Sweet and Lowdown (the only reason I went to see that one was because it was about a Django Reinhardt–like jazz musician), and now there is no possible circumstance not involving the exchange of large sums of money that could induce me to go see a new movie by him. I don't care what other critics say, or even what my moviegoing friends say: I just don't care anymore. Yes, I know it's well within the realm of theological possibility that Allen might someday be touched by grace and make a good movie, but if God doesn’t have better things to do, then I've got bigger things to worry about.
I made a similar decision about Joel and Ethan Coen after the most recent of their films that I reviewed for Crisis, O Brother, Where Art Thou? I’d liked some of their early work very much, Blood Simple and Miller’s Crossing in particular, but as the outlines of their style and the narrow limitations of their interests became clearer over time, I realized that intermittent interest had turned to active dislike. The older I get, the more acutely aware I become of the passing of time, and the less of it I want to waste on experiencing works of art that irritate me. After O Brother, the thought of letting the Coen brothers hustle me two hours nearer to the grave with yet another of their arch films-about-film was simply too depressing to contemplate, so I skipped The Man Who Wasn’t There and Intolerable Cruelty and felt quite pleased with myself for having done so.
It’s revealing, I’m sure, that The Ladykillers, like O Brother, Where Art Thou? before it, makes extensive use of gospel music for parodistic purposes. Once again, the music itself is terrific, but the uses to which it is put are both ironic and quintessentially postmodern: We are clearly supposed to be amused by all those benighted believers rocking joyously in their pews, even though Dey Got Rhythm and we sorry white folk don’t. That’s how postmodernism works—it plays both sides of the street, winking in either direction. That’s how The Ladykillers works, too, and that’s why it’s the last “comedy” by Joel and Ethan Coen I ever plan to see. Whatever else nihilism is, it isn’t even slightly funny.
Posted by Ed
In case you need still more tiding over, here are some more links:
This was the first time I had ever dipped into primary documents, better known in the trade as "manuscript collections." As a self-styled historian, an old English major with a queasy sense of being a highbrow fraud with a first book contract, I felt that I had touched history -- the real, perishable stuff. That night, still agog, I telephoned my wife from Boston and confessed to something deeper than a mere thrill. After years working as a journalist, hazarding educated guesses about the dumb show of business and politics, I had the odd sensation of having burgled Lamont's office, rifled his papers, violated his privacy and unmasked his secrets. To be sure, I had duly submitted call slips, sat at my appointed chair as boxes were retrieved and handled documents as gingerly as I would saintly relics. Yet I experienced the delicious, illicit frisson of being a second-story man, a literary thief, a scholarly voyeur. There was something furtive, lawless and absolutely irresistible about the whole enterprise.
Posted by Ed
I haven't had a lot of time to blog lately, I'm afraid: I've been fighting off a nasty late-spring cold, reading the fascinating Art of Memory (by Frances Yates) and the intriguing Taming of Chance (by Ian Hacking), writing some final revisions to my dissertation proposal, and finishing up my work as a lector for the writing program. One of these days I'll post my commentary on Lynne Truss's Eats, Shoots and Leaves and my thoughts on Peter Jackson's Lord of the Rings trilogy, but until I do, here's a tiny tidbit to tide you over:
CORRECTION OF THE WEEK. If not the month. From today's Boston Globe (not online yet):
Because of a reporting error, Dr. Arleigh Dygert Richardson III, former teacher at Lawrence Academy in Groton, was described in his obituary yesterday as favoring tacky pants with tweed jackets and Oxford shirts. Dr. Richardson favored khaki pants.
Posted by Ed
The finals of the National Geography Bee were held in Washington, D.C., yesterday, and a 14-year-old eighth grader from Kansas went home as the champion. (Click here for a National Geographic Society press release on the finals, here for a press release on the first round, and here for a newspaper article on the event.)
I've watched the results of the competition with interest every year since 1991, when I competed in the bee myself. (I won in Massachusetts, went to the national event, and missed making the top ten when I answered the last of thirteen tie-breaker questions incorrectly. Almost enough to make you superstitious, huh?) I think the competition is a fantastic way to increase interest in geography among middle-schoolers and an even better opportunity to recognize the strengths of kids who are interested in academics; I also think it provides a fascinating look at a lot of issues in contemporary society. (Why do more boys than girls compete at each level of the competition? Should we be surprised at the high number of home-schooled children?) I wish that this event could get at least half the publicity of the National Spelling Bee, which seems like a far less interesting competition...
Update: I realized, after typing this entry, that I used the old name for the competition, rather than its new, preferred title. I can't help but think that "The National Geographic Bee" sounds silly, though, so I don't think I'll bother editing the entry above. Is the National Geographic Society really that desperate to make sure it gets recognition for the event?
Posted by Ed
A small bit of news for the Harry Potter fans among you: J.K. Rowling now has a website. (The Guardian describes the launch of the website and the world premiere of the third Harry Potter movie here.)
I wish I could tell you that the site is amazingly exciting, but it's not. It's a little busy for my tastes, but given that I'm at least 15 years older than the site's target audience, I don't really mind; I've learned a handful of things from the site so far, namely, that Ginny's full name is Ginevra, that there was originally going to be an unpleasant Weasley cousin in the fourth book, and that Crookshanks is half Kneazle. (No, I have no idea what that means, though I'm sure that a quick web search would enlighten me.) The most interesting parts of the site include some of Rowling's notes on the books, as well as her discussion of scenes she cut from her novels. (Nearly Headless Nick was originally going to sing a song about how he died, for example.)
Overall, then, I'd be really intrigued with this site if I were a kid. Given that I'm a doddering old history grad student, I think it's kind of cute.
Update: The Telegraph features an interview with another children's writer from the British Isles, Eoin Colfer. I was never very impressed by Colfer's Artemis Fowl (and not just because I consider it sacrilegious to name a male character Artemis); it's the sort of book that I'd have enjoyed as a kid, but which isn't very appealing to anyone over the age of 14. (It also reads suspiciously like the novelization of a movie, so I guess we shouldn't be surprised that a film version of Artemis Fowl is coming soon to a theater near you...) The interview, though, is kind of entertaining. (via Bookslut)
Posted by Ed
In yesterday's New York Times Sunday Magazine, Christopher Caldwell discusses the recent controversy over whether Vladimir Nabokov got the idea for Lolita from a 1916 story by an obscure German writer; that story, also named "Lolita," describes the male narrator's obsession with a young girl. I didn't find Caldwell's article terribly interesting (the same issues have been dealt with more compellingly by other writers), but it did introduce me to a delightful new word: cryptomnesia.
Here's how Caldwell discusses the controversy:
Earlier this spring, Michael Maar, a literary scholar, speculated that the name Lolita may have been similarly interwound with Nabokov himself. In Die Frankfurter Allgemeine Zeitung and in The Times Literary Supplement, Maar alerted readers to a 1916 short story called ''Lolita,'' by an obscure Berlin writer, Heinz von Lichberg. That von Lichberg later served on the editorial board of a notorious Nazi publication heightened the frisson of scandal.
In the earlier work, as in the later, a first-person male narrator describes an obsession with a young girl named Lolita that entails long travels and ends in death. Maar finds the coincidence of plot, narrative and name ''striking.'' He does not accuse Nabokov of plagiarism, since ''he was a genius on his own.'' (As some are too rich to steal, apparently, others are too smart to crib.) Maar prefers the word ''cryptomnesia,'' a process by which things are learned, forgotten and then mistaken for original inspirations when recalled. Since Nabokov lived in Berlin from 1922 to 1937, Maar asks, could he have been under the ''stimulus'' of von Lichberg's story? If so, what does that tell us about one of the last half-century's most famous -- and notorious -- works of fiction?
Cryptomnesia, then, is a fascinating concept. (One writer defines it as a phenomenon "in which one remembers the content of something to which one had been exposed without remembering the event of reading or seeing it before, so it seems that one is thinking of it for the first time.") Until reading Caldwell's article, I wasn't sure whether "cryptomnesia" was an already existing term or a neologism coined by Maar; for all I knew, it might fall under the category of words that don't exist, but should. (Barbara Wallraff writes a "word fugitives" column on this subject for The Atlantic Monthly.) I'm sure I've been a victim of cryptomnesia at one point or another (perhaps even on this blog), and I agree that the concept should have a lot of appeal for anyone interested in words and writing. It's also common in business, and I believe there's even a Seinfeld episode in which Elaine unintentionally plagiarizes a Ziggy cartoon. My curiosity piqued, I decided to investigate further. It seems that the term has a lively history, connected not only with allegations of plagiarism but with Jungian psychology and investigations of the paranormal.
My first stop was The Oxford English Dictionary, which features the following passage in its section of "representative quotations":
a1901 MYERS Hum. Pers. (1903) I. p. xvi, Cryptomnesia, submerged or subliminal memory of events forgotten by the supraliminal self. Ibid. II. 136 ‘Cryptomnesia’ (as Professor Flournoy calls submerged memory). Ibid. 140 This cryptomnesic automatism. 1916 C. E. LONG tr. Jung's Coll. Papers Analyt. Psychol. 91 The rudimentary glossolalia of our case has not any title to be a classical instance of cryptomnesia. Ibid., The cryptomnesic image arrives at consciousness through the senses. 1961 W. H. SALTER Zoar x. 138 Latent memory (cryptomnesia) is therefore left as an alternative explanation to sheer chance-coincidence.
Cryptomnesia is, literally, hidden memory. The term is used to explain the origin of experiences that people believe to be original but which are actually based on memories of events they've forgotten. It seems likely that most so-called past life regressions induced through hypnosis are confabulations fed by cryptomnesia. For example, Virginia Tighe's hypnotic recollections of Bridey Murphy of Cork, Ireland (Bridie Murphey Corkell), if not deliberately fraudulent, are most likely recollections of events that happened in this life but which she had forgotten.
Cryptomnesia may also explain how the apparent plagiarism of such people as Helen Keller or George Harrison of the Beatles might actually be cases of hidden memory. Harrison didn't intend to plagiarize the Chiffons' "He's So Fine" in "My Sweet Lord." Nor did Keller intend to plagiarize Margaret Canby's "The Frost Fairies" when she wrote "The Frost King." Both may simply be cases of not having a conscious memory of their experiences of the works in question.
The first recorded incidence of cryptomnesia came in 1874 and involved the English medium William Stainton Moses. In a seance, Moses said he had contacted the spirits of two young brothers who had recently died in India. The deaths were quickly verified by a check of the records, but further research showed that an obituary had run in a newspaper six days before the seance; everything said in the seance appeared in that obituary, and nothing more was added.
A good indication of cryptomnesia is when a person gives information containing known errors that have been printed elsewhere. Such an incident occurred in 1977, when a past-life regressionist hypnotized a 23-year-old woman, Jan, on British television. The woman told of Joan Waterhouse, a famous witch of Chelmsford who had been tried and set free in 1566, but she gave the date as 1556. Experts were quick to dismiss the recall as cryptomnesia because Jan gave the incorrect date: the date of 1556 appeared in a Victorian reprint, of which there were only two copies, one of them displayed in the British Museum. It was possible that Jan had seen it. Although she had only a grade-school education, her other accounts of the major characters and details of the trial were accurate.
Ultimately, though, there's one language authority whose opinion of cryptomnesia would really interest me. He's dead, unfortunately, but he was a writer whose works showed a remarkable love of language and wordplay, and whose corpus was remarkably learned (though more than a little pedantic); most importantly of all, he now has a personal stake in the question. I'm referring, of course, to Vladimir Nabokov. If he were alive today, I'm sure he'd have something fascinating to say on the subject...
Posted by Ed
This week's New Yorker has a nice profile of Barack Obama, the Democratic candidate for the Senate here in Illinois. Here's a priceless anecdote:
[Congresswoman] Jan Schakowsky told me about a recent visit she had made to the White House with a congressional delegation. On her way out, she said, President Bush noticed her “obama” button. “He jumped back, almost literally,” she said. “And I knew what he was thinking. So I reassured him it was Obama, with a ‘b.’ And I explained who he was. The President said, ‘Well, I don’t know him.’ So I just said, ‘You will.’”
(For more on Obama, check out his campaign blog, which may or may not include his campaign theme song.)
Posted by Ed
I'm just back from a weekend trip to Berkeley, and I'm too tired/lazy/busy to write anything substantive of my own. Instead, here are some recent articles that have caught my interest:
Just as pizza, once a Neapolitan speciality, spread throughout Italy as a result of its popularity in America, so Tantra's reputation in India was significantly affected by its notoriety in Europe. Today, many scholars both within and without Hinduism insist that the sort of hard-core Tantra that White describes never existed and that Tantra has always been solely a technique of meditation. When scholars of this ilk encounter the blatantly sexual statements of the hard-core texts (and the Tantras do contain statements like: "The body of every living creature is made of semen and blood. The deities who are fond of sexual pleasure drink semen and blood"), they interpret them metaphorically, somewhat in the manner in which rationalizing Greeks interpreted their own myths as allegories.
The problem with the book comes not from the abundance of anecdotes and details, but from Montefiore's unwillingness to throw any of them out. That reluctance precludes building any kind of meaningful structure. An enigma of totalitarianism -- one of the things that inspired Orwell, Arendt and others to think hard about it -- was its effort to destroy the difference between the public and the private realms. It made them both equally subject to the absolute demands of ideology. In effect, Montefiore does the same thing, but by reducing everything to the dimensions of trivia.
Before reading "Stalin: Court of the Red Tsar," for example, I did not know that one member of the British delegation to Moscow on the eve of World War II was the author of a book called "Handbook on Solar Heating." And now I do know. But why? That, like Soviet history itself, remains an enigma.
Posted by Ed
A year ago, the online history journal Common-Place featured an article on an upcoming TV show in which historical documentary-making and reality TV would come face-to-face: Colonial House. To produce that program, 26 "colonists" spent six months in a mock settlement on the coast of Maine, talking, working, acting, eating, and thinking as if they were residents of a 17th-century community. Colonial House is going on the air this week, appearing for two hours on May 17, May 18, May 24, and May 25.
I never know what to make of programs like this, and I find them easy to mock. As Common-Place puts it,
It's easy for academic historians to take potshots at the reality-television impulse driving Colonial House and the "Mayflower Project." "They should set it in Jamestown," one colleague mused. "Then all the colonists would be young and male, and we'd see them resort to cannibalism before they all gave up and died in the season finale."
For me, Colonial House raises a series of different questions--questions about what makes the past worth studying and about how we compare the present day to the world of 1628. As Common-Place points out, for example, the show's world-view is informed by its emphasis on a historical "fear-factor." The show's slogan is "No TV. No phone. No electricity. No computers. No experience necessary"; the show emphasizes the "challenges" and "rigors" of a "harsh" New England life ruled by "stringent" laws. Consider this paragraph from the show's website:
Think colonial life was all about pious Pilgrims, powdered wigs and freedom for all? Think again! Two dozen modern-day time travelers find out the hard way what early American colonial life was really like when they take up residence in COLONIAL HOUSE, public television's latest hands-on history series from the producers of FRONTIER HOUSE, MANOR HOUSE, and the Peabody Award-winning THE 1900 HOUSE.
Perhaps I'm just a snobbish history grad student, but I'm much more interested in how colonists viewed their world than in how they survived a cold winter. According to an article in Slate, the creators of the show hoped to accomplish this goal, but were never entirely successful:
Colonial House's predecessor, the entertaining Frontier House, charted three families' efforts to live like 19th-century Montana homesteaders. Colonial House ups the ante considerably. In addition to eating, sleeping, working, and playing like it's 1628, the 26 participants are expected, as cast member Mrs. Michelle Rossi-Voorhees (the wife of a freeman) puts it, to occupy the "head space" of early English colonists. It's not enough to use crude tools and wear scratchy clothes as they did in Frontier House; in Colonial House, the participants are supposed to think and behave and relate to each other as if inhabiting a different time. If in real life you're an educated, outspoken woman, say, you're expected to mind your tongue in the colony. The idea is interesting in theory, but in practice the premise is too heavy a burden for these otherwise smart, well-meaning people to bear. The harder the participants work at being true to the past, the more they look like products of the present.
The most glaring departure from 1620s life came when one cast member (a graduate student playing an indentured servant) decided to come out of the closet--a decision that seems historically questionable. (Slate quotes the student noting that the governor would probably have taken him out and had him killed; I have my doubts about that, but I also doubt that such a conversation could have taken place at the time, and I suspect that 17th-century Puritans would have been baffled by the 21st-century conception of homosexuality.)
Slate continues to address the show's tenuous grasp on historical reality in the following paragraph:
And then there's Gov. Jeff. I never thought I'd be in sympathy with a conservative Baptist minister from Waco, Texas, but Jeff Wyers, playing the colony's governor, seems to be the only person who wants the show to be what it was intended to be. Last night, he attempted to model the colony on the Puritan ideal of a utopian "City on a Hill." But when Gov. Jeff lays down the law—no profanity, women must cover their hair, mandatory church attendance on the Sabbath—almost everyone, in his or her own way, rebels. Saucy indentured servant Paul Hunt keeps swearing up a storm; Mr. and Mrs. Voorhees ditch church to go skinny dipping (!); while freeman Dominic Muir sneaks off to town for a beer and a plate of fries. Implementing historically accurate enforcement measures—wearing scarlet letters, being tied to a wooden stake—proves to be a modern pain in the ass, and work is brought to a near halt. Because the colony is expected to be financially viable (project rules dictate that the cast pay off imaginary "investors"), Gov. Jeff capitulates, proving that at least he's well-versed in that most modern of religions—expediency.
This situation does, indeed, sound like a mess, but it might be fun to watch the show anyway. At the same time, I'm not even sure that this picture of colonial life is completely unrealistic historically: in doing genealogical research on my colonial ancestors, I've learned about forebears of mine who were fined by the Puritan authorities for skipping church and swearing in public. (My favorite ancestor was fined for skipping church 14 Sundays in a row--and when it was time to pay the fine, the records say, he "was nowhere to be found.") I've also found that colonial Massachusetts was a hilariously litigious society: certain ancestors of mine were always suing their neighbors or being taken to court, often over matters as trivial as the loss of a silver bodkin.
What would be interesting, I think, is a historical TV show that looks at the conflicts in colonial Puritan society and asks why they came about. (I doubt that what caused a rift between my ancestors and the authorities was a fervent desire to go skinny dipping and eat lots of French fries, after all!) I don't think a reality TV show is the way to do this, but even if it were, the best way to convince participants to enter the "head space" of colonists isn't to give them a list of rules or to tell the women to mind their tongues. You can debate the best way to understand the world of the colonist; you can conceive of your project in historical or anthropological terms, and describe your subject of interest as the "behavior," "attitudes," "beliefs," "mentalités," or "subjectivity" of colonial settlers, depending on the nature of your approach.
I probably shouldn't make sweeping pronouncements about Colonial House without having seen the show, but I suspect that a program like this is more effective as entertainment than as history: it's probably better at introducing viewers to "the historical fear factor," and teaching them ahistorical lessons about how people deal with adversity, than it is at teaching them about the past. What's more, I suspect that I'd believe this even if I weren't a history graduate student. After all, which sounds more interesting to you: a documentary about lawsuits and community conflict in 17th-century Massachusetts, or a show in which Governor Jeff gets mad at his skinny-dipping colonists?
Posted by Ed
As many readers of this blog undoubtedly know, Shrek 2 is being released in theaters today. The reviews have been very favorable so far, and I enjoyed the first movie in the series, but I feel an unusual degree of ambivalence about the sequel.
For those of you who don't know, the character Shrek--an ugly, green ogre--was first introduced in a children's picture book by William Steig. Steig died last fall at age 95 and was best known as a cartoonist at The New Yorker, but his book Shrek has won him a cult following in certain circles. A.O. Scott briefly mentions the book's popularity in his review of Shrek 2, concluding with a paragraph that captures some of my distaste for the first movie in the series:
Mr. Steig's "Shrek" is a celebration of ugliness that also happens to be one of the most beautiful children's books ever written, with respect both to its pictures and its prose. Of course it is unfair to compare that slim volume to the franchise it has spawned, which is a phenomenon in its own right. Certainly "Shrek 2" offers rambunctious fun, but there is also something dishonest about its blending of mockery and sentimentality. It lacks both the courage to be truly ugly and the heart to be genuinely beautiful.
On my old blog, I commented on a related topic several months ago when I read a Boston Globe article about the portrayal of beauty in fairy tales. I found one particular comment in that article both amusing and irritating:
Grauerholz hopes the fairy tales will continue to evolve to include ordinary-looking or ugly characters as the heroines or heroes, as in the 2001 animated film "Shrek," whose happy ending has a beautiful princess turning back into an ogre and leaving the prince at the altar.
"What I think is so interesting about 'Shrek' is that it's the opposite of that ugly duckling story. The princess becomes the ogre in the end. But that whole movie was about twisting the fairy tales around," Grauerholz says.
This is where my argument gets complicated. I suspect that the creators of Shrek thought that their ending to the movie was extremely clever and creative--in the end, the prince didn't get the princess, and the princess turned ugly! Instead, however, the ending struck me as a cop-out. One of the main questions of the movie was whether two very different people could fall in love with each other, and instead of deciding that they could, the creators of Shrek decided that they'd need to become the same after all. Instead of glorifying ugliness, the movie glorified conformity.
This decision came at the end of a movie that prided itself on being "subversive." (Try googling "Shrek" and "subversive" and you'll get over 2,000 hits!) The first half of the movie relentlessly makes fun of Disney, of past animated cartoons, and of the idea of the fairy tale in general; the villain (Prince Whatever-His-Name-Is) is said to be based on Michael Eisner, and the message of the movie seems to be "We're cleverer than those Disney cartoonists ever were, and you're clever too, since you understand all the sly pop culture references we've included in the script!" Most of these references will fly over the heads of the kids in the audience, and the whole movie has a bit of a self-congratulatory air.
I don't want to overstate my criticisms of the movie, since--like most people--I liked Shrek overall. Still, the movie's humor feels too cynical, too smirky, and too self-congratulatory for my tastes; it also feels, like many Pixar movies, as if its creators had used focus groups and target audiences to craft the ultimate crowd-pleaser. (Compare it to a quirkier movie like The Triplets of Belleville, which resembles Shrek in its occasional bawdiness but which feels like it came unbidden from the mind of an eccentric genius.) Shrek differs from Pixar movies like Finding Nemo and Monsters, Inc. in that it seems even more determined to appeal to grown-up audiences through bawdy humor or subtle pop culture references, and it seems determined to emphasize its own coolness and sophistication in doing so.
My two main criticisms of Shrek come together in the ending. The first half of the film builds itself up as a witty, self-referential take-down of the traditional fairytale; in the end, however, it backs away from a complete break with the form. Imagine how much more entertaining the movie would have been if Fiona had stayed a human but had still run away with Shrek. The ending would be far less sappy, the new scenario would have lots of opportunities for humor, and the final outcome really would, in a sense, be subversive: Fiona would be sticking it to all the people who thought that Shrek was too ugly! The actual ending, however, is sappy and boring, and doesn't fit very well with the beginning of the movie; moreover, when you strip away the movie's subversive aspirations, the only anti-fairy tale elements of the story that remain are the self-congratulatory and cynical strains of the movie's first half. The movie has a contrarian exterior, but its heart is completely conformist.
Do these same criticisms apply to the sequel? A.O. Scott, it seems, agrees with me:
In terms of its attitude toward the audience, DreamWorks 3-D animation is in some ways the opposite of Pixar, choosing to divide its viewers by age rather than uniting them. The music (including Butterfly Boucher's cover of David Bowie's "Changes" and a rendition by Mr. Banderas and Mr. Murphy of "Livin' la Vida Loca"), the in-jokes and the occasional touches of bawdiness are intended to placate insecure adults while the bright colors and jaunty storytelling enchant their children and teach them to be themselves, like all the other kids with Shrek dolls and ears.
This kind of strategy is hardly uncommon in pop culture these days, and "Shrek 2" executes it with wit and aplomb. The script, by Mr. Adamson, Joe Stillman, J. David Stem and David N. Weiss, has jokes that grown-ups and precocious kids will congratulate themselves for getting, and plenty of broader humor (which actually works better). The movie's goal is to enchant children with an old-fashioned fairy tale while simultaneously mocking and subverting its fairy-tale and nursery-rhyme premises. This is sometimes enjoyable and genuinely imaginative (appearances by the Gingerbread Man, who looks like Mr. Bill of "Saturday Night Live," and the Three Blind Mice are especially clever), but it also leaves a sour, cynical aftertaste.
Posted by Ed
This week's Village Voice features a fascinating article by Rick Perlstein, discussing the White House's consultations with apocalyptic Christian groups before it announced its new Israel policy:
The e-mailed meeting summary reveals NSC Near East and North African Affairs director Elliott Abrams sitting down with the Apostolic Congress and massaging their theological concerns. Claiming to be "the Christian Voice in the Nation's Capital," the members vociferously oppose the idea of a Palestinian state. They fear an Israeli withdrawal from Gaza might enable just that, and they object on the grounds that all of Old Testament Israel belongs to the Jews. Until Israel is intact and David's temple rebuilt, they believe, Christ won't come back to earth.
Abrams attempted to assuage their concerns by stating that "the Gaza Strip had no significant Biblical influence such as Joseph's tomb or Rachel's tomb and therefore is a piece of land that can be sacrificed for the cause of peace."
Posted by Ed
Tired of reading about Troy? (I wouldn't blame you, especially if you tried to read all of my rambling reflections on the movie...) Sunday's San Francisco Chronicle features a review of the autobiography (subtitled "an animated life") of Ray Harryhausen, who did the special effects for 1981's Clash of the Titans:
After years of neglect, Harryhausen and his work are belatedly getting the attention they deserve. The artist recently received a star on Hollywood Boulevard, he appeared in a cameo in "Spy Kids II," and Pixar animators named a trendy restaurant after him in "Monsters, Inc." His new memoir lives up to the subtitle "An Animated Life," as it focuses almost exclusively on his career, from his early short films to the features that showcased his special effects wizardry.
He offers a warm tribute to his mentor Willis O'Brien, the stop-motion pioneer who animated "King Kong," and praises his long-time producer Charles Schneer. He writes enthusiastically about the actors who had to cope with the monsters he created, including Raquel Welch, Harry Hamlin, Maggie Smith and Laurence Olivier. Harryhausen's only harsh words are reserved for unnamed directors who failed to understand the need for his input when shooting sequences that would involve animation, and for misguided publicity campaigns that hurt films at the box office.
Posted by Ed
Over the weekend, Susan and I saw Wolfgang Petersen's Troy--as did a big chunk of the University of Chicago quizbowl team and most of the rest of the city of Chicago. There's a lot to say about the movie, but I'm too lazy to write an organized and coherent review of it; instead, this entry will consist of some overall thoughts on Troy, followed by a series of comments on specific parts of the production. Some random overall reflections:
A lot of the problem with Troy, then, had to do with the way the project was conceived. I also had some issues with the movie's execution, however, and the casting in particular struck me as very uneven:
Troy's script was even less inspired than LOTR's. There were no memorable lines, and a lot of the dialogue sounded stilted and unconvincing; in that sense, it reminded me that the Lord of the Rings scripts were--despite their flaws--more than good enough. (They weren't Shakespeare, of course, but they conveyed a sense of passion and intensity that was completely missing from Troy.) On the other hand, Troy's script was infinitely better than the script for either Star Wars prequel: there were no lines as awful as Anakin Skywalker's moving denunciation of sand in Attack of the Clones. (Then again, I guess Achilles's line "In a time of war, you gave me peace" does come close...) In that sense, Troy drives home just how hard George Lucas must have worked to come up with scripts as awful as those of The Phantom Menace and Attack of the Clones--and reminds us that LOTR's scripts, while far from perfect, really weren't so bad after all.
Even so, I have to admit that a more dreadful script would have made the movie more entertaining. Then again, I also thought that the movie would be more fun if a giant cartoon lightbulb had appeared just above Sean Bean's head when he saw a man carving a little horse figurine and suddenly had an idea for how to get into Troy...
The frequent references to Patroclus as Achilles's "cousin" do help explain the dramatic (and otherwise unconvincing) change in Achilles's character late in the movie, however. [ineffective sarcasm ahead] After Priam shows up in Achilles's tent and begs him for the body of Hector, Achilles suddenly has a massive change of heart and becomes a straightforward good guy (rather than a spoiled brat) for the rest of the movie. Here's my theory: Achilles was jealous of Hector because Briseis said he was her "cousin," and in Achilles-speak, "cousin" means "man with whom I'm having passionate sex." Achilles therefore became even more determined to kill Hector. But when Priam showed up, Achilles found out that Hector and Briseis were related after all, and so he feels bad and undergoes an inexplicable personality transplant. [/ineffective sarcasm] (Hey, it makes as much sense as what actually happened!)
I think Petersen and company realized that there was an issue here, which was why they killed off Agamemnon and Menelaus. But they still treated the Trojan War as a grand and exciting expedition, and acted as if the quest for fame and glory was a worthy aim in itself. It would have been fantastic if Troy had been even more blatantly pro-Trojan, and the war had been treated as a silly, pointless, and tragic waste of lives--The Iliad, after all, can be read as an anti-war polemic. As it was, the message of Troy was a garbled mess: it seems to be that joining a war to win fame is a good thing, even if the leaders of that war are jerks who deserve to die and the other side is far more virtuous and decent.
What was most problematic about Troy, I'd argue, was that it seemed completely uninspired and uninspiring. It's almost as if a bunch of movie executives had gotten together, racked their brains to come up with another epic so they could build on the success of The Lord of the Rings, and randomly come up with the idea of a Trojan War movie--without understanding what made LOTR successful and what makes The Iliad a brilliant work. They assumed that if they included "stirring" (make that, trite) speeches about fame and glory, audiences would feel inspired and invigorated, and forgot that we won't necessarily care about Achilles and Agamemnon, even if they are really famous. (Troy took it for granted that Achilles and company were great heroes, and assumed that if the theme of the film was the quest for fame, viewers would enjoy watching.) The movie really had no reason for existing, and without a real sense of engagement and passion, it seems like a monumental waste of time, effort, and money.
[Hmmm... In retrospect, it wasn't especially effective to write this entry using bullet points. That method enabled me to get it written quickly and efficiently without expending much time or thought, however, so I guess it was good enough...]
Posted by Ed
Over the last few years, The Washington Monthly has changed its reputation completely: it used to be known as a boring and stodgy old magazine for policy wonks, but since it changed editors a few years ago, it's become much livelier and more interesting. Now the magazine even features articles on American culture--touching on themes like "addictive allure of Home and Garden Television" and "the contempt of courtship" in contemporary American life.
The current Washington Monthly features an article that may interest some of this blog's regular readers: it describes the new poker craze among 20-something American males. I'm not convinced that this is good cultural criticism, however: given how behind the times I am, does the fact that the magazine is commenting on a trend I've noticed mean that it's in tune with the times--or that it's as backward as I am? Given the people I know who seem to love playing poker, however, I think the article's characterization of the game as "baseball for the unathletic" (and of poker players as "five-card nerds") is dead on. I'll never understand the appeal of the game, I'm afraid, but, then again, I also can't understand why movies like Rounders and Ocean's Eleven could become "cult films on college campuses." Is the article even right about that? If so, then I'm worried about the future of American movies!
Posted by Ed
Just a quick note: today's Washington Post features a profile of Rashid Khalidi, a former University of Chicago professor (with whom I took a course on Orientalism my first year). He has a new book out, declaring the Iraq war a "policy born out of ignorance."
The article as a whole struck me as okay, but far from outstanding. One particular passage bothered me:
"There are kids I know in the service," he says, worried about what will happen to them if they're captured. "Every time I hear these laptop neocons talk about international law. . . ."
He trails off, disgusted. "Neocon" and "neoconservative" are among Washington's most fraught rhetorical markers, used by some people in much the same way that "liberal" was once used to dismiss an entire category of supposedly failed thinking. Others, including the AEI's Pletka, see a more sinister resonance.
"I think the phrase 'neocon' is much more popular among people who think it shields their anti-Semitism," she says. "But it doesn't."
Moreover, Pletka's broader charge--that the term "neo-conservative" is most often bandied about by anti-Semites--strikes me as greatly overstated. A majority of the people often described as neo-cons are Jewish, it's true; moreover, when someone like Pat Buchanan uses the term, I think it often does function as a code word for "Jewish." In the vast majority of cases, however, my sense is that the term has nothing whatsoever to do with Judaism--some leading neo-cons, like Donald Rumsfeld, aren't even Jewish. My bigger concern with the term is that it just isn't very helpful. (What's the difference between a neo-con like Paul Wolfowitz and a more traditional conservative?) The term is often used to suggest that a cabal of extremist presidential advisers has foisted a disastrous policy on the country (which shifts the blame away from where it ultimately belongs, with the president); plenty of descriptions of neo-conservatism overstate its connections to Leo Strauss or make shallow analogies to Trotskyism, which doesn't help clarify the ideology of many of the president's foreign policy advisers. I might well avoid overuse of the term myself, but I think it's a poor choice of words because of its vagueness, not because of its connection to anti-Semitism. At least so far...
Posted by Ed
The new issue of The Atlantic Monthly features a review, by the cookbook writer Ann Hodgman, of Laura Shapiro's Something from the Oven: Reinventing Dinner in 1950s America. In this book, Shapiro attempts to explain the rise of processed food like cake mixes, jello, and TV dinners, looking at factors like "the food industry's pushiness, advertisers' wiliness, [and] consumers' eagerness to wolf down trainloads of salt, sugar, and preservatives":
Shapiro began this saga in an earlier book, Perfection Salad (1986)—a charmingly idiosyncratic look at the way home cooking changed in this country during the early part of the twentieth century. Perfection Salad covered the horrors wreaked on middle-class food when nutritionists, home economists, and other "domestic scientists" got hold of it and turned everything into Jell-O salad and white sauce. Something From the Oven picks up the story after World War II, when standardized food was already entrenched in America. (Shapiro considers the 1950s to be the period from the late 1940s to the mid-1960s.)
As Shapiro tells it, the post-World War II food industry, bursting with tricks it had learned for feeding soldiers overseas, was eager to train Americans "to develop a lasting taste for meals that were a lot like food rations"—dried, reconstituted, indestructible. The offerings included dried wines, a potato snack called Tatonuts that was touted as having "strong resistance to weather conditions," canned hamburgers, and—I swear—frozen concentrated mineral water. Meanwhile, magazine and newspaper publishers did all they could to persuade the American housewife that she had no time to cook.
Posted by Ed
This evening, ABC will present a made-for-TV movie based on Madeleine L'Engle's A Wrinkle in Time. Here's an exchange from a recent Newsweek interview with L'Engle:
NEWSWEEK: So you’ve seen the movie?
Madeleine L’Engle: I’ve glimpsed it.
And did it meet expectations?
Oh, yes. I expected it to be bad, and it is.
The ads for the Wrinkle in Time movie looked horrible, and I was amused that they advertised "an all-star cast featuring Alfre Woodard" (wow!), but I'll have to tune in to see how bad it really is.
Incidentally, A Wrinkle in Time is arguably the best work of literature to begin with the sentence "It was a dark and stormy night." The other leading contender is Edward Bulwer-Lytton's Paul Clifford, which is more famous for that opening line.
Posted by Matt
The past few days the University of Chicago has held its annual Scavhunt competition. It's a large scavenger hunt, involving finding or building odd items or competing in various other ways. I enjoyed this year's hunt less than those of previous years, and to some extent I fault the list, though there were certainly factors not directly related to the hunt that made a difference. Still, the list had a lot of good things. Congratulations to the Snell-Hitchcock team for taking first; I was working with the second-place team.
One of the items I worked on was a Tesla coil. We attempted to build a large one, and were fortunate to know someone with a secondary coil already wrapped (one of the most time-consuming parts). Unfortunately, our high-frequency choke kept coming detached from its wire, which wasted a great deal of time, and we could find no suitable capacitor. Preliminary tests of part of the circuit indicated things weren't working quite right, and building a Leyden jar would have been dangerous, so after long hours and much debate, we finally gave up and settled on attempting witty showmanship with a small Tesla coil used in some of the undergraduate teaching labs here. The item called for us to power a vibrator, a lava lamp, and a theremin with the coil. The coil we had, as one of my teammates observed, already vibrated well enough. Christmas tree lights might resemble miniature lava lamps if you squint. But best of all, one of my teammates was able to play Jingle Bells with the sound of the sparks between the small coil and the impressive-looking secondary from our attempt at a large coil. This was really excellent; it even sounded mostly in tune, although the highest note had to be pretty flat. The pitch changes based on the distance the spark has to jump. This isn't so far from how in a theremin the pitch changes when you move your hands, right? I think it went over fairly well, though I don't know what fraction of the points we got. Probably not as many as last year when we convinced them we did the Rutherford scattering experiment with a digital camera, an americium source from a smoke detector, and a bottle of Goldschlaeger. On the other hand, this time several of us are left with an unhealthy desire to build a large Tesla coil in our spare time, despite having practically no electronics experience.
At the moment I think I've been awake about 39 hours. I ate and slept little over the course of the hunt. As Subash has observed, long periods of sleeplessness have interesting psychological effects. In my case I think it mostly enhanced frustration arising both from the difficulties with the items and from other sources, so I spent too much of this hunt pacing and thinking about problems without moving toward a solution. It seems almost to be the reverse of what Subash described. I feel like not eating anything resembling a normal meal for a few consecutive days had more adverse effects than the relatively mild ones of not sleeping for one night. Anyhow, at this point I'm finally starting to feel like I might be able to sleep, so I'll leave this post by urging you to look over the Scavhunt lists online from past years if you're curious. A lot of creativity went into creating the lists, as well as into solving them, and the judges (who produce the list) each year should be applauded for their work.
On another note, this year's hunt had one overly weird item, in which someone ate his own umbilical cord that had been saved since he was born. That was way too bizarre.
Now I really have to sleep before I stop making any sense at all.
Posted by Ed
This week's Village Voice features a short but fascinating review, by Allen Barra, of a book on the history of female chess pieces. Here's an excerpt:
Chess, as any Nabokovian knows, is a superb metaphor for life. As Marilyn Yalom illustrates in her fascinating new book, Birth of the Chess Queen, the metaphor works the other way as well. Yalom, a senior scholar at Stanford's Institute for Women and Gender and author of A History of the Wife and A History of the Breast, makes a convincing case that the queen's prominence reflects the evolution of the female in the Western world. In India, where the game began in the fifth century, there was no female piece, and most versions to be found in Muslim countries still have none.
The chess queen got her first break in 12th-century Spain, replacing the vizier, who represented the king's chief counselor in Eastern chess (apparently Spaniards noticed who got the most attention when she whispered in the king's ear). From Spain, the game moved to the South of France, where Eleanor of Aquitaine gave the chess queen her first real-life role model, epitomizing "the trappings of queenship that worked their way into the symbolic system on the chessboard."
The rise in the chess queen's power stirred an increased passion for chess in men. Chess became the common man's form of knightly combat. Men fought for their chess queens as nobles fought for their real-life counterparts. According to English legal documents, in 1251 and 1256 there were at least two "chess homicides" and several other chess-related brawls. The church reacted to the new sexual element in the game with alarm: In 1291 a prior and canon were condemned to bread and water for (as the record has it) "being led astray by an evilly-disposed person . . . who had actually taught them to play chess."
Other articles worth reading this Sunday include this Boston Globe ideas section article on whether innovative architecture leads to innovative science, and this Michael Dirda column that reviews Diarmaid MacCulloch's new book on the Reformation. (It sounds like fun.) Finally, those of you who miss the Invisible Adjunct may be interested in this Boston Globe article about her.
Posted by Ed
Comic books--or graphic novels, or whatever you want to call them--have never appealed to me much, however often I'm told that I'm underestimating them. (If I don't have time to read all the real books I want to look at, why should I spend my time on a bunch of silly little books with pictures?) Nevertheless, I'm weirdly intrigued by the idea behind Superman, Red Son, which is reviewed in today's Observer:
Imagine for a moment a wildly alternative twist to the Superman mythology. What if the infant from Krypton had landed not in Kansas, but in Ukraine? And what if that prodigious alien child was raised by collective farm workers whose values were truth, justice and something different from 'the American way'? How would the arrival of a superhuman being alter a supposedly egalitarian society and how would it shift the Cold War stalemate of two military superpowers? ('Let our enemies beware: there is only one superpower now...')
Would I read such a work? Well, probably not... The images of Superman, Red Son on this website don't especially appeal to me, for example, as neat as it is to see Superman conversing with Stalin. Even so, I guess I'm glad that someone got an idea like this, and in the unlikely event that I feel the need to look at a comic book, maybe I now know where to start.
Posted by Ed
In this week's Times Literary Supplement, A.N. Wilson reviews a new volume of the collected letters of C.S. Lewis. Here are some excerpts from Wilson's review:
For by reading the letters entire, we receive a piquant sense of how things have changed, not merely in academic life but in the Western world generally. One of Lewis’s most valuable contributions as a critic is to save readers from that crass error of taste, judging the past by the ephemeral values of the present. At his best, he persuades us not to be clumsy tourists who fail to learn the language, but patient listeners to the lost words, and lost world view, of the Middle Ages. Lewis’s generation was the last in which the sexes lived separate lives in England, in which men, with their heavy macs, their love of bars, mixed grills, heavy smoking and what he calls bawdy would, however much they liked their wives or girlfriends, prefer to spend much of their time segregated, in the work place, in clubs (working mens or gentlemen’s, it came to the same thing), even in churches. (In High Churches such as Lewis came to like, men and women sat on opposite sides of the aisle, even until the 1950s.) Nevertheless, even by the standards of his age, Lewis was surely quite extraordinarily male, pungently, bullishly and bullyingly so. Writing to Warnie about rereading Jane Eyre, a book which the notes helpfully remind us was written by Charlotte Brontë, Lewis says, “part of the interest lies in seeing in the most (apparently) preposterous male characters how quite ordinary people look through the eyes of a shy, naive, inflexibly upright, intelligent little woman of the mouse-like governessy type. It opens vistas – how you or I look to Maureen’s friend ‘Fuller’ or how we looked to Smudge . . .” .
The other bugbear, apart from women, is modernism, whether in literature, art or theology. Only four years after his full-scale conversion to Christianity, Lewis finds no place in his heart for his fellow Anglican T. S. Eliot (whom the faithful Walter makes into Elliot on page 94). Eliot’s work is seen as “a very great evil”, and The Waste Land is dismissed as “pornography”. The paranoia does not stop there. Many of us see the arrival of T. S. Eliot in England, to study the philosophy of F. H. Bradley, as a boon to English literature. For Lewis, it seems more like the infiltration of some modernistic fifth column.
"Eliot stole upon us, a foreigner and a neutral, while we were at war – obtained, I have my wonders how, a job in the Bank of England – and became (am I wrong) the advance guard of the invasion since carried out by his natural friends and allies, the Steins and Pounds and hoc genus omne, the Parisian riff-raff of denationalised Irishmen and Americans who have perhaps given Western Europe her death wound."
It would be churlish not to record that these letters also contain passages which remind us of that C. S. Lewis whose criticism is so infectious, reflecting as it does such combined powers of intelligence, observation and ear. (See his letters describing the virtues of Coventry Patmore, Dante, Charles Williams or P. G. Wodehouse.) We should also be prepared to take his own advice – “Memo: to read all collections of letters in the light of the fact that a letter writer tends to pick out what is piquant, or unusual. He may tell no lies: but his life is never as odd, either for good or ill, as it sounds in the letters”.
Posted by Ed
Some links I've found interesting of late:
Perlstein's article is especially good. (It even mentions a seminar on Bush's theology by a University of Chicago professor...) Check it out!
Posted by Ed
The New York Times, it seems, is shutting down its "arts and ideas" section. A New York Observer article describes this decision, and Scott McLemee gives his own reaction. The section was often lame, he says, and its approach was questionable:
One aspect of the problem is what I tend to think of as "news desk epistemology," which holds that, in principle, any reporter can "cover" any given development using certain techniques that are more or less equally valid, or at least transposable, from one situation to the next. Whether it is a philosophical debate or the election of a dogcatcher, the journalist brings the same tool-kit to the task of reporting it.
Now, there are things about this approach that I do like, at least in the abstract. But it creates all kinds of problems when the "beat" is something less tangible than a war, a scandal, or some other event that is simply "out there," happening in the world, in some relatively unproblematic way.
For one thing, it means that there will be a very strong tendency for cultural coverage to be influenced by the work of publicists. It also means that any given development will be framed in terms of conflict and/or "the hot new trend." (How much do I loathe that expression? More than words can express.) The cumulative effect is stupid-making.
Posted by Ed
This week's London Review of Books features an article by my adviser, discussing a new translation of a diary written by a Soviet schoolgirl between 1932 and 1937. It's a fascinating story about a Soviet teenager who recorded her impressions of her life under Stalin and ended up in the gulag.
Posted by Ed
In recent weeks, I've begun to wonder whether we could witness a landslide Kerry victory in November. I know, I know--this theory seems unlikely, especially since some recent polls have suggested that Bush's position is getting stronger. But I can't help but wonder if this election will be like the 1980 election, in which Reagan and Carter were essentially tied in the polls on Labor Day, only to see Reagan pull ahead in the final weeks to win the election by 10 points.
I'll readily admit that this hope is based in part on wishful thinking: every day, it seems, we get new evidence of the president's incompetence. If I had to bet on the outcome of the election, I'd bet on a narrow Bush victory. Nevertheless, I was intrigued to see this Washington Monthly article by the Hotline's Chuck Todd. His assessment:
There are perfectly understandable reasons why we expect 2004 to be close. Everyone remembers the nail-biting 2000 recount. A vast number of books and magazine articles describe the degree to which we are a 50/50 nation and detail the precarious balance between red and blue states. And poll after poll show the two candidates oscillating within a few percentage points of one another. There are also institutional factors that drive the presumption that the race will be tight. The press wants to cover a competitive horse-race. And the last thing either campaign wants to do is give its supporters any reason to be complacent and stay home on election day.
But there's another possibility, one only now being floated by a few political operatives: 2004 could be a decisive victory for Kerry. The reason to think so is historical. Elections that feature a sitting president tend to be referendums on the incumbent--and in recent elections, the incumbent has either won or lost by large electoral margins. If you look at key indicators beyond the neck-and-neck support for the two candidates in the polls--such as high turnout in the early Democratic primaries and the likelihood of a high turnout in November--it seems improbable that Bush will win big. More likely, it's going to be Kerry in a rout.
Posted by Matt
I am at the American Physical Society "April Meeting" (May 1 - 4) in Denver, Colorado. It's been very enjoyable. For one thing, just getting a chance to relax and spend time with some friends who are also physics students has been nice. We even managed to get $10 gift certificates to Amazon by easily winning a "physics trivia quiz" at the student reception tonight. In one case we even corrected the quiz writer. A question asked for the most recent Nobel prize given in particle physics, before the recent one given to Ray Davis, Masatoshi Koshiba, and Riccardo Giacconi. The answer they were looking for was the 1995 prize given to Perl and Reines for the tau lepton and the neutrino, respectively. We pointed out that 't Hooft and Veltman had won more recently, for showing that the Standard Model is renormalizable. It's fun to be able to combine my physics and trivia dorkiness, especially when I get prizes out of it.
This conference has given me a good chance to learn a bit more about things that are going on in string theory. My knowledge of string theory is rather limited, but I have enough of the basics down to have an intuitive picture and follow along in talks, even if I don't get the details. At the DPF prize session, after Peter Onyisi gave his talk on his work with CDF that won him a well-deserved Apker Award, there were two good string theory talks aimed at a broad audience. Veneziano gave a very nice talk on string theory from a historical perspective, and Maldacena talked on the AdS/CFT correspondence.
What interested me more on the string theory front, though, were talks on recent ideas. One such development has been the discussion of the "string theory landscape." Roughly, the idea is that string theory is not enough to specify a complete theory; you have to specify a "vacuum" to expand around. For instance, some vacua might correspond to flat space, others to inflationary universes, and so on. In what has become referred to as the "KKLT" paper (for its authors, Kachru, Kallosh, Linde, and Trivedi), a model of a metastable de Sitter vacuum in string theory was proposed. This would be a universe that expands as ours seems to. Its "metastability" means that it is not absolutely stable: at some point, the universe would tunnel to a lower vacuum state. This would be catastrophic, but you don't really need to worry; even if the idea is correct, the lifetime of the universe would far outlast humanity. Giddings has argued (e.g., here) that this idea is in fact more generic, and that in any theory with extra dimensions (as string theory, for instance, says we should have) and a positive cosmological constant (as experiment tells us we have), our four-dimensional universe will be catastrophically unstable (again, nothing to worry about). Here in Denver I got to hear Giddings speak about this "landscape" picture, and also Tom Banks speak about his own views. (Mostly he focused on other topics, but he briefly criticized the landscape picture.)
Most of the controversy revolves around the idea that there would be very many of these de Sitter vacua, so that string theory would not give unique predictions for physics. Instead one might have to turn to the anthropic principle to explain why physical parameters have the values that they do. Tonight another paper was posted on this topic: in it, Daniel Robbins and Savdeep Sethi of Chicago argue that the metastable de Sitter vacua are not as generic as others have proposed. They make this argument by using nonperturbative ideas in string theory rather than the simple effective field theory pictures that others have used. Explaining more of the technical details would only serve to make plain my own lack of sufficient technical sophistication. It's a very interesting debate, though, and it's fun to try to figure out what is going on. Over the next year, as I study string theory more seriously, hopefully I'll have more to say about it.
Aside from those things, I have met some interesting people. I briefly got to chat with Juan Maldacena, who developed the idea of the AdS/CFT correspondence, roughly that 5-dimensional quantum gravity (i.e., a string theory) on a certain space is equivalent to a well-understood 4-dimensional quantum field theory (i.e., a theory of point particles) on the boundary of the 5-d space. This is a concrete realization of the interesting idea of "holography." It has also inspired a lot of phenomenological ideas, in which people approach "technicolor"-like models (proposing that strongly interacting physics at a certain energy scale [which will soon be probed by the Large Hadron Collider in Switzerland] will explain certain riddles in the Standard Model) by considering their duals as "brane-world" scenarios in 5 dimensions. I told Maldacena that I'm interested in such phenomenological ideas, and he sounded rather skeptical of them, but I didn't get a chance to ask him why.
There was also a nice public lecture by John Bahcall on the solar neutrino problem. I should try to make a post explaining it along similar lines sometime. Tomorrow morning I might get to hear some phenomenology lectures before I run to catch my flight back to Chicago (just in time for Scavhunt).
Also while here, I saw Kill Bill 2, which I enjoyed, and I got a chance to do some CD shopping. I only regret that I didn't have a chance to get away to the mountains for some hiking; I did get a nice view of them from the 22nd floor of the hotel, but that wasn't so satisfying.
Posted by Ed
In yesterday's New York Times, Harold Varmus (a former head of the National Institutes of Health) looks at the science in the new movie Godsend. The process of cloning, he suggests, is "vividly portrayed and quite accurate, except for the use of donor cells from someone who has been dead for a couple of days," but that doesn't mean that Varmus was impressed by the movie:
Ultimately the film is not really about the physical or ethical dangers of cloning; it asks us to experience the sensations of horror on classic premises: human perversity and scientific implausibility. The evil that produces the horror-film atmospherics is traced to a conventional villain: the mad, bad scientist, Dr. Wells, who rolls steel balls between his fingers like Captain Queeg of the Caine. Moreover, the consequences of the doctor's evil act — the insertion of Zachary's DNA into Adam's clone (turning A to Z) — are based on biological thinking much less plausible than a prediction that reproductive cloning will someday work efficiently and safely in humans. In fact, the far-fetched biological premise on which the horror finally hangs is only tangentially related to cloning: memory is presumed to be stored in the DNA of brain cells and to be transferable from one individual to another by injecting DNA fragments into an early (and, in this case, cloned) embryo.
To set the stage for this preposterous proposal, Dr. Wells tells the Duncans about an "urban legend" — that rats can learn to run a maze by eating brains of maze-trained rats, thereby acquiring the DNA-imprinted memories of the educated animals. (Although the molecular basis of memory is not yet established, it is virtually certain to be recorded, not permanently in DNA but instead in less stable changes, like those involving the chemistry or folding of proteins.) We are then asked to accept another completely arbitrary idea, that the externally provided memory DNA kicks in, inciting nightmares, flashbacks and behavior patterns that belonged to Zachary, only when Adam No. 2 has passed the death day of Adam No. 1. Now we are in the realm of spooky music; the sudden appearance of hands wielding hammers; shower curtains ruffled by ghost-like apparitions; and scary moments in the cellar of that large new house or in the traditional horror chamber, the abandoned shed.
Posted by Ed
If you're like me, and you're both interested in history and oddly intrigued by questions of proper style and word usage, you may enjoy reading this New Yorker "Talk of the Town" piece. It seems that The New York Times has abandoned its long-standing policy of preventing writers from describing the Armenian massacres of 1915 as "genocide":
Reporters at the paper have used considerable ingenuity to avoid the word (“Turkish massacres of Armenians in 1915,” “the tragedy”) and have sometimes added evenhanded explanations that pleased many Turks but drove Armenian readers to distraction: “Armenians say vast numbers of their countrymen were massacred. The Turks argue that the killings occurred in partisan fighting as the Ottoman Empire collapsed.”
Posted by Ed
The most terrifying theme park in the world, I've decided, might just be Dinosaur Adventure Land, a creationist theme park that invites children to "discover the truth about dinosaurs." Today's New York Times describes the park and its founder, portraying Dinosaur Adventure Land as a kind of boring place:
At Dinosaur Adventure Land, visitors can make their own Grand Canyon replica with sand and read a sign deriding textbooks for teaching that the Colorado River formed the canyon over millions of years: "This is clearly not possible. The top of the Grand Canyon is 4,000 feet higher than where the river enters the canyon! Rivers do not flow up hill!"
There is a movie depicting the creation, the flood and the fall of man, which fast-forwards from a lush Garden of Eden to a New York City traffic jam.
There are no mechanized rides at Dinosaur Adventure Land — no creationist-themed roller coasters, scramblers or even a ferris wheel — but instead, a simple discovery center and museum and about a dozen outdoor games, each of which has a "science lesson" and "spiritual lesson" posted nearby. A group of about 60 parents and home-schooled children who visited Wednesday, including the Passmores, spent all afternoon trying the games, which promote religious faith more than creationist tenets.
Take Jumpasaurus, which involves jumping on a trampoline while trying to throw a ball through a hoop as many times as possible in a minute. The science lesson: "You will use coordination in this game, which means you will be doing more than one thing at once." The spiritual lesson, according to Mr. Johnson: "You need to learn to be coordinated for Jesus Christ so you can get more things done for him."
Update: P.Z. Myers links to the same article at Pharyngula. He describes the piece as "a pandering bit of fluff" and notes that he read it "with stark disbelief."
The more I think about it, the weirder this article seems. It's debatable whether this is even a real theme park, after all: it's not exactly loaded with exciting rides and fun activities. When I first looked at the article, I noticed some of the more ridiculous highlights (like the Jumpasaurus "science" lesson and the Grand Canyon propaganda) and assumed that one of the main points of the article was to highlight the park's silliness; but as Myers points out, there's just "one brief expression of a mildly contrary opinion in the whole thing." Even the tax fraud charges against the park's founder are downplayed... Perhaps I should have read more carefully before blogging about it.
Part of me wonders whether the article's author knew just how silly the park is and didn't think she needed to highlight its ridiculousness. (Perhaps she assumed that readers knew the context of parks like this and understood the anti-science extremism of plenty of creationists.) It would have been nice if an editor had told her to put some more criticism into the article, though.