Posted by Ed
Over the last week, I've spent several of my spare moments reading Chaos of Disciplines, a book by a University of Chicago sociologist named Andrew Abbott. This is the sort of book that makes me feel really ignorant of science, since my knowledge of fields like chaos theory is quite shallow: Abbott uses fractals to explain the patterns that appear in academic disciplines and to discuss the ways that academic knowledge changes and advances. I've found the book really stimulating, but I can't decide if it's brilliant, eccentric, or a combination of the two. (It helps that I haven't finished reading it yet!)
For those of you who are curious, here's an excerpt from the book's prologue that will give you some sense of what Chaos of Disciplines is like:
Every spring the MCAT examinations select the medical elite from among the upper extreme of the college population in terms of scientific and rational abilities and attitudes. But three years later those selected will choose specialties ranging from psychiatry to family practice to cardiology, thereby replicating within the compass of medicine the entire humanistic-rationalistic scale that the MCAT defines on the college population as a whole.
On the other side of the world, the great hierarchy of the caste system relegates certain groups so firmly to the bottom as to exclude them from the four varnas altogether. Yet among those excluded harijans, an internal hierarchy exactly replicates the much larger one that places them beneath all the caste Hindus.
Those two social structures have a peculiar property in common: the property of self-similarity. No matter what the level at which we inspect them, we find the same pattern repeated. Nor is this simply a matter of looking at a linear scale in progressively finer detail. The world of medicine covers much more than just those in the upper extreme of scientism, just as some harijans enjoy substantial authority and power in their daily existence. These are truly self-similar structures, in which fine detail recapitulates gross structure.
A similar pattern emerges in cultural systems. At any given time the avant-garde of art is itself broken up into a thousand little cells, each imagining itself to be the true avant-avant-garde, leading those who will lead the general art public. Similarly, just as psychiatric practice offers a category system explaining everyday life to the average denizens of modernity, so does psychoanalysis offer an explanatory system to the psychiatrists who treat everyman. Or, to come suddenly close to home, if we take any group of sociologists and lock them in a room, they will argue and at once differentiate themselves into positivists and interpretivists. But if we separate those two groups and lock them in separate rooms, those two groups will each in turn divide themselves over the same issue.
Thus cultural structures too may have the characteristic of self-similarity. In the book that follows, I apply this argument about cultural self-similarity to a particular example, holding that self-similarity provides a general account of how knowledge actually changes in social science.
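Abbott's recursive-splitting image (any group divided along one axis reproduces the same division inside each half) is easy to make concrete in code. Here's a toy sketch of my own, not anything from the book, labeling each subgroup by its path of P(ositivist)/I(nterpretivist) choices:

```python
# Toy sketch of Abbott-style fractal splitting: a group divided along one
# axis (positivist vs. interpretivist) reproduces that same division
# inside each resulting subgroup, recursively.

def split(group, depth):
    """Recursively divide a group along the same axis, labeling each
    subgroup by its path of choices (P = positivist, I = interpretivist)."""
    if depth == 0:
        return [group]
    return split(group + "P", depth - 1) + split(group + "I", depth - 1)

# Three rounds of locking sociologists in ever-smaller rooms:
print(split("", 3))
# → ['PPP', 'PPI', 'PIP', 'PII', 'IPP', 'IPI', 'IIP', 'III']
```

The point of the illustration is self-similarity: at every depth, each subgroup contains the same P/I division as the whole.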
Posted by Ed
Here are two interesting articles from the Boston press:
Posted by Ed
There was a time when the study of Soviet politics--or Kremlinology--was best known for its analysis of seemingly trivial signs that change was afoot in the U.S.S.R. (Who was standing closest to Leonid Brezhnev in a picture? How prominent was the mention of some minor party functionary?) I was reminded of those days earlier this morning, when I came across a delightful article in the Style section of The Washington Post:
North Korean ruler Kim Jong Il's surprise summit in China last week took the top off of at least one of the Pyongyang government's best-kept secrets: The Dear Leader is losing his famed big hair.
Indeed, the familiar profile of Kim, 62, has long been characterized by his luxuriously piled bouffant. But in a photo-op during his trip to Beijing that ended last week, Kim turned his back to the rolling cameras in a fateful moment to embrace Chinese leaders. Then, boom, it came into focus -- the shiny patches of the Dear Scalp glistening between strategically combed curls.
That this image of Kim's thinning hair -- quickly scooped up by international media outlets -- got past Chinese censors appeared to be a humorous coincidence at the least. At the most, it showed a Chinese lack of sensitivity to the role of Kim's hair in North Korea's body politic.
For any other totalitarian leader, a collapsing coiffure might raise nary an eyebrow. Who, after all, would have noticed if Pol Pot or Augusto Pinochet needed a little Rogaine? But Kim's high and mighty mane, teased into a mushroom cloud and appearing capable of doing equal damage, has become the defining symbol of his unique dictatorial style. Here is a despot who has ruled not only with an iron fist, but with dynamic hair.
"He may be the Dear Leader, but he is not such a tall man," said Nam Sung Wook, professor of North Koreanology at Seoul's Korea University. "So he needs to look bigger, look greater, so the people in North Korea and the world will know his true stature. He does that with the hair."
My favorite part of the article came later, however:
Kim once lobbed a test missile over Japan in 1998. But the North Korean leader has nevertheless become an underground fashion icon in Tokyo's teenage subculture. Various Web sites in Japan and around the world celebrate Kim's hair. One titled "Kim Jong Il's Fan Club" shows a disco-dancing Kim, hair stretching toward the heavens. In specialty stores selling North Korean memorabilia, lapel pins of Kim Jong Il spirited out of North Korea outsell those of Kim Il Sung. Especially sought after are the pins showing an extra-poufy Kim Jong Il, which sell for more than $80 each.
A comic book about the life of Kim Jong Il, first published in South Korea, has sold more than 500,000 copies in Japan. The book includes popular close-ups of Kim and his ever-expanding hair. In its sequel, Kim is a tights-wearing, big-haired "Superman of Darkness."
Posted by Ed
Today's New York Times features a profile of Edward Rutherfurd, the English historical novelist who's just published a new book spanning 1,100 years of Irish history.
Here's Rutherfurd's take on history:
Mr. Rutherfurd, who is 56, acknowledges Michener as the inventor of the form, a kind of docu-genre. "It's an odd sort of hybrid," he said during a recent visit to New York from his home in Dublin, "a curious region between fiction and nonfiction." Major events and historical characters are real; other characters are imaginary.
Mr. Rutherfurd is scrupulous about accuracy and about avoiding anachronisms. His books are heavily researched and, chapter by chapter, he says, he checks his facts with historians, although by now he could have some claim to that role.
He is candid about his craft: "I think I'm a popularizer of history, a commercial novelist, and I don't think anything is wrong with that." But, he said, he feels an "ethical responsibility not to mislead" the reader.
"I won't cheat on history," he said.
For all his emphasis on research, he retains a certain skepticism about history. He recalled an incident from his school days. Two teachers suddenly began fighting in a classroom as 22 boys looked on "in absolute horror." Then the fighting stopped and one of the teachers asked the class to write down what had happened.
"There were 22 conflicting stories," he said. "The moral is, historical truth doesn't exist." Facing historical uncertainty, he will have two fictional characters disagree about events and offer alternative views. Through his fiction Mr. Rutherfurd can arrive at an imaginative approximation of the truth.
It's great that Rutherfurd has been able to tell his readers about the past, and some of the lessons he offers seem fascinating and potentially important. The profile discusses the prevalence of slavery in early Ireland, the high frequency of divorce, and the common notion of trial marriages, for example. ("Puritanical Irish society is really a 19th-century phenomenon," Rutherfurd says--an important lesson for readers.) But Rutherfurd also says that he was surprised to learn that St. Patrick didn't really rid Ireland of snakes. Was he really this naive before beginning his research, or was this just the sort of silly claim you make to get a reporter to shut up?
Posted by Ed
Then go to humanities grad school! That's what this Village Voice article says:
Here's an exciting career opportunity you won't see in the classified ads. For the first six to 10 years, it pays less than $20,000 and demands superhuman levels of commitment in a Dickensian environment. Forget about marriage, a mortgage, or even Thanksgiving dinners, as the focus of your entire life narrows to the production, to exacting specifications, of a 300-page document less than a dozen people will read. Then it's time for advancement: Apply to 50 far-flung, undesirable locations, with a 30 to 40 percent chance of being offered any position at all. You may end up living 100 miles from your spouse and commuting to three different work locations a week. You may end up $50,000 in debt, with no health insurance, feeding your kids with food stamps. If you are the luckiest out of every five entrants, you may win the profession's ultimate prize: A comfortable middle-class job, for the rest of your life, with summers off.
The article briefly discusses the case of the Invisible Adjunct, which was also described in this much-discussed Chronicle of Higher Education article.
Posted by Susan
(I promise this entry is less depressing than the title sounds.)
Over the past week, I've heard a few lectures from a researcher at Chicago (Mark Lingen) who's studying tumor angiogenesis--that is, the process by which tumors create blood vessel networks for themselves. For many years, angiogenesis was something of the redheaded stepchild of cancer biology--Judah Folkman, the "father of angiogenesis", worked on it for decades before being taken seriously (there's an interesting Nova program on Folkman's work here; among other things, it answers the eternal question of, "How can I obtain a large amount of mouse urine to use in protein purification?"*). Angiogenesis has become quite the hot field; there are a few dozen anti-angiogenic drugs in clinical trials, and one (Genentech's Avastin) has recently been approved by the FDA as a treatment for colon cancer after failing to work against breast cancer in 2002.
Targeting angiogenesis is radically different from conventional chemotherapy and represents a changing view of the curability of cancer that may completely revolutionize cancer treatment.
Most conventional chemotherapies are cytotoxic--that is, they kill cells, especially rapidly dividing cells. They affect cancer cells more than most healthy cells because the cancer cells divide more; however, healthy tissues that normally undergo a lot of cell division (such as bone marrow) are severely damaged by conventional chemotherapy. This is just from the intended effect of the drug; most chemotherapies also have severe side effects (ototoxicity from cisplatin, neurotoxicity and urotoxicity from ifosfamide, etc.) that are due to toxic metabolites of the drugs. The normal dosing schedule for conventional chemotherapy includes a period of very high drug dosing (in an attempt to obliterate the cancer) followed by a "holiday" to give bone marrow and other tissues a chance to recover from the damage.
By contrast, good anti-angiogenic drugs** are not terribly toxic to the patient. As you might expect, there are minor effects upon wound healing. I expect inhibition of angiogenesis would affect the female reproductive system, but I can't find any data on this presently. Also, the dosing schedule for anti-angiogenics is radically different from that of normal cancer drugs. It's called metronomic dosing; patients take a small amount of drug every day. No holidays. This should make sense intuitively: for anti-angiogenic therapy to work, it has to prevent the tumor from obtaining a blood supply by constantly preventing the growth of new blood vessels to the tumor. Using the "pound the hell out of it, then give the patient a holiday" schedule would allow the blood vessels to grow back during the holiday.
The big difference between anti-angiogenic therapy and regular cancer therapy is that the latter is cytotoxic whereas the former is cytostatic. Cytostatic therapy doesn't kill cells in the same way that cytotoxic therapy does; rather, it prevents them from growing and dividing. So what's the point of cytostatic therapy? On one hand, it might be useful as an adjuvant therapy to normal cytotoxic therapy (though there are problems with this idea***). However, the revolutionary idea is that anti-angiogenics can be used as a long-term tolerable therapy to allow patients to live with cancer. This is a huge shift from normal ideas about cancer treatment. We're no longer talking about curing cancer but about taming it, turning it into a disease that can be managed for years--even decades--with drugs. It's not unlike the paradigm shift**** in AIDS therapy, and I'd venture to say that the saga of AIDS therapy will be helpful in identifying potential pitfalls in long-term cancer therapy.
An even more radical idea is the use of such drugs for long-term chemopreventative therapy; for example, people at high risk for head and neck cancer could take EGFR inhibitors even before they developed cancer, to keep any cancers that do arise from initiating angiogenesis. Head and neck cancer is a particularly good candidate for this sort of scheme, as these extremely aggressive cancers tend to activate angiogenesis very early in their development. Another example of chemopreventative therapy that I've heard of is the idea of using tamoxifen, raloxifene, or other synthetic anti-estrogens to reduce the risk of breast cancer in high-risk individuals.
Here are a few potential pitfalls to think about:
Different expectations of cancer drugs
I guess this is a bit obvious; a drug that you take for a short time to cure a disease is going to be different from a drug that you take for thirty years to manage a disease. Take the chemopreventative use of tamoxifen for breast cancer as an example. Tamoxifen acts as an anti-estrogen in breast tissue; it binds to the estrogen receptor and prevents it from being activated by estrogen. However, tamoxifen acts as a synthetic estrogen in other body tissues, like bone and uterine tissue. There is a (very small) increase in the risk of developing uterine cancer among women who are treated with tamoxifen. In the context of breast cancer therapy, this risk is negligible--the benefits of tamoxifen in curbing the breast cancer far outweigh the small risk of uterine cancer. However, in this scenario, we're talking about maybe ten years of tamoxifen therapy at most (women frequently take tamoxifen for the first five years that they're in remission). What happens when it's thirty years? What happens when the woman doesn't already have cancer? Is the risk of uterine cancer larger or more important in that case?
Another concern along these lines is that most anti-cancer agents affect fertility. Anti-angiogenics are especially bad in this respect--two drugs with anti-angiogenic effect, retinoic acid and thalidomide, are best known for their ties to birth defects. When you actually have cancer, childbearing isn't as much of a concern. However, if you're going to be on a chemopreventative or long-term chemotherapeutic agent for thirty years or so, the circumstances are quite different.
Are at-risk groups differentially sensitive to the chemopreventative agent?
As you probably know, cancers have a number of environmental and genetic causes. Obesity and smoking will increase your risk of practically every cancer. There are some concerns about how one identifies at-risk groups to be treated with chemoprevention. Some easily identified at-risk groups for specific cancers--for example, women who carry mutations in the tumor suppressor gene BRCA1, who are at a much higher risk of developing breast and ovarian cancers--have different treatment requirements than most people with that cancer. In the case of breast cancer, BRCA1-positive (i.e. mutant) women are significantly less likely to respond to tamoxifen therapy than women with mutations in BRCA2 or women with spontaneous cancer, because their tumors are less reliant upon estrogen. On the other hand, people who face an increased environmental risk of cancer--such as users of chewing tobacco, who have an increased risk of oral cancer--would be much better candidates for chemopreventative trials, as their cancers are unlikely to differ from the average cancer of that type.
Taking chemotherapy home--patient compliance
Presently, chemotherapy is administered by hospitals. This is obviously impractical for lifetime therapy. If long-term chemotherapy becomes a real method of treatment, doctors are going to have to impress the importance of drug compliance upon their patients. This is especially true of anti-angiogenics, which require constant dosing to effectively prevent blood vessel growth.
Cost is going to be an issue. Cancer treatment isn't cheap now--lifetime therapy could be staggeringly expensive.
It's an interesting prospect. I'm especially curious to see how the field of chemopreventative agents progresses; it'd be really nice if we moved beyond exhortations to eat tomatoes to prevent prostate cancer or whatever. I guess realistically the best chemoprevention is weight loss and not smoking (with limiting sun exposure, not drinking rotgut whiskey, not living in Love Canal, and not getting HPV somewhere behind there). It is pretty cool to hear about clever drug therapies, though. I'll close with one of my favorites--Charlie Rudin (formerly of Chicago, now of Hopkins) is involved with the development of an attenuated adenovirus that is selectively cytotoxic to cells deficient in the tumor suppressor gene product p53 (i.e. cancerous or precancerous cells). His group is specifically targeting oral cancer, and the virus is administered as a mouthwash. How cool is that?*****
*Answer: feed mice unlimited sugar water and they'll produce their body's weight in urine every day.
**There exist bad, or at least less good, anti-angiogenic drugs. Thalidomide is a good example--it's a small-molecule inhibitor of angiogenesis (molecular mechanism unclear) that also causes immune system depression (through interactions with the NFkB pathway), neuropathy, and horrible birth defects. On the up side, it prevents nausea and is effective against leprosy. Very interesting drug.
***Anti-angiogenics seem to work synergistically with radiotherapy (RT); the idea is that the anti-angiogenic therapy, if given shortly before RT, can stabilize the leaky, abnormal tumor-induced vessels that feed the tumor and make the tumor more oxygenated. Oxygen-rich tissue is the ideal target for RT, which exerts its cytotoxic effects through the reactive oxygen species it creates in damaged cells. However, the idea of combining anti-angiogenics with cytotoxic chemotherapy has some problems; most importantly, many cytotoxic agents target DNA and work best in rapidly dividing cells. If you halt cell division with anti-angiogenics, the cytotoxics won't work as well.
****I hate you, Thomas Kuhn.
*****Answer: SO COOL.
Posted by Ed
Over the weekend, The New York Times published an entertaining article on a recent study showing--its authors believe--that poets don't live as long as novelists or nonfiction writers:
Overall, poets lived an average of 62.2 years, compared with nonfiction writers, who lived the longest at 67.9 years. Playwrights lived an average of 63.4 years; novelists, 66 years. The differences between poetry and prose were pronounced among Americans, where poets lived an average of 66.2 years, and nonfiction writers lived an average of 72.7 years.
"The image of the writer as a doomed and sometimes tragic figure, bound to die young, can be backed up by research," Mr. Kaufman wrote in his study, "The Cost of the Muse: Poets Die Young," published in the journal Death Studies in November 2003.
I'll refrain from posting my own random musings on this question. Suffice it to say that chemists and historians seem especially unlikely to be happy in graduate school (and especially likely to work really hard); I've known plenty of English, math, and physics grad students who seemed to have lots of free time on their hands. The cynical part of me wonders whether, when you add in the pressures of the job market, historians might rank low on a list of life expectancies for academics... Then again, the skeptical part of me isn't necessarily convinced by the study I've linked to above, and I'm sure that practically all grad students like to complain about their lot in life.
Posted by Ed
Question of the day: Are malice and spitefulness always bad? Consider the following quotation: "This is the first indication that I have ever had that there is a God." Those were the words of Supreme Court Justice Felix Frankfurter, on the September 1953 death of Chief Justice Fred Vinson. Vinson, a staunch segregationist, was expected to rule against integration in the case of Brown v. Board of Education, but his successor, Earl Warren, engineered a unanimous decision overturning the court's 1896 ruling in Plessy v. Ferguson.
That quotation was one of the more fascinating tidbits in Cass Sunstein's New Yorker article on the fiftieth anniversary of the Brown decision. The article as a whole isn't bad.
Posted by Ed
Certain novels have the power to reduce their reviewers to feats of unimaginative musing--driving original thoughts from their minds and making them all sound the same. Carlos Ruiz Zafón's Shadow of the Wind is a case in point.
From Michael Dirda's review in The Washington Post:
Critics describing a new novel will sometimes resort to a particularly seductive formula: "If Judith Krantz had written Ulysses . . ." or "Half Georgette Heyer, half H.P. Lovecraft," or "If you enjoyed A Dog of Flanders, you'll just purr over The Cat's Pajamas." This is a seductive formula because it's easy to use (too easy, most of the time) and because it can quickly convey something of the range and complexity of a new book without going into a lot of detail.
But such shortcuts also remind us that novels, like most literature, build on earlier books as much as they do on life or on a writer's personal traumas. Indeed, one loose definition of modernism might be writing that is actually rewriting.
The Shadow of the Wind provokes such thoughts because it is a long novel that will remind readers of a good many other novels. This isn't meant as criticism but as an indication of the story's richness and architectonic intricacy. Before everything else, Carlos Ruiz Zafón's European bestseller is a book about a mysterious book, and its even more mysterious author. Try to imagine a blend of Grand Guignol thriller, historical fiction, occasional farce, existential mystery and passionate love story; then double it. If that's too hard to do, let me put it another way: If you love A.S. Byatt's Possession, García Márquez's One Hundred Years of Solitude, the short stories of Borges, Umberto Eco's The Name of the Rose, Arturo Pérez-Reverte's The Club Dumas or Paul Auster's "New York" trilogy, not to mention Victor Hugo's Hunchback of Notre Dame and William Hjortsberg's Falling Angel, then you will love The Shadow of the Wind.
When book publicists try to dress up their product in designer clothes, they reach for the verb ''meets.'' Made-up instances: ''John le Carré meets Dostoyevsky,'' for a thriller with metaphysical ambitions. ''P. G. Wodehouse meets Sophocles,'' for a tragicomedy set during an English country weekend. It's lowdown and lazy, but here goes: ''Gabriel García Márquez meets Umberto Eco meets Jorge Luis Borges'' for a sprawling magic show, exasperatingly tricky and mostly wonderful, by the Spanish novelist Carlos Ruiz Zafón. The three illustrious meeters must surely have been drinking and they weave about a little, but steady remarkably as the pages go by.
The Shadow of the Wind has apparently been a literary sensation and a runaway bestseller in Spain (and in much of Europe). An English translation of the novel has just been released in America; the translation was written (I was intrigued to learn) by the daughter of Robert Graves.
I've been spending the afternoon sitting in the computer lab of the Chicago science library, reading Ruiz Zafón's novel, leafing through Habermas's Structural Transformation of the Public Sphere, and writing some of my thoughts on the theoretical problems associated with my dissertation. There are worse ways to spend a Saturday afternoon...
Posted by Ed
Jonathan Chait has an entertaining article in the current Atlantic Monthly, criticizing the country's political fixation on the sunny side of life. Candidates for office--especially conservative candidates--often compete to portray themselves as the most optimistic politician in the field, and journalists frequently repeat the cliche that the most optimistic candidate almost always wins the election, conveniently ignoring the fact that it's easier to be optimistic when you're ahead in the polls.
Chait does a very nice job refuting these claims. Here's his take on one recent study on the question:
The best-known empirical basis for the claim that optimists win elections is a study conducted by Martin Seligman, a professor of psychology at the University of Pennsylvania, with his graduate student Harold Zullow. Seligman helped create a system—called "content analysis of verbatim explanations," or CAVE—to measure the level of optimism in written or spoken statements. In 1987 he and Zullow applied CAVE to every nomination acceptance speech by a major-party candidate since 1948, masking whether the winning or the losing candidate had uttered it. The researchers found that the more optimistic candidate had won every election but one. (The exception was in 1968, when the "Happy Warrior" Hubert H. Humphrey came roaring from behind, only to lose narrowly to Richard Nixon.) They later examined the stump speeches of the 1988 primary candidates and correctly predicted that George Bush and Michael Dukakis would win their parties' nominations.
This research attracted the attention of The New York Times, which asked Seligman and Zullow to predict the winner of the general election. Analyzing the two acceptance speeches, they predicted that Dukakis would win, by six or seven percentage points. Oh, well.
Posted by Ed
A recent review in The Telegraph discusses a "rare and astonishing book": The Russian Criminal Tattoo Encyclopaedia. Here's a quick description:
The main section of the book is made up of Danzig Baldaev's drawings and interpretations of tattoos collected during his life as a prison attendant. The son of an eminent Buryat ethnographer who was imprisoned twice and finally died in Stalin's labour camps, Baldaev grew up in an NKVD orphanage for children of "enemies of the people". On reaching adulthood he was sent to work in the camps himself. This was a common practice: the heavily indoctrinated orphans made loyal prison guards.
Baldaev, however, saw his father again before his death and showed him some of his tattoo pictures. "My son," his father told him, "collect the tattoos, the convicts' folklore, or it will all go to the grave with them."
Posted by Ed
Today's New York Times features a fun article by Charles McGrath on F. Scott Fitzgerald's unhappy stay in Hollywood in the late 1930s. Fitzgerald was one of many famous writers who moved to California in search of easy money; he produced a mountain of scripts without ever managing to make it big in the movie business.
My favorite detail from the article: Fitzgerald worked as a rewrite man on the script for Gone with the Wind for about a week, during which "he was forbidden to use any words that did not appear in Margaret Mitchell's text." He ended up being fired.
Posted by Ed
Norris McWhirter, a co-founder of the Guinness Book of World Records, has died at 78. Here's the story of how the book first came about:
The idea for The Guinness Book of Records came during a shooting party in Ireland in 1951. When Sir Hugh Beaver, managing director of Arthur Guinness, Son & Co. Ltd., brewer of Guinness stout, and his friends failed to down a flock of golden plovers, they argued about whether it or the red grouse was Europe's fastest bird.
The answer turned out not to be in encyclopedias, but the McWhirters happily supplied it: grouse, 58 to 63 miles an hour, and plover, 50 to 55 miles an hour. Sir Hugh realized the utility of a reference book — with his company's name on it, of course — to settle similar arguments in the 84,000 pubs of Britain and Ireland. Within six months of publication, it was No. 1 on Britain's best-seller list.
Norris had found an avocation as well as a highly remunerative vocation. A typical vacation was to the Atacama desert in Chile, rainless for 400 years and the driest place in the world, at least according to Guinness. On a trip to Japan in the early 80's, he delighted in meeting what was then thought to be the oldest man in the world, Shigechiyo Izumi, believed to be 116. He lived to be 120 — or perhaps only 105, later research suggested.
One of McWhirter's perennial targets was subliminal advertising. In 1970 he demanded that the Director of Public Prosecutions investigate the Labour Party for repeatedly flashing the message "Labour tomorrow" for 0.04 of a second during an election broadcast. In 1985 he unsuccessfully sued the Independent Broadcasting Authority after Spitting Image briefly transmitted his face attached to the body of a naked woman.
Update: The biologist John Maynard Smith has also died.
Posted by Ed
Today's the sort of day where I'm having trouble getting any of my thoughts down on paper in a way that pleases me. In other words, it's a good day to link to Joseph Epstein's latest article from Commentary, in which he discusses a new book about the neuroscience of literary creativity. You may have seen it linked to at other sites, but it's a fun read if you get the chance. (It's more effective in discussing Epstein's view of writing than in delving into the neuroscience presented by the book, however.)
Posted by Matt
As promised long ago, here's another post to help provide some physics background for those of you who are not physicists. I'll briefly summarize the structure of the Standard Model of particle physics. This will hopefully make some of the other things I want to write about later more comprehensible.
In physics, we know of four basic "forces" in nature: electromagnetism, the weak force, the strong force, and gravity. Gravity is the odd one out, and I won't elaborate on it now.
Read on for further background:
The other three forces are all carried by particles of spin 1; you may have read popular science books that describe how one can think of the forces as being transmitted by messenger particles. These spin 1 particles are called "gauge bosons." In general, a "boson" is a particle with integer (i.e., whole number) spin (0, 1, 2, and so on). If you're unfamiliar with the concept of spin, that definition won't help much, but one thing that's easy to convey without mathematics is the "personality" of bosons: they like to cling together in groups. There's no limit to how many of them can sit in a particular state, so they will tend to all pile in together. This is as opposed to fermions, particles of half-integer spin (1/2, 3/2, and so on), which obey the Pauli exclusion principle: no two of them can be in the same state. Think of bosons as gregarious and popular, and fermions as loners. It may seem reasonable that no two things can be in the same place at the same time, but from a mathematical standpoint, fermions are weirder: they correspond to fields that are zero when squared, but are not themselves zero.
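For readers who want the "zero when squared" remark made concrete, here's a bit of standard notation (my addition, not part of the original post). Bosons can pile into a state without limit, fermions can't, and fermion fields anticommute, which is exactly what lets a nonzero object square to zero:

```latex
% Occupation of a single quantum state:
n_{\text{boson}} \in \{0, 1, 2, 3, \dots\}
\qquad
n_{\text{fermion}} \in \{0, 1\}
% Fermion fields anticommute (Grassmann numbers), so
\psi_1 \psi_2 = -\psi_2 \psi_1
\quad\Longrightarrow\quad
\psi\,\psi = -\,\psi\,\psi = 0
\ \ \text{even though}\ \psi \neq 0 .
```

Setting the two factors equal in the anticommutation rule is the whole trick: an object that anticommutes with itself must square to zero.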
So, the three non-gravitational forces that we know about (electromagnetic, weak, and strong) all correspond to bosons of a particular sort: spin 1 gauge bosons. Thus their mathematics is, superficially, almost identical. However, there are subtleties that make them behave very differently from each other in practice. Mathematically, these subtleties are characterized by symmetries, and I'll tend to speak loosely and use the words "force" and "symmetry" almost interchangeably below; hopefully this will not be too confusing. Electromagnetism is the simplest: it is what is called an "abelian" gauge theory, which means that the particles carrying electromagnetism (the photons) do not interact with each other. They are also massless. Because of this, light can travel in waves over arbitrarily long distances: looking up at the night sky, you can see some good evidence for this.
The strong force is a lot crazier. It is a "nonabelian" gauge theory; roughly what this turns out to mean is that it has more than one carrier analogous to the photon, and they interact with each other. They are called gluons, and it is possible for two gluons to combine to make a third, or for one to split into two, or for two to turn into two others, or one into three others, or three to combine into one. (This is a rather verbose way to say there is a 3-gluon vertex and a 4-gluon vertex in the theory.) The strength with which they interact with each other is large -- hence "strong" force. This leads to physics that is very different from electromagnetism: gluons can bind together into objects called "glueballs," and they bind quarks (we'll get to those later) together into objects called "mesons" and "baryons," of which the protons you and I are made of are examples.
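The word "nonabelian" just means the underlying symmetry transformations don't commute, and that failure to commute is what makes the force carriers interact with each other. You can check the distinction numerically in a few lines. (I'm using the 2x2 Pauli matrices of SU(2), the group behind the weak force, as the illustration here because they're the smallest example; the strong force's actual group, SU(3), works the same way with 3x3 matrices.)

```python
import numpy as np

# Abelian case: U(1) phase rotations, as in electromagnetism.
# Any two phases commute.
a, b = np.exp(1j * 0.3), np.exp(1j * 1.1)
assert np.isclose(a * b, b * a)

# Nonabelian case: transformations generated by the Pauli matrices.
s1 = np.array([[0, 1], [1, 0]], dtype=complex)
s2 = np.array([[0, -1j], [1j, 0]], dtype=complex)
s3 = np.array([[1, 0], [0, -1]], dtype=complex)

commutator = s1 @ s2 - s2 @ s1
assert not np.allclose(commutator, 0)   # the generators do NOT commute
# In fact [s1, s2] = 2i*s3: multiplying two generators produces a third,
# the algebraic analogue of two gluons combining into one.
assert np.allclose(commutator, 2j * s3)
```

The nonzero commutator is the whole story in miniature: in an abelian theory the carriers ignore each other, while in a nonabelian one the symmetry itself forces them to interact.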
The weirdest force, though, is possibly the weak force. It's also a nonabelian gauge theory, so once again the force carriers can interact with each other. However, in this case they have mass. The force carriers are called the W plus, W minus, and Z, and they have masses between about 80 and 90 times the mass of a proton. As fundamental particles go, that's really heavy. And it blatantly contradicts the symmetry principle that governs the way these gauge theories work in general. Roughly, what happens is that at high energies (high by the scale of particle physics, small for the real world: roughly 1000 times the proton mass), there is an exact weak symmetry, with three *massless* force carriers, again called W's. There is also a force that acts just like electromagnetism, called "hypercharge." Together these form the "electroweak" symmetry. Then something funny happens at this energy of 1000 times the proton mass: in the early universe, as it expanded and cooled, there was some sort of phase transition. The hypercharge and weak interactions got all mixed up, and the only remaining symmetry was the electromagnetism that we observe today. The rest of the original symmetry got "broken," but we still see remnants of it in the form of the W plus, W minus, and Z, and the weak interaction they carry.
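For readers who like to see the equations, the "mixing up" has a standard textbook form (none of this notation appears in the post itself; B is the hypercharge carrier, W^1, W^2, W^3 the three massless W's, and theta_W the so-called weak mixing angle):

```latex
% Electroweak symmetry breaking: SU(2)_L x U(1)_Y --> U(1)_em
\begin{align}
  W^{\pm}_\mu &= \tfrac{1}{\sqrt{2}}\left(W^1_\mu \mp i\, W^2_\mu\right)
      && \text{(the massive charged carriers)} \\
  Z_\mu &= \cos\theta_W\, W^3_\mu - \sin\theta_W\, B_\mu
      && \text{(the massive neutral carrier)} \\
  A_\mu &= \sin\theta_W\, W^3_\mu + \cos\theta_W\, B_\mu
      && \text{(the massless photon that survives)}
\end{align}
```

The photon is literally a particular blend of the original weak and hypercharge carriers, which is why we say the two interactions "got all mixed up."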
The punchline is this: this picture of the basic forces has been around since the 1970s, and we still have no idea how the transition that breaks the "electroweak" symmetry happened! The best explanation we have is called the Higgs mechanism, and you've probably heard of it if you've read any articles about the Tevatron collider at Fermilab or the Large Hadron Collider being built at CERN. It involves yet another boson, but this one isn't a gauge boson like the photon, gluons, W's, or Z; it's what is called a "spin 0" or "scalar" boson. It's a really clever but somewhat simplistic idea that, when tacked onto the Standard Model, can explain how this symmetry breaking happens. Most physicists are convinced that the simplest Higgs idea is not realized in nature, and that something more subtle is going on. Most candidates for that something more subtle, though, still involve a particle that looks just like the Higgs boson. This is why the search for a Higgs boson is a big deal: measuring the properties of one could help guide us toward which more subtle idea is right, while not finding one at all would indicate that we need a substantially new idea to explain how to get by without it. I'm hoping for the latter: if we don't find what we expect, there's a lot of fun work to be done to try to explain the discrepancy.
There are tons of things I haven't mentioned, like the fermions that make up matter (e.g., quarks that feel the strong force, and electrons that don't), the way the Higgs works, or why we think the Higgs alone isn't enough. But this was probably enough information to digest. I hope this has provided useful background for some of you; I'm not sure how to find the right balance between comprehensibility and amount of information. I would like for more of my audience to understand roughly what I'm talking about in the future when I start posting about some ideas in current research that I'm intrigued by. Feedback is welcome.
Posted by Ed
Sometimes I think that the biggest problem with contemporary American politics is that people take it too seriously. A case in point: as I graded papers this evening, I played my new DVD of Robert Drew's 1960 documentary Primary in the background. This film tells the story of John F. Kennedy's campaign against Hubert Humphrey in the 1960 Wisconsin primary and ends with a stirring conclusion: it plays a 1960 Humphrey theme song that was written to the tune of "Davey Crockett"! Can anyone imagine a serious presidential campaign playing such a cheesy tune these days?
I wonder exactly when it was that American politicians quit playing silly campaign theme songs. (Most political theme songs today are just regular songs that politicians frequently play at rallies; Bill Clinton rallies often featured Fleetwood Mac's "Don't Stop Thinking About Tomorrow," for instance.) When did this change come about? What does it say about larger changes in American politics? These days, the only politician I know of with an original campaign theme song is Barack Obama, the Democratic candidate for the Senate here in Illinois. Is it a coincidence that he's a fantastic candidate?
Posted by Ed
In a review for The Village Voice, J. Hoberman declares that Kill Bill, Volume Two "is full of flashbacks and fakelore, kung-fu catfights and tacky rear-screen projection, not to mention the sort of pranks that would have had a '70s 42nd Street audience bellowing expletives in delight." One of these days--perhaps even this afternoon--I'll describe my own reaction to Quentin Tarantino's latest film, but for now, I wanted to comment on Hoberman's use of the term fakelore, which I'm now officially designating as the cool word of the day.
I first came across the word fakelore three years ago, when I read Frank Miller's fascinating book Folklore for Stalin. Miller describes how Soviet writers of the Stalin era built on the style and form of traditional works like the epic poem and the ritual lament to write works that resembled folklore, and were presented to the public as the genuine product of the people, but that were really manifestations of the Stalinist cult of the individual; he termed these cultural products "pseudofolklore" or "fakelore." A paper I just found online, "Fakelore, Multiculturalism, and the Ethics of Children's Literature," uses a similar definition of the term:
Folklorists have been complaining for generations about what Dorson (1950; 1976) bluntly called "fakelore": the representation of materials written by professional authors as reproductions of the oral traditions of historical and ethnic communities. Some fakelore is total fabrication, utterly unconnected to any actual folklore source--the Paul Bunyan stories found in schoolbooks were never told by lumberjacks, Pecos Bill was not a cowboy hero, and all those cutesy "Indian" origin legends were created by nineteenth and twentieth-century romantics. Other fakelore caters loose adaptations to contemporary literary and moral fashions, "processed folk" as I like to call it (Singer 1988). In either case, the published material, however much it claims ancestry in a particular "folk" community, is written to appeal to the tastes and desires of publishers, promoters, and readers, instead of to reflect the narrative and intellectual sensibilities of real "folk."
Whatever it means, fakelore is a cool word. I'm curious if other people are familiar with its use in a non-academic setting.
Posted by Susan
Like Matt's, my recent lack of posting has been due to busyness rather than laziness. I am rotating once again (i.e., working for ten weeks in a lab to see if I want to do my thesis work there), this time while taking classes, which is rather taxing.
I never know how much it's kosher to say about ongoing research, but to put things broadly, I'm looking at BRCA1 in DT40 (chicken pre-B) cells. To the general public, BRCA1 is best known (if at all known) for being one of the two main genes associated with familial breast and ovarian cancer. The BRCA1 gene product is an extremely large protein involved in many processes--to name a few, regulation of growth and proliferation, RNA transcription, and (my focus) DNA repair. (Here's the OMIM entry if you're at all interested in BRCA1).
Due to my work, I've been thinking a lot lately about noncoding DNA*, so I was pleased to see the subject come up at The Panda's Thumb (due to the aforementioned busyness, I am writing about this quite a long time after seeing the article). I suppose I can't be touting The Panda's Thumb as a great new blog at this point, but it's certainly a great blog and worth checking out if you have any interest in evolutionary theory or science education. It's also fun if you like to tear apart anti-evolutionists--I always admire people who can attempt to argue with creationists or "intelligent design" devotees or the like. Anyway, the discussion at The Panda's Thumb focused more on introns and related topics, whereas my thoughts on noncoding DNA have concerned DNA repair.
Much of the early work on DNA repair was carried out in yeast and bacteria, whose genomes contain very little noncoding DNA (about 1% for yeast as compared to about 43% for the human genome). Both of these organisms primarily repair double-strand breaks (DSBs) by recombinational repair (aka homologous recombination or HR). In this process, the resected ends of DNA invade the intact homologous region on the sister chromatid to copy it, resulting in repair without loss of genetic information (though gene conversion may occur). One important feature of HR is that the resolution of the double Holliday junction formed from the strand invasions (helpful picture from Nat Rev Mol Cell Bio (2002) 3:430) can result in crossover or noncrossover products, depending on the resolution of the junctions. An interesting process (single strand annealing) shares some similarities with HR while giving only noncrossover products, but it's not germane to this discussion.
Nonhomologous end-joining (NHEJ) is a pathway present only in eukaryotes. It is the predominant pathway of DSB repair in humans. Unlike HR, NHEJ involves loss of genetic material. Basically, once a DSB forms, the ends are recognized, resected a bit, and then joined together. Microhomology (of two or three base pairs) is required in some forms of NHEJ. While this process is critical for generating genetic variety in T-cell receptors and antibodies, it can damage genes and may be involved in generating some of the chromosomal translocations found in cancers like chronic myelogenous leukemia.
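To see how joining at a microhomology loses sequence, here's a toy sketch in Python. The function name and the sequences are entirely made up for illustration, and this ignores the real repair machinery (Ku, DNA-PK, ligases); it only models the bookkeeping of the ends:

```python
def microhomology_join(left, right, min_mh=2, max_search=10):
    """Toy model of microhomology-mediated end joining: scan for the
    longest short exact match between the end of `left` and the start
    of `right`, anneal there, and keep only one copy of the overlap.
    Purely illustrative, not a model of the actual enzymology."""
    for k in range(max_search, min_mh - 1, -1):
        if len(left) >= k and left[-k:] == right[:k]:
            return left + right[k:]
    return left + right  # blunt join if no microhomology is found

# Two broken ends sharing a 3-bp "GCT" microhomology: the join
# collapses the two copies into one, so 3 bp of information are lost.
joined = microhomology_join("AAATTTGCT", "GCTCCCGGG")
assert joined == "AAATTTGCTCCCGGG"
assert len(joined) == len("AAATTTGCT") + len("GCTCCCGGG") - 3
```

Real NHEJ also loses bases to end resection before the join, so the actual deletions are typically larger than this toy suggests.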
So why do humans (and other higher eukaryotes) use such an error-prone method to fix their DNA? Most attempts to answer this question have focused on the difficulty of performing the homology search, a key component of HR, on the human genome. Two problems are evident: first of all, the human genome (6 billion base pairs) is considerably larger than either the yeast (12 million bp)** or E. coli (5 million bp) genome. However, the chicken genome (1.1 billion base pairs), which is more comparable in size to the human genome, undergoes a significantly larger portion of its repair through HR***, so genome size is unlikely to be the sole factor dictating the preference for NHEJ in repair of human DNA. This leads us to the other homology search problem--repetitive DNA elements, most commonly found in noncoding DNA. As I mentioned earlier, roughly 43% of the human genome is thought to consist of noncoding DNA. Genomes of lower eukaryotes and avians, which use HR preferentially over NHEJ, have much less. The theory is that, in an organism whose genome contains lots of repetitive elements, the homology search can lead to false matches of repetitive elements on different chromosomes (or on the same chromosome in the wrong place). If HR were performed on these false matches and crossover products arose, this would give rise to translocations, breakage cycles, circular chromosomes, chromosome loss, and other sorts of genomic instability that could be fatal for the cell (or, should the cell survive the changes and become cancerous, fatal for the organism).
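To put rough numbers on this argument, here's the back-of-envelope arithmetic, using only the genome sizes and noncoding fractions quoted above (these are approximations, not a real model of the homology search):

```python
# Genome sizes (base pairs) and noncoding fractions as quoted in the post.
genome_bp = {"E. coli": 5e6, "yeast": 12e6, "chicken": 1.1e9, "human": 6e9}
noncoding_fraction = {"yeast": 0.01, "human": 0.43}

# Total genome size alone can't explain the HR/NHEJ split: chicken is
# within a factor of ~5 of human, yet repairs preferentially by HR.
size_ratio = genome_bp["human"] / genome_bp["chicken"]   # ~5.5

# The repeat-rich noncoding content, where false homology matches lurk,
# differs far more dramatically between yeast and human:
human_noncoding = genome_bp["human"] * noncoding_fraction["human"]
yeast_noncoding = genome_bp["yeast"] * noncoding_fraction["yeast"]
noncoding_ratio = human_noncoding / yeast_noncoding      # ~21,500
```

A five-fold difference in genome size versus a four-orders-of-magnitude difference in noncoding content is at least suggestive that the repetitive-element problem, not raw size, is doing the work in this theory.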
Obviously, this sort of thing is rather hard to test, let alone prove, but I enjoy the speculation.
*Or, as Francis Crick would have it, "junk DNA". It's interesting to look at Crick's track record--he has managed to be both utterly right (structure of DNA, codons as triplets of bases) and utterly wrong (central dogma), and I suspect the whole noncoding DNA business is not going to go his way either. Regardless of how important noncoding DNA turns out to be, I agree with the commenters at the Panda's Thumb that the characterization of noncoding DNA as "junk" has probably retarded progress in the field.
**Fun fact of the day: the Saccharomyces genome database FAQ, in its miscellaneous questions section, addresses the question "I think I may have a yeast infection. What should I do?" thusly:
Unfortunately, we cannot directly help you because SGD is a scientific database that provides information about the molecular biology and genetics of the yeast Saccharomyces cerevisiae to researchers. We are not medical doctors and cannot give medical advice. You should speak to a qualified physician about any medical concerns. To find out more information about pathogenic yeast infections such as Candidiasis, you can go to a medical library at a local university, search the PubMed database for relevant literature, or browse the Candidiasis information at MEDLINE plus.
***Among the delightful consequences of this preference is the high specificity of gene targeting (an HR-dependent process) in chicken cell lines like DT40 cells.
Posted by Ed
This week's News of the Weird column in The Chicago Reader features an interesting item on the First Family:
In a December profile of presidential brother Neil Bush, the Washington Post described the breezy eighth-grade American history course that his company Ignite! is selling to schools. The course's methodology assumes that "hunter-warriors" (apparently Bush's term for rambunctious boys) don't have the patience to read and should be taught using music, animation, and other media. The Constitutional Convention of 1787, for instance, is cast as a rap song: "It was 55 delegates from 12 states / Took one hot Philadelphia summer to create / A perfect document for their imperfect times / Franklin, Madison, Washington, a lot of the cats / Who used to be in the Continental Congress way back."
It's hard to tell just how silly Bush's education software really is, though this item doesn't exactly make it sound good. (Neil has also been accused of trying to profit from his family connections by selling software to schools in Florida, where his brother is governor; the software in question includes test prep software for the standardized exam enacted under Jeb Bush.)
For more, here's an excerpt from the Washington Post article mentioned above:
Ignite! is designed, Bush said, to make learning fun for "hunter-warrior" kids who don't like reading. It's a computer curriculum that uses music, graphics and animation to teach middle school kids.
The program's first course -- eighth-grade American history -- was tested over the last two years in schools in a dozen states. Available commercially for the first time this year, it is being used by about 40,000 students in 120 school districts, mostly in Texas, at a cost of about $30 per pupil.
One school that uses Ignite! is Mendez Middle, a predominantly poor and Hispanic school in Austin. After three years of using the program, says Principal Connie Barr, the number of students who passed the state's eighth-grade history test has risen from 50 percent to 87 percent. "That's incredible," says Barr. "It doesn't replace the teacher or the textbook. What it does is give the teacher another way to deliver the information."
However, Ignite! has been attacked by other educators for dumbing down history. Among its controversial aspects is a lesson that depicts the Seminole Wars in a cartoon football game -- "the Jacksons vs. the Seminoles" -- the animated Indians smashing helmets with animated white settlers...
Ignite! is working well, Bush wrote in an e-mail: "Teachers and students have given anecdotal feedback that confirms the powerful impact our program is having on student achievement, student focus and attitudes, and teacher success in reaching all of their students."
But at Whitney reviews were less laudatory. "The kids felt pretty strongly that what this was about was lowering the bar," says Humes.
Humes wasn't impressed, either. "There was a lot of rhyming and games," he says. "It reminded me of what my son uses -- but he's in kindergarten."
(Thanks to Susan for pointing the article out to me.)
Posted by Ed
My plans for next year, like Matt's, have become much clearer over the last week. Yesterday I learned that I've won a Fulbright-Hays Doctoral Dissertation Research Abroad Fellowship, which means that my plans are now largely set: I'll be in Cambridge from September until late December, reading microfilms of Russian archival documents in the Harvard libraries, and then I'll go to Russia for nine months of research. I expect to spend roughly half my time in Russia working in the central archives in Moscow and half in several provincial archives; I expect to work in a Siberian regional center (most likely Novosibirsk), a medium-sized city that was not occupied by the Germans during World War II (perhaps Ivanovo, where I did research last summer), and a third site to be decided later. I also plan to undertake some research in Riga, Latvia, where it's possible to read secret police denunciations that are inaccessible to researchers in Russia.
My dissertation topic, for those of you who don't already know, is the Soviet Communist party's investigations of the misconduct of its members after World War II--a topic that includes everything from drinking and womanizing to religious observance and collaboration with the Germans. One of these days I plan to update this blog with a description of some of the reading I've been doing lately, but for now, I have too much reading and celebrating to do!
Posted by Ed
Here are some random links of the day:
Posted by Matt
I haven't blogged lately, and as opposed to the usual reason (laziness), this time it's because I have been exceedingly busy. I've made my grad school choice, after a lot of consideration. I'm going to Cornell. This seems to surprise a lot of people, who would have thought Harvard, Berkeley, or Stanford to be better choices. But I've given it a lot of thought. Here is my reasoning.
(This post might be somewhat more personal than most of the content of this blog, but it will provide the context for a lot of my future posts, and it is somewhat academic.)
I want to do particle theory, somewhere on the spectrum between collider physics and string theory. Mostly I think my interests lie in the area of model-building and general theoretical work beyond the Standard Model, but not formal string theory. However, I do want to be familiar with string theory and I also want to be able to do more concrete work with collider physics. When the LHC data begins coming in a few years from now, I want to be positioned to help understand it.
Now, all four of the schools I named above are quite good at particle theory. But the emphasis varies. At Cornell, Csaba Csaki is positioned more or less where I want to be on the spectrum, between string theory on the one end and collider physics on the other, but able to deal with both. Much of his work lately has been on model-building with large extra dimensions. (Mostly in warped geometry, inspired by the AdS/CFT correspondence.) However, his most recent paper is on the phase structure of SUSY gauge theories, building on recent work of Intriligator and Wecht. Maxim Perelstein is somewhat more toward the concrete, experiment-oriented end of the spectrum. Henry Tye is doing string cosmology. He knows string theory well, but is very focused on getting experimentally testable results. This is much along the lines of what people at Stanford are doing. He was one of the originators of the idea of inflation, so he knows cosmology well. Beyond these three people, there are experts on lattice QCD and on B physics, and probably a new hire in the next year or so.
What appeals to me most about Cornell academically is that Csaba's grad students have been involved in multiple projects, and have had a lot of publications. If I were to work with certain professors I spoke with at other schools, I would be primarily focused on one project. This could be interesting, but I think that as a grad student there are many aspects of physics I need to learn to do well, and I want to be able to publish in several of them. With Csaba I could work on extra dimensions, on supersymmetry, or perhaps on cosmology. With Maxim I could work more specifically on understanding experimental predictions of various models. And with Henry Tye I could work on string cosmology. What's more, it seems encouraged and very feasible to work with all of these people over the course of graduate school. Although the group is very small compared to, say, Berkeley, I think the breadth of my education there would be very good. After graduate school I think I will want to work with someone who is more focused on deeply understanding one big new idea at a time, like Nima Arkani-Hamed at Harvard or Savas Dimopoulos at Stanford.
Aside from academics, Cornell does have some advantages. (Yes, I can hear you all laughing, but I'm serious.) Ithaca is nice enough for a town of its size. The surroundings are pretty, and there's good hiking (not like I would have access to by driving a few hours from Berkeley, but still nice). Getting to New York City for a weekend is very feasible (for a day, not so much), which is nice, as I like New York, and I could easily visit my friend W. at Yale. I have a couple of friends in grad school at Cornell: a physics experimentalist I know from work and from quiz bowl at Chicago, who is in his first year there now, and a history student who was one of my roommates at Chicago, who also starts in the fall. Also, one of my undergrad friends at the U of C might be transferring to Cornell for her third and fourth years of college, because she wants to do art, and the U of C is not the best place for that. Besides this, the professors and grad students all seemed exceptionally friendly. And I don't mind cold weather, most of the time. I think it will be a nice enough place to spend four or five years; I don't think I would want to live in Ithaca on a longer term than that. Another advantage is its relative cheapness; with my NSF fellowship I should be able to easily travel to nicer climates if I need an escape sometime.
So, in the fall I will be at Cornell. I'll take a string theory course from Henry Tye, I might take some of the normal first-year grad stuff like statistical physics or E&M (after all, I've never been forced to work large numbers of Jackson problems), and I'd like to try to take algebraic geometry or Lie algebras in the math department. I also plan to dive into research right away, and to try TA'ing some classes (for the experience, and a bit of additional money). It sounds like a lot to me, but the grad students there tell me it's perfectly feasible.
I'm pretty busy preparing a talk for the American Physical Society April Meeting (held May 1-4, for some reason), but hopefully I can manage a regular rate of posting in the future.
Posted by Ed
I'm afraid I don't have much time for blogging today, what with the unholy trinity of taxes, teaching, and research, but here's an interesting link for you: a Wall Street Journal article by Richard Carwardine on the British view of Abraham Lincoln.
I found one passage especially interesting, but I suppose I'll need to read the author's book to see it fully developed:
Certainly I take very seriously Lincoln's moral relationship to power, and in this I differ from Charnwood only in emphasis, and not in general interpretation. What strikes the neutral reader is the tenacity of Lincoln's ethical convictions: his meritocratic faith; his belief that no one's opportunities for self-improvement should be limited by class, religious beliefs or ethnicity; his repugnance for slavery as a system that denied men their chance of moral and economic self-fashioning; his unwavering commitment to a Union freighted with moral value, as a democratic model; and his determination that the Union should not be lost on his watch. Lincoln's moral understanding of the demands of power was not founded on a conventional Christian faith. But the evolution of his religious thought, his quest to understand divine purposes during the war, his Calvinistic frame of reference, and the ease with which he rooted his arguments in scripture, make it essential to take his religion seriously.
I am also struck by the way Lincoln derived strength from his political relationship with evangelical Protestantism. This has been a neglected theme in previous studies, in Charnwood and since. Yet religion was deeply embedded in the culture of Lincoln's age. Mainstream evangelicals helped shape the new mass politics that reached their maturity at about the same time that Lincoln arrived at his. In Illinois, as elsewhere before the Civil War, the lines of party political division often coincided with religious ones. Alert to the influence of religious opinion, Lincoln's appeal blended Protestant conscience and Enlightenment rationalism. The orthodox Protestantism that sustained the Republican Party and much of the wartime Union coalition was not Lincoln's religion. But he successfully harnessed its power, both to win the presidency and then to rally support behind the war.
Update: The ongoing series of Civil War articles in The Washington Times discusses Richard Carwardine's new book on Lincoln.
Posted by Ed
When I rule the world, no one will be surprised at all when America's newspaper of record publishes a profile of a prominent translator (Robert Fagles) and a profile of a famous biologist (Francis Crick) on the same day. I wish the profile of Fagles had discussed the nuances of translation in more detail, but it was a nice article nonetheless.
On a completely unrelated note, here's a Washington Post article on the divorce and custody battle surrounding Gulnora Karimova, the daughter of Uzbekistan's president. Larry McMurtry's New York Review article on Ulysses Grant has some interesting moments, too.
Posted by Ed
Here are some random links to tide you over until I have time to write something more substantive:
Posted by Ed
Today's Boston Phoenix features a short article on this year's Pulitzer Prize for investigative reporting, which went to The Toledo Blade. As Dan Kennedy writes,
The Blade’s reporters found that an elite unit of American troops called the Tiger Force did terrible things in Vietnam in the late 1960s. They murdered innocent people in cold blood, some as they were begging for their lives. They tossed grenades into tunnels, where they knew elderly folks, women, and children were hiding. They cut off the ears of their victims and made necklaces of the grotesque souvenirs. In all, the Blade reported, the 45-member Tiger Force may have killed hundreds of unarmed civilians during a seven-month period in 1967.
"We were living day to day. We didn’t expect to live. Nobody out there with any brains expected to live," a former Tiger Force sergeant named William Doyle told the paper. "So you did any goddamn thing you felt like doing — especially to stay alive. The way to live is to kill because you don’t have to worry about anybody who’s dead."
The series, "Buried Secrets, Brutal Truths," concerns some of the most emotionally charged territory in American life, even today, some 37 years after those events took place. Vietnam was at the heart of a right-wing attempt to smear Senator John Kerry after he’d wrapped up the Democratic presidential nomination. Critics dug up testimony Kerry had given before the Senate Foreign Relations Committee in 1971 about atrocities he’d heard his fellow veterans attest to — rapes, mutilations, torture, random killings, and the like (see "Sex, Lies, and Republicans," News and Features, February 20) — and held it up as evidence that Kerry was somehow unpatriotic. The Blade has shown that such horrors may have been commonplace — and that Kerry’s testimony spoke to something very dark and very true about the Vietnam War.
Posted by Ed
I've been really busy lately, spending a lot of time on the writing course I'm teaching and still more time reading theoretical works related to my dissertation. (I've been reading so much theory, I'm afraid, that Michel Foucault appeared in one of my dreams, though he didn't do anything interesting.) I wish I had more time to write for this blog, and one of these days I will.
I just can't resist linking to this Common Review article about the Moscow production of Chicago, the Bob Fosse musical that was turned into an Oscar-winning movie. The article's loaded with fun stuff and I'd strongly recommend it.
Posted by Ed
The 2004 Pulitzer Prizes have now been announced, and two awards went to books in Soviet history: William Taubman won the Pulitzer in biography for Khrushchev: The Man and His Era and Anne Applebaum won the prize in general nonfiction for Gulag: A History.
I'm a little embarrassed to admit that I haven't read either book in its entirety. My sense is that a lot of Soviet historians aren't huge fans of the Applebaum book: I'm told that the middle sections are solidly written and researched, but that the opening and the conclusion frame the discussion in terms that most professional historians of the Soviet Union would object to. (Unfortunately, the more questionable theoretical sections of the book have arguably had a larger impact on understanding of Soviet history than the more empirical sections in between.) I've read about half of Taubman's book, which struck me--in general--as an excellent introduction to Khrushchev's life for the non-expert. Certain passages struck me as a little misleading about the current state of the historiography, but mostly in ways that a non-expert wouldn't notice.
Neal Ascherson reviewed Khrushchev in The London Review last summer, and his essay included one passage so delightful that I just have to repeat it:
His personality was horribly deformed; his crimes were unforgivable. And yet his lust for the new was disarming. I will never forget a story Taubman tells about his London visit in 1956. What, he asked his Foreign Office escort, was that odd 'oo, oo!' noise coming from the back of the crowd? The diplomat explained that people were booing, an expression of disapproval. Khrushchev grew thoughtful. In the back of the car, he said experimentally to himself: 'Boo!' And then again: 'Boo!' He liked it. For the rest of the day, he went around exclaiming 'Boo!' to all kinds of puzzled people. He had learned something.
Posted by Ed
Here are two recent articles on movies that are worth checking out:
Frankly, I have never looked at my watch as often during a movie as I did in "The Return of the King." Toward the end, I found myself desperately cheering on the giant spider in hope of getting home early. Eat Frodo! Eat him!
If the obsession with expensive technology and shallow effects is to ruin Hollywood film as an art form, by all means let the deed be carried out with the help of talented New Zealanders. The visual effects, costumes and makeup Oscars for "Lord of the Rings" are richly deserved. But beyond that, are these movies, or any of the over-technologized films of our epoch, of lasting value? Let's get a grip.
Even so, I enjoyed Dutton's article--and I think it's especially interesting alongside Teachout's commentary on digital animation in Finding Nemo (part of the Triplets review above). I wish that there were more blogs willing to take a contrarian stance on popular movies, instead of just asserting that a movie is great and urging readers to go see it.
Tangential update: Michael Medved has published a Wall Street Journal article on Jack Valenti that touches on an interesting theme: the decline in movie audiences. I don't think you can understand movies as a cultural phenomenon without understanding their viewership, though I'm too lazy to discuss this question today. (See the brief comment on Casablanca and The Wizard of Oz above--and remember that movie audiences were much larger in 1939 than in 1965, when--Medved writes--they were far larger than they are today.)
Posted by Ed
Here are some random, disjointed, and poorly developed thoughts on several recent movies I've seen:
There were times, however, when the movie would have benefited from a deeper and more nuanced look at East German communism. One reason for the film's success in Germany, after all, is a widespread feeling of "Ostalgie," or nostalgia for the communist East; the movie's mother figure is portrayed as an ardent and enthusiastic Communist, and her elderly neighbors express their unhappiness with post-1989 changes in society. It would have been nice if the movie had confronted this phenomenon head on. If the movie had been called Goodbye Goebbels, and had dealt with the main character's efforts to recreate the Third Reich in his mother's apartment, the public would have reacted with anger and distaste. What makes Communism different? Is Ostalgie a true nostalgia for Communist rule, or an accidental nostalgia that results from legitimate unhappiness with the present day and then merely falls back on a recent era that happened to be Communist? What did people admire in East Germany, and were they ever correct to do so?
At times I thought that Goodbye, Lenin! was about to make a more sophisticated point about Ostalgie. At one point, I expected the mother to tell her children that her love for the East was a charade, but she didn't do so. I couldn't tell whether she caught on to her son's trick in the end and simply decided to go along with his games, or whether she remained clueless about her country's fate until she died. Finally, the central character eventually acknowledges that the East Germany he'd created for his mother had become the East Germany he would have wanted himself. How, if at all, did this country differ from the real GDR?
There is a prominent American philosopher who has devoted a good part of his distinguished career to exploring such connections. In his review of "Eternal Sunshine," Slate's David Edelstein noted that it fit the template, laid out by the Harvard philosophy professor Stanley Cavell in his 1981 book "Pursuits of Happiness: The Hollywood Comedy of Remarriage." By fortuitous coincidence, a review copy of Dr. Cavell's new book "Cities of Words" (to be published by Harvard next month) landed in my mailbox on the same day that I went to see "Eternal Sunshine." The book, which follows the outline of a popular undergraduate course Dr. Cavell taught for many years, juxtaposes canonical texts of Western moral philosophy with studio-era comedies and melodramas, with occasional excursions into Shakespeare and Eric Rohmer.
While it would be exaggerating to compare this dazzling, rambling 500-page intellectual excursion to one of Mr. Kaufman's scripts, it is possible to imagine a character like Dr. Cavell popping up in one of them, a kindly wise man who might help the main characters solve their problems, or who might succeed only in making them worse. (Even his title — Walter M. Cabot Professor Emeritus of Aesthetics and the General Theory of Value — sounds like something Mr. Kaufman might have dreamed up.) But whether or not Mr. Kaufman has read Dr. Cavell, his latest movie confirms — and extends — the philosopher's notion that what is at stake in a certain kind of romantic comedy is also at stake in the strain of thought he calls "moral perfectionism."
I'm not going to try to answer these questions, in large part because it's now been several weeks since I saw the movie and my memories of it are no longer completely clear. These questions point to one of the claims I made when I wrote about Eternal Sunshine in this blog, however: given the movie's obvious surface-level appeal and its reputation as a "smart" film, it's easy for a viewer to decide that he or she really likes it without really understanding it. Many viewers and reviewers of the film seem to believe that it's a straightforward romantic movie, for example; just yesterday, I read a commentary on the film claiming that its main message was that "love conquers everything, even efforts to erase it" (or something like that).
I can't help but think that this is a naive and unsophisticated reading of the movie. One could just as easily leave Eternal Sunshine with a very different conclusion--that it's easy for someone to act self-destructively or foolishly when he's enraptured by a member of the opposite sex. According to the IMDB, an early draft of the script had an alternate ending, in which Kate Winslet's character decided to have her memories of the Jim Carrey character erased again; the audience would then learn, through a view of a computer screen in the office, that she and Carrey had erased their memories of each other several times before. I'm glad that this scene was deleted--it's a little too clever for my tastes, and it would have shifted the emphasis of the movie away from the universal problems facing people in relationships toward the personal eccentricities of the main characters. Even so, I think the fact that this ending was once in the script is evidence that the movie shouldn't be read as completely optimistic about the prospects for true love in the world.
Of course, I don't want to suggest that there's nothing romantic about Eternal Sunshine. I can even think of a way to argue that the movie really is a romantic film about the triumph of love, but I think that this argument is wrong--and raising it would involve listing several spoilers. Even so, what makes the movie really intriguing is its combination of sentimentality and cynicism, realism and surrealism. That's a mixture that gets short shrift whenever someone overemphasizes the romantic side of the story, and that really needs to be appreciated for a genuine understanding of the movie.
Will seems to think that I didn't much like Eternal Sunshine, which surprises me: my blog entry on the movie (which he linked to) strikes me as very favorable overall. (I'd rank Eternal Sunshine with The Triplets of Belleville as one of the two best movies I've seen this year.) True, I did write that some viewers of the movie seem to have enjoyed it without fully understanding it, but I consider that a criticism of the viewers--not of the movie itself. Moreover, I get the sense that I touched a nerve when I criticized Shakespeare in Love, which--as I just learned when I searched Will's blog for references to Eternal Sunshine--Will praised in the very blog entry in which he first wrote about Eternal Sunshine. Perhaps Will read this (incorrectly) as a veiled criticism of him.
I think there's something deeper going on here, however. Several months ago, after I watched the movie Lost in Translation for the first time, I wrote an entry with several criticisms of the film. I tried to make it clear that Lost in Translation was worth seeing, but I think I summed up my attitude when I wrote, "It was a good movie, and I enjoyed seeing it, but I can't quite figure out why so many people are so excited about it." Another blogger, commenting on what I wrote, said "Ed at Gnostical Turpitude really didn't like it... I had a somewhat more positive take on the flick. At least, I enjoyed watching it." This struck me then--and strikes me now--as a misreading of what I wrote.
Maybe I was just unclear in my comments on both movies: it's quite possible that my commentary on each of them sounded more negative than I intended. (It's certainly true that my criticisms of Lost in Translation were more memorable and more developed than my praise.) Nevertheless, I think there's something more going on here--something connected to the way that people read and write commentary on movies and on popular culture in general.
I've always thought that blogs are a potentially important new outlet for intelligent commentary on movies, but they've never lived up to my expectations. The best reviews, I believe, are those that try to say something interesting about the subject in question, and aren't just straightforward recommendations on whether or not to see a movie; that's one reason that many of the most entertaining and interesting reviews come from magazines that aren't aiming at a mass audience, rather than from mass-market newspapers. Today's New York Times article was an essay in the arts section, not an actual review; in general, I prefer reviewers at publications like Slate and The Chicago Reader to newspaper reviewers.
I've only read a very few fun blog posts about movies, however: Naomi Chana and Timothy Burke wrote insightful essays on The Return of the King, but they're the only bloggers I can think of who've written memorably on film. (Terry Teachout has written some nice blog commentary on movies too, but I'd put him in a separate category of writers, since he's a critic in his day job.) More often, blog entries about movies seem intended merely to evaluate or to judge--to give a quick-and-dirty assessment of the film, without really telling what makes it interesting, important, or appealing. In this sense, they're just as shallow and uninteresting as the reviews in a small-scale American newspaper.
There's a sense in which this style of movie blogging resembles a lot of blogging about politics. In general, political blogging has left me underwhelmed: most of the good political blogs give you analysis and reporting on very current issues--analysis that's worth reading, but that would be outdated before it could appear in a magazine. (Josh Marshall and Kevin Drum are among the best practitioners of this style of political blogging.) On the other hand, far too many political bloggers merely assert their opinions on a wide range of issues--issues they often know little about--and attract the interest of people who agree with them. I doubt that Instapundit will ever change anyone's mind on a serious issue, after all.
There's a real opening for another type of political blog, however. This type of blog would feature essays that are more informal, speculative, and tentative than you're likely to find in an opinion journal, but longer than what you expect to see on an op-ed page; these essays are harder to write, take a lot of time, and can't be produced with the regularity or frequency that characterizes most blogs. There's only one political blogger I know of who writes entries like this, and that's Mark Schmitt--who, just this week, has written insightful entries on the role of ideas in politics and the political future of the white South. I can't really imagine political analysis of this sort appearing anywhere but in a blog, but most political bloggers are more interested in writing narrower analyses that focus on political horse races and judge small-scale trends in politics. That's a shame.
My point, then, is simple. Bloggers run a very real risk of becoming opinionated without having opinions on things that actually matter, or of constantly providing readers with their thoughts without writing anything thoughtful. (I sometimes feel that I fall into this trap myself: I worry, occasionally, that I write a lot without ever having the time to sit down and put much thought into what I'm producing.) Readers of blogs can fall into a similar trap--of looking for a quick lesson or message in a blog entry when there isn't one readily available, or assuming that a blog entry on a movie is intended to give a simple recommendation on whether it's worth seeing when that isn't really its goal. That's one of the downsides of blogging, and I hope I don't fall into either one too often.
Posted by Ed
Today I'm feeling lazy, so I'll just provide more links instead of real content:
Posted by Ed
Something I did not know: the novelist Jonathan Franzen is a 1981 graduate of my alma mater, Swarthmore College.
Franzen recently visited Swarthmore, giving both a casual talk and a more formal speech (that included a reading from his novel The Corrections). Here's a quick excerpt from the daily Swarthmore email newsletter's article on the visit:
When asked how Swat academics affected his creative expression, Franzen recalled the first paper he wrote for English professor Chuck James's class, which received a C+. He then discovered that good writing was hard work, and strove to avoid classes that required papers.
He also noted that there is a grand Swarthmore tradition of pretending knowledge, and that that skill is a useful one for novelists, who "need to pretend to know everything on the page."
On what to avoid as a young writer, Franzen advised Oprah and cigarettes.
Posted by Ed
When you're travelling in a foreign country, you expect to come across signs that American pop culture has been transplanted abroad. Nevertheless, it's sometimes really surprising exactly what you can find. When I spent the summer in Germany as part of a high school exchange program in 1992, I was startled to learn that the TV character Alf was really popular. (A quick look at the IMDB reveals that the show had already been off the air for two years in America.) I was also amused to see that one of Russia's main TV networks presented a day-long Alf marathon when I visited Moscow in 2002 (and that the show also appeared on TV on weekday evenings).
I've often been weirdly intrigued by seeing which examples of international pop culture carry over into other countries. During my 2002 visit, I wasn't at all surprised to see Russian-produced versions of the TV shows "Who Wants to be a Millionaire" and "American Idol," but I wasn't expecting the aforementioned "Day of Alf." Both shows were based on formats that originated in Britain and were easy to translate into another language and cultural milieu, after all--but I have no idea why Russian audiences would want to watch a badly dubbed version of a run-of-the-mill American sitcom that had been off the air for ten years. Do obnoxious-but-furry space aliens have a special hold on the hearts and minds of Europeans?
What determines which pop culture phenomena are successful in which other countries? I was reminded of this question today when I saw a Moscow Times article on Cheburashka, "a cuddly, furry animal dreamed up by Soviet children's author Eduard Uspensky and brought to life in the 1960s and 1970s in a series of endearingly clunky animated films." Cheburashka has become a pop culture phenomenon in Japan, where he's known as "Chebi" and emblazoned on T-shirts and stationery. A fierce international copyright dispute might keep the Cheburashka craze from taking off overseas, but there's already another poorly-animated Soviet cartoon waiting in the wings: Varezhka, a mitten that was magically transformed into a puppy for a lonely young girl who needed a friend. Exciting, huh?
Russian children are also big fans of Karlsson on the Roof, a book by the Swedish children's writer Astrid Lindgren. I know several Russian-Americans who say that it was once their favorite story, and I'm told that the book's popularity in Russia rivals its popularity in Sweden; the book is out of print in America, however, with the most recent U.S. editions published in 1975 and 1985. Nevertheless, many Americans are familiar with the works of Astrid Lindgren, who's best-known in this country for Pippi Longstocking. For whatever reason, one of the best-known books of a popular Swedish novelist has become a beloved children's classic in Russia, but is scarcely known in America--even by children who eagerly devour the author's other books. Popularity, it seems, is as unpredictable as it is short-lived.
Posted by Ed
Yesterday's Chicago Tribune featured an interesting article about the lawyer Geoffrey Fieger. Fieger is best known for defending Jack Kevorkian and for his eccentric forays into Michigan politics, and here's how the Trib describes him:
Fieger sues and, more often than not, Fieger wins. Theatrical, bombastic and occasionally outrageous, Fieger, 53, has a knack for persuading juries to grant multimillion-dollar awards, including a $25 million liability award against "The Jenny Jones Show" in 1999, though that was later reversed on appeal. He has set the legal gold standard for malpractice cases in the past 25 years.
It's easy to make fun of this program, given Fieger's rather extreme interpretation of how theatrics should be used in the courtroom. Here's what Michael Barone's Almanac of American Politics has to say about Fieger's 1998 campaign for governor of Michigan:
Suddenly the spotlight was on Fieger, and it was not an attractive sight. [Governor John] Engler, he said, was "fat," a "moron," a "racist," the product of barnyard miscegenation. He criticized his fellow Democrats as well. At a unity breakfast, he said they were "a party of wimps and oatmeal"; he called [Detroit Mayor] Dennis Archer "a slow learner." He called Catholic Archbishop Adam Maida a "nut" and when the Council of Orthodox Rabbis called assisted suicide murder, he said "They are closer to Nazis than they think they are."... Fieger did have a program--cutting the sales tax and property taxes and repealing the gas tax and single-business tax--and spent $5.7 million of his own money on his campaign, but, despite his contempt for others' intelligence, showed no mastery of state issues.
Even so, I can't help but wonder if this program is on to something, and not just in legal education. In general, I think it's fair to say that American graduate schools in history don't do a fantastic job teaching their students how to be good teachers. Certain dramatic techniques might well help lecturers to give effective presentations to their classes. Many history professors might benefit from greater appreciation of the fact that history can be told as a story--though that's a point I certainly don't want to overemphasize. I'd be wary of any program that treats lecturing as nothing more than a variety of drama, of course, but I can't help but think history education--or even graduate education in general--would benefit from increased attention to the art of communication.
Posted by Ed
Eugene McCarthy was a fascinating man. Louis Menand is an interesting writer. Menand's New Yorker review of Dominic Sandbrook's new McCarthy biography is therefore a fun read.
My favorite detail from the article: "McCarthy once spent a year in a monastery, with a view to becoming a priest; he had already got engaged, and his fiancée had to wait for him to change his mind and come out before they could marry." (It's not clear exactly what this says about McCarthy's character--beyond the obvious conclusion that he was serious about Catholicism.) There's lots of other fun stuff in there, too--like Sandbrook's suggestion that LBJ or Hubert Humphrey could have won the 1968 fall election if McCarthy hadn't challenged the president in the primaries. Menand, I think, is correct to dispute this, and I enjoyed his claim that McCarthy's subsequent presidential campaigns "suggested the same unpleasant combination of piety and frivolity as John and Yoko's bed-ins for peace."
It's easy to quibble with parts of Menand's analysis. Menand writes that Sandbrook's McCarthy bio would make a "worthy companion" to Rick Perlstein's recent book on Barry Goldwater, adding "Perlstein is interested in the story; Sandbrook is interested in the analysis." Menand is correct to praise Perlstein's book, but I don't think this comment does Perlstein justice. Even so, I'd recommend this article to anyone interested in 20th-century American politics.