Thursday, September 30, 2010

Author's Big Mistake

A few months ago I wrote what I thought was a mostly laudatory review in MHQ of a new book by the military historian Victor Davis Hanson. Although the book is a bit of a hodgepodge — it's mostly a collection of previously published magazine articles and repurposed book reviews — I was particularly interested in, and impressed by, Hanson's nuanced discussion of the difficult time military history faces in the American academic world.

It has always been striking to me that in Britain, military history has no trouble being taken seriously as a scholarly field of inquiry; yet in America, military historians, particularly those who concern themselves with the operational level of war — in other words, the fundamental questions of how wars are fought and won — tend to be looked down on by "serious" historians.

As Hanson notes, when war is studied in American universities, the focus these days tends to be on the usual suspects of race, class, and gender. Those are not misguided inquiries in and of themselves, Hanson is quick to acknowledge; yet it is absurd that Rosie the Riveter or "the face of battle" experienced by the ordinary soldier should be studied and taught to the exclusion of the traditional, big questions of military history: Why do wars start? Why do the winners win and the losers lose? What do wars accomplish, or fail to? I couldn't agree more.

I also offered a few criticisms of the book, in particular the way that Hanson — who began his career studying the wars of ancient Greece and who is now a fellow at the conservative Hoover Institution — repeatedly tries to argue that criticisms of the Bush administration's handling of the war in Iraq are nothing but ignorant fault-finding by people uneducated in the lessons of military history. Over and over he insists that were it not for our "historical amnesia," we would not complain about the mistakes made, the length of the war, the fabricated intelligence about WMD, or the torture scandals, since, for example (as he writes in one particularly egregious passage), "almost every American war involved some sort of honest intelligence failure or misinterpretation of an enemy's motive."

I actually think I went rather easy on him on this point, because looking back now, it is exactly the sort of rhetorical smarminess exhibited in that sentence — slipping in the word "honest" as if it were an undisputed fact — that makes so much of his argument concerning Iraq and the Bush administration's decisions mendacious. Hanson's failure ever to honestly consider or address the costs (and opportunity costs) of the war is part and parcel of his approach: it is idle to argue (as he does) that the surge "worked" or that good has been done in Iraq without considering the price that has been paid in fighting the war in the first place and the toll it is still taking (in lives, strained budgets, military readiness, political and diplomatic capital, and above all America's ability to deter and fight elsewhere if needed to protect our vital interests).

In any case, Hanson responded with an e-mail to the editor saying he had "mixed" feelings about my review, which I presume meant he liked the praise but didn't like the criticism; he then followed it up with a classic example of what Paul Fussell once termed the "A.B.M.": the "Author's Big Mistake," viz., an aggrieved and indignant letter for publication from the author explaining that his book is much better than the reviewer (biased as he is) allowed.

As Fussell explains the "dynamics of the author's angry letter":
He or she reads the unfavorable review, which is of course a shock, since author, editor, family, and friends have been telling each other repeatedly how great this book is. Finding out there is a stranger who doesn't think so, the author takes pen in hand and dashes off a letter of protest, quite forgetting Harry Truman's maxim "If you can't stand the heat, stay out of the kitchen."
You can read the published exchange here, and decide for yourself whether — as Fussell warned — the chief effect of the A.B.M. "is simply to reveal to an amused audience how deeply the author's feelings have been lacerated by the criticism he himself so sedulously solicited."

Fussell explained that, for one thing, all serious authors recognize — in the words of Edna St. Vincent Millay — "A person who publishes a book willfully appears before the populace with his pants down. If it is a good book nothing can hurt him. If it is a bad book, nothing can help him." Or, as E. M. Forster put it, "No author has the right to whine. He was not obliged to be an author. He invited publicity, and he must take the publicity that comes along."

Fussell is especially scornful of the author's complaint that he must reply to "clarify" or "set the record straight" because he has been "misunderstood":
If he has [been misunderstood], it's his fault . . . It's his fault because, as a writer, he's supposed to be adept in matters of lucid address and explanation, and if he's failed there, he's failed everywhere.
For another thing, there is the fact (which I can attest to from many personal experiences) that "unfavorable observations in reviews tend to be remembered only by authors or reviewers, very seldom by readers." Fussell recalled once having "winced" through a hostile review of one of his own books ("a sad disappointment"; "well-informed fatuity"; "chirpy facetiousness"; "prissy hauteur") and a few hours later been called by friends congratulating him, in all sincerity, on the great review: what they came away with was simply that some major periodical took the book seriously and gave it a lot of play.

I confess I failed to heed Fussell's wise suggestion to the reviewer in the face of the Author's Big Mistake: "Editors often try to cajole the original reviewer into composing an 'answer' to the complaint. The best advice to reviewers is that ascribed to the British Foreign Office: never explain, never apologize. And, in addition, never write without payment."

But Fussell's absolutely golden bit of advice for authors tempted to whine about an unfavorable review, which he gave in an interview that I edited back in my U.S. News days and which struck me as both hilarious and perfect and which as an author I've tried to heed ever since, was this:

"Thin-skinned people should stay out of show business."


---

A pdf of Fussell's original essay is available here; he revisited the topic in "A Power of Facing Unpleasant Facts," which appears in Thank God for the Atom Bomb and Other Essays.

I'm off for a few days to Denver, where I've been invited to give a talk to the National Animal Interest Alliance that's a bit of a blast from my authorial past: I'm giving a paper on the evolution of the relationship between human beings and dogs.

I'll be back to the blog next week.

Tuesday, September 28, 2010

Who you calling a millionaire?

In the 1930s, a candidate for governor of one of the Southern states produced a devastating charge against his opponent: he was a millionaire.

Nowadays we are routinely treated to the spectacle of multi-millionaire, right-wing conservatives denouncing their liberal opponents as members of the "elite." Obviously the popular definition of "elite" has changed.

One thing that has not changed is the sense of entitlement to power among the (economic) elite. This has always posed a psychological challenge in a democracy, where one is supposed to respect the will of the people; thus ever since Franklin Roosevelt's day, the common response of the economically powerful to failure at the ballot box has been to portray their opponents' victory as somehow fundamentally illegitimate.

The attempts to portray President Obama as a secret Muslim, as foreign-born, as foreign-influenced, as a "socialist," as un-American; the sinister readings given his most innocuous moves — all might have been taken word for word (with the only possible exception being the substitution of "Muslim" for "Jew") from the flood of poisonous calumny, innuendo, and rumor that beset FDR throughout his presidency and indeed even after his death.

Of course, FDR was branded a "socialist," but that was nothing. "The rich," wrote William Manchester, "regarded the administration of Washington as though it were an alien government." FDR, according to stories that freely circulated in the better circles, had been infected with gonorrhea by Eleanor, who herself had got it from "a Negro." She was going to turn the country over to the Russians when he died. He was "nothing but a New York kike" anyway, whose family had changed their names; an elaborate genealogy proved he was descended from a Colonel van Rosenfeld.

Manchester catalogued the clichés repeated over and over: FDR was trying to destroy the American way of life; you can't spend your way out of a Depression; our children's children will be paying; half the people on relief are foreigners anyhow; the New Deal was under the "insidious influences" of "foreigners and transplanted Negroes."

In 1936 (in an article in Harper's titled "They Hate Roosevelt") Marquis Childs described the "fanatical hatred of the President which today obsesses thousands of men and women of the American upper class. No other word than hatred will do. It is a passion, a fury, that is wholly unreasoning. It permeates  . . . the whole upper stratum of American society. It has become with them an idée fixe."

Childs went on to say, with an optimism that we today would envy, that this irrational hatred of the President was "a phenomenon which social historians of the future will very likely record with perplexity if not astonishment."

Sunday, September 26, 2010

The teflon doomsayers

In The Rational Optimist, Matt Ridley offers example after spectacular example of a phenomenon that has baffled me ever since I began covering environmental issues in my first job in journalism thirty years ago: to wit, that while the entire presumable goal, purpose, and raison d'être of applied environmental science is to solve environmental problems, any environmental scientist who dares to suggest that problems are being solved is asking for trouble. As Ridley observes, we have arrived at a state where even the most wildly irrational pessimism is treated with reverence, while the most cautiously sober optimism is ridiculed.

Some of this is human nature and was ever thus; intellectuals, as The Rational Optimist reminds us, have been decrying modernism ever since modernism began. Actually, I wouldn't stop there: the belief in a lost golden age is as old as civilization, as is the intellectual vanity of casting oneself as the lone uncorrupted voice in the wilderness. A few thousand years before Dostoevsky, Malthus, George Orwell, and Paul Ehrlich, the Hebrew prophets were pouring out gloom and dismay with the best of them, dismissing the superficial comforts of the civilized world and its material rewards as a fool's paradise. Pessimism is what people with deep minds and deep souls have; optimism is what idiots with vacant grins on their faces have.

Pessimism is of course a proven fund-raising tool; "save the whales!" is always going to bring in more cash than "the whales are being saved!" But much more than that, we have today the amusingly ironic spectacle of tenured professors with salaries, health insurance, lifetime job security, and excellent retirement plans courtesy of TIAA-CREF being showered with worldly rewards (bestselling books, "genius" awards) for telling us that progress is an illusion and the end is near . . . while still preening themselves as daring outsiders courageously taking on the mighty and powerful. The fact that it takes no daring at all to adopt such an intellectual posture these days does not stop any of the practitioners of this business model from invariably announcing themselves to be the bearers of "dangerous" or "heretical" ideas and congratulating themselves for "speaking truth to power."

So there are understandable reasons why it pays to say that things have gone to hell and will continue to go to hell.

What I find almost inexplicable in all of this, however, is how the scientific doomsayers get away over and over again with making predictions that are fabulously, ridiculously — and demonstrably — incorrect, without the slightest repercussions upon their credibility or careers. Predictions of impending doom are published based on absurd methodologies and threadbare evidence of a kind that in the normal course of scientific affairs would be sufficient to ruin careers ten times over, and the authors walk away from them without a scratch.

Ridley has a number of remarkable for-instances in his book, many provided by MacArthur genius award winner Paul Ehrlich — who in addition to insisting in 1971 that the world had already lost the race to feed an expanding population and that mass starvation in the 1970s and 1980s would cause death rates to soar and world population to collapse to 2 billion, also declared around the same time that because of exposure to cancer-causing chemicals that had already occurred, "the U.S. life expectancy will drop to forty-two years by 1980, due to cancer epidemics."

The astonishingly wrong and repercussion-free prediction of imminent doom that first riveted my attention was the claim of the impending mass extinction of the Earth's species. In 1979, the biologist Norman Myers declared that a fifth of all species on the planet would be gone within two decades. This prediction was based upon . . . absolutely no evidence whatsoever. Myers acknowledged that the documented species extinction rate of animals was 1 per year; he then asserted that scientists had "hazarded a guess" that the actual rate was 100 per year; he then speculated that government inaction was "likely to lead" to several thousand or even tens of thousands a year, which would add up to as much as a million species over two decades. (This was when people thought there were 5 million species; the best guess now is at least 10 million.) It swiftly became conventional wisdom.

Subsequently, an attempt was made to give these made-up numbers a patina of scientific respectability that was in many ways an even worse abuse of scientific logic and evidence. In the 1990s E. O. Wilson began citing the so-called "species–area relation" as the basis for predicting that tens of thousands of species were being extirpated a year by habitat loss caused by forest clearing. Wilson popularized various numbers ranging from 4,000 to 100,000 species a year being lost, and these numbers were repeated over and over again in environmental groups' fundraising literature, in congressional testimony, in speeches by Al Gore (who in 1993 said that "one-half of all species" could disappear in our lifetime, apparently an extrapolation of Wilson's and Ehrlich's pronouncement, in a 1991 paper in Science, that as many as a quarter of all rain forest species will disappear in 30 years).

I started to look into the science and mathematics of the species-area relation when my father, who was an applied mathematician at Harvard University, mentioned to me an Op-ed in the New York Times by his fellow Harvard faculty member Wilson that included a description of this formula, which had struck my father as absurd on its face as a mathematical model. The formula most often used is:

             S = C·A^z

where S is the number of species, A is area, and C and z are arbitrary constants (z being the exponent to which area is raised) tweaked to make the curve try to match the data. Basically, the formula says that if you count the number of species on, say, islands of varying sizes, the bigger the island, the more species you will find. Wilson's argument was that if you start cutting down rainforests, say, you'll shrink the number of species contained in them according to the same curve.

The prima facie problem, which irked my father, is that the dimensions of the arbitrary constant C vary according to the numerical value of the other arbitrary constant z. Without going into the technical details too much, this is (as my father put it) "cockamamie" from any scientific perspective; it means that this is just an exercise in curve-fitting, not a scientific model based on any cause-and-effect understanding or mechanism.
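
For the quantitatively curious, here is a minimal sketch of what that curve-fitting exercise amounts to. To be clear, this is my own illustration, not anything from Wilson or Ridley: the island areas and species counts below are invented numbers, and the fit is done with an off-the-shelf routine in Python. The point to notice is that refitting after merely re-expressing the areas in different units leaves z essentially unchanged but changes the numerical value of C, which is exactly the dimensional arbitrariness my father was objecting to.

# Illustrative only: invented island areas and species counts, not real survey data.
import numpy as np
from scipy.optimize import curve_fit

def species_area(A, C, z):
    return C * A ** z   # S = C·A^z: a pure power-law curve fit, no mechanism implied

areas_km2 = np.array([1.0, 10.0, 100.0, 1000.0, 10000.0])   # hypothetical island areas
species = np.array([12.0, 22.0, 38.0, 70.0, 120.0])         # hypothetical species counts

(C_km2, z_km2), _ = curve_fit(species_area, areas_km2, species, p0=(10.0, 0.3))

# Re-express the very same areas in hectares (1 km^2 = 100 ha) and refit:
# z barely moves, but C shrinks by a factor of roughly 100^z, because the
# implied "dimensions" of C depend on the numerical value of z.
(C_ha, z_ha), _ = curve_fit(species_area, areas_km2 * 100.0, species, p0=(1.0, 0.3))

print(f"areas in km^2:     C = {C_km2:.2f}, z = {z_km2:.3f}")
print(f"areas in hectares: C = {C_ha:.2f}, z = {z_ha:.3f}")

Run it and you get roughly C of 12 and z of about 0.25 in the first case, and the same z but a C of about 4 in the second: identical data, identical curve, yet the "constant" changes with an arbitrary choice of units, which is another way of saying it carries no biological meaning.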

The more I looked into it the more ridiculous it became. The definitive review article on the species–area relation correctly noted that the formula is at heart nothing more than a "sampling phenomenon . . . without a functional relationship." The authors concluded that there was no biological significance to the constants C and z; the fact that z tends to fall in the range of 0.2 to 0.4 when you fit the curve to islands is simply a mathematical coincidence, and the same thing happens when the same formula is used to fit other empirical relationships (e.g., the relation between brain size and body size in mammals).

The much more serious problem, as a few (truly daring) conservation biologists pointed out, is that there is absolutely no reason to think that such an empirical, broad-brush, descriptive formula has any predictive value in the real world at all. As the conservation biologist Vernon Heywood wrote: "The species–area curve (in a mainland situation) is nothing more than a self-evident fact: that as one enlarges an area, it comes to encompass the geographical ranges of more species. The danger comes when this is extrapolated backwards, and it is assumed that by reducing the size of a forest, it will lose species according to the same gradient."

Heywood pointed out many reasons why this is not going to happen: species are not distributed at random, conservation measures are already protecting many critical habitats, many species can adapt to other habitats as the original forests are cut down. Rather than seeing species numbers decline in lockstep with loss of forest area, a more biologically realistic model might predict few if any extinctions until habitats are almost completely destroyed, and even then species numbers would certainly not plunge to zero (as the species–area curve predicts), since many species would be able to survive in the secondary forests that regrow or in other habitats still available.

Even more striking is the fact that the predictions from the formula are wildly incorrect in practice where they can be checked. More than 90 percent of the Atlantic coastal forests of Brazil were cut down, mostly in the 19th century; by the species–area relation that means 50 percent of species should be gone. In fact the actual number of animal extinctions has been zero, even though many of the Brazilian species are highly endemic, found nowhere else in the world.

Similarly, the eastern U.S. forests were reduced to less than half their original extent from colonial times to 1900; but instead of 30 extinctions of birds as predicted by the model, there have been 4 — 2 of which (the Carolina parakeet and the passenger pigeon) were wiped out by hunting, not habitat loss, and the other 2 of which (the ivory-billed woodpecker and Bachman's warbler) were restricted to very specific habitats in the southeast that were destroyed by logging and agriculture. The fact is you could have cut down vastly more forest area in the entire region east of the Rockies without losing a single bird species had we protected the small but critical habitats required by the ivory-billed woodpecker and Bachman's warbler and halted the criminally stupid hunting of the passenger pigeon and Carolina parakeet. (Please note: I am not advocating cutting down vast areas of forest east of the Rockies.)
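
It is worth pausing to see where such predictions come from, because the arithmetic is almost embarrassingly simple. The following back-of-the-envelope sketch is my own illustration, not a calculation from any of the papers mentioned: it assumes a "typical" exponent of z = 0.30 (the middle of the 0.2-to-0.4 range noted above) and uses habitat fractions rounded from the figures in the preceding paragraphs.

# Back-of-the-envelope only: z = 0.30 is an assumed "typical" exponent, and the
# habitat fractions are rounded from the figures cited in the text above.
def fraction_predicted_lost(habitat_fraction_remaining, z=0.30):
    # Backward extrapolation of S = C·A^z: the fraction of species predicted to
    # survive is (area remaining / original area)^z, so the predicted loss is:
    return 1.0 - habitat_fraction_remaining ** z

# Brazil's Atlantic coastal forest: more than 90 percent cut, so ~10 percent remains.
print(f"Atlantic forest: {fraction_predicted_lost(0.10):.0%} of species predicted lost"
      " (observed animal extinctions: zero)")

# Eastern U.S. forests: reduced to roughly half their original extent by 1900.
print(f"Eastern U.S.: {fraction_predicted_lost(0.50):.0%} of species predicted lost"
      " (observed bird extinctions: 4, half of them from hunting)")

With those assumptions the formula predicts a loss of about 50 percent of the Atlantic forest's species and roughly a fifth of the eastern United States' forest species, which is presumably the sort of calculation behind the figure of 30 predicted bird extinctions; the observed numbers, as noted above, are zero and four.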

You might think that with such a record of slapdash science and wildly incorrect predictions, conservation biologists would be a bit sheepish. But there is simply too much invested in a methodology that gives the "right" answer — while nothing else in ecological science (actual field studies, surveys of endangered species, historical evidence) comes close to doing so. A recent peer-reviewed article in a prominent journal referred to the species–area curve reverently as "ecology's oldest law" and even "the periodic table" of ecological science, an assertion more than sufficient to make Mendeleev turn over in his grave. Virtually every article invoking the species–area relation as the basis of a catastrophic prediction of impending species extinction cites the definitive review article I mentioned above — without ever heeding (or even mentioning) its fundamental conclusion regarding the logical and methodological fallacy of using it to make such predictions.

And the circular reasoning/begging of the question that takes place is simply jaw-dropping at times, my favorite example being Thomas Lovejoy's defense (in 2002) of Norman Myers's 1979 prediction of extinction doom (by 2000): "Myers . . . deserves credit for being the first to say that the number was large and for doing so at a time when it was difficult to make more accurate calculations." Another egregious practice in the ecological literature has been to adjust the formula and starting assumptions to minimize the number of predicted extinctions (sometimes by a factor of 10 or 100 or more) when comparing the predictions generated by the formula to the actual evidence in specific cases (thereby "validating" the methodology), but doing the exact opposite when using the formula to produce the alarming worldwide extinction predictions of tens of thousands per year.

There is no scientific dispute that extinctions are occurring, that they are occurring at a rate above the natural level due to human action, and that strenuous efforts are needed to protect critical habitats, to eliminate invasive competitors that threaten species, and to prevent overexploitation.

But the egregiously bad science that is still being invoked to shore up wholly unsubstantiated predictions of catastrophic mass extinctions is only undermining the credibility of environmentalists, and is already causing a dangerous political backlash that has handed ammunition (exactly as in the case of global warming) to those who want to reject any and all evidence of human impacts on the natural environment.

A first step in restoring credibility might be to revive some intellectual opprobrium for those who are flagrantly wrong, even in a good cause.

Friday, September 24, 2010

GOP Chutzpah Awards, cont.

One of the other fun facts in the GOP "Pledge to America" is this chart purporting to show that Bush reduced spending, while Obama is now sending it through the roof with his "Democrat Budget":


Of course as statisticians always remind us, averages can hide a multitude of sins. (Note also the way the vertical scale was finagled on this chart to make 23 percent look like more than twice as much as 19.5 percent.) Here's a more complete picture of each president's stewardship of Federal spending during his term:

*(Obama, 2013 OMB projection)

Jangling juxtapositions

A few years ago, one of our stalwart local conservative state legislators made as a cornerstone of his reelection campaign the promise, "A billion dollars for new roads without raising taxes." (He also devoted much of his time in office to sending plastic fetuses to his fellow representatives, but that's another story, perhaps.)

What was striking about this was not so much the contradiction as the fact that these two apparently mutually exclusive propositions appeared in the very same sentence. Until then politicians had had the tact to spread their proposals for suspending the laws of physics over several sentences at least, so as not to appear to have discarded a tiny remaining shred of respect for the intelligence of the voters. 

Yesterday the GOP congressional leadership doffed their coats and ties and appeared in immaculately pressed shirt sleeves at a local lumberyard to complete the process of revoking cause and effect. I'm sure we all have our favorite jaw-dropping contradictions in the GOP "Pledge to America" unveiled yesterday (pledging to cut off funds for the "costly new health law," which the nonpartisan Congressional Budget Office projects will save $100 billion over the next ten years; promising to put America "on the path" to a balanced budget, while proposing to add close to a trillion dollars to the deficit over the next ten years with tax cuts for the wealthiest 2 percent; vowing "to ensure transparency" while blocking requirements to reveal political donors; and, in the Supreme Chutzpah Award division, pledging to "fight efforts to use a national crisis for political gain").

But the one that left me truly agog was the claim that "in the 1990s . . . a Republican Congress was able to bring the budget into balance and eventual surplus."

Omitted from this explanation was the small fact that in the 1990s (when . . . Bill Clinton was president, as I vaguely recall) every single Republican member of Congress voted against the measure that brought the "budget into balance and eventual surplus": the 1993 Clinton budget act that raised top marginal tax rates to 36 and 39.6 percent for the wealthiest 1 percent of Americans; set the corporate income tax at 35 percent; limited personal exemptions and itemized deductions for the highest income brackets; and got rid of the regressive cap on the amount of income subject to the Medicare tax.

Here's a handy reference chart to a few of the significant contributors to the current national $9 trillion debt:


(Sources for this are the Office of Management and Budget historical tables, the CBO's calculations of the costs of the Iraq war — which it places at around $2 trillion by the time we're finally done, by the way — and the Tax Policy Center's analysis of the Bush tax cuts.)

Another way of looking at the current budget situation is to ask how we got from the $800 billion annual surplus we would have had by now under the policies in place when Clinton left office in 2001 (and which had already produced three surpluses in a row by that point) to the $1.2 trillion annual deficit we now have. That $2 trillion shift from black to red breaks down like this, based on a compilation of CBO data by the New York Times last year:

Wednesday, September 22, 2010

Hoover days are here again

You won't find any of the latter-day heirs of Herbert Hoover's economic and political philosophy uttering the words he himself uttered in one revealing moment of frustration: "The only trouble with capitalism is capitalists. They're too damned greedy."

Hoover in fact also believed in something else we hear precious little of from conservatives today: "social responsibility." Until his failure to deal with the Depression pushed him more and more into self-serving denial, Hoover frequently spoke of the need for the wealthy to bear the burden of taxation as a basic social duty; he favored a "steeply graduated tax on legacies and gifts . . . for the deliberate purpose of disintegrating large fortunes"; he accepted at least in principle the need for some government intervention during economic downturns; he denounced as immoral those who worshiped unrestricted economic freedom, saying "they give no consideration to the fact that property or the power over property can be used to abuse liberty. It can be used to dominate and limit the freedom of others."

What is remarkable about the amnesia of present-day Republican leaders is how they have managed to become a caricature of even the historical caricature of Herbert Hoover. Thus Sen. Mitch McConnell, the Republican minority leader, declared the other day that it is the rich who have been "hit the hardest by this recession" and that they must not be made to "foot the bill for the Democrats' two year adventure in expanded government"; thus the Republican mayor of Colorado Springs rejected Federal jobless assistance for his city explaining of those who had lost their houses and jobs, "some people want a homeless life . . . they really do"; thus GOP Senate candidates Sharron Angle in Nevada and Ron Johnson in Wisconsin stated recently that they oppose extending unemployment benefits because (Angle) "extending unemployment . . . really doesn't help anyone" and (Johnson) unemployment insurance only discourages workers from facing up to the fact that they need to "take the work that's available at the wage rates that's available"; and thus we have the conservative chorus-line of denunciations of government public-works and jobs programs as "socialism" and a threat to "liberty" and of the White House's proposal for allowing tax cuts on the wealthiest two percent to expire as "class warfare."

All are pitch-perfect echoes of Hoover's more famous examples of legendary callousness to the suffering of those hit by the Depression. As Hoover became more and more embattled, he became more doctrinaire in insisting on a balanced budget (even though he clearly knew it was the worst prescription in a recession), denounced "raids on the federal treasury" to pay for unemployment relief, attacked FDR in the 1932 campaign as a Russian-style communist and a promoter of "class antagonisms" and the Democrats as "the party of the mob," and insisted repeatedly that the Depression was over and told reporters "no one is actually starving . . . The hobos, for example, are better fed than they have ever been. One hobo in New York got ten meals in one day." (Years later, in his memoirs, Hoover explained that during the Depression "many persons left their jobs for the more profitable one of selling apples.")

The only reason present-day Republicans are able to indulge in Herbert Hooveresque talk is that they don't have to bear the consequences of Herbert Hooveresque policies, and they (alas correctly) calculate that historical memory is insufficient for most people to be aware of what happened last time we tried them. It is a kind of political moral hazard: it's easy to take reckless and grandstandingly ideological positions (such as opposing an extension of unemployment benefits, opposing financial regulation, opposing federal aid to the states to avoid laying off teachers and police, opposing deficit spending in a recession) if you know the other party will save you from the actual consequences of those irresponsible positions.

But had the nation experienced — as it did from 1929 to 1932 — three years of suffering at the hands of conservative economic prescriptions, today's Republicans simply could not get away with it. By 1932 the nation was without a shred of doubt as to what the real consequences in a recession were of limited government, balanced budgets, "states' rights," hands-off laissez-faire, and tax policies favoring the rich. Unemployment was approaching 25 percent; hundreds of thousands of children were out of school because localities had no money to pay teachers (though in Chicago teachers worked without pay to keep the schools open) while Hoover still adamantly blocked direct federal aid to individuals and municipalities (though his Reconstruction Finance Corporation gave large loans to banks); millions of Americans literally fended off starvation by prowling through restaurant refuse bins for rotten scraps of food, or gleaning farm fields for discarded vegetables, or standing for hours waiting for a handout at the inadequate soup kitchens and bread lines. Desperate farmers blocked highways and mobbed foreclosure auctions, threatening to hang court officials if they tried to go through with the sale. In January 1933 the president of the conservative American Farm Bureau told the Senate Agriculture Committee, "Unless something is done for the American farmer we will have a revolution in the countryside in less than twelve months."

That such radicalization is not abroad in the land in the Bush recession is ironically a product of the very policies Republicans affect to denounce so bitterly. The Democrats, that is, have done just enough to make the world safe for laissez-faire ideologues, but not enough to command the political loyalties of those who would be the first to suffer if those ideologues ever again have the opportunity to match actions to their words.

--

For those who would like to refresh their memories about what actually happened the last time we tried out the policies now being espoused by the GOP leadership, I can recommend no better place to start than historian Robert McElvaine's superb book The Great Depression, an exemplary blend of analytical and narrative history that brings the era to life along with clear explanations.

Monday, September 20, 2010

To the barricades in Oklahoma

When it comes to manning the bulwarks of liberty, we can count on the Oklahoma state legislature to be the first to answer the call whenever America is threatened by enemies domestic or foreign, real or imaginary.

As part of the modestly titled "Save Our State" constitutional amendment passed by the legislature and on the ballot in November, state courts would be barred from "considering or using" international law or Sharia law. The initiative, explained its chief sponsor, Rep. Rex Duncan (R), "will constitute a preemptive strike against Sharia law coming to Oklahoma . . . While Oklahoma is still able to defend itself against this sort of hideous invasion, we should do so." Duncan said it was also a "preemptive strike" against "liberal judges" who want to "undermine [the] founding principles of America."

Not to be outdone, Newt "America At Risk" Gingrich brought the crowd at the "Value Voters Summit" to its feet over the weekend by proposing a federal law to the same effect.

Most Americans were probably unaware that the Oklahoma courts were about to be invaded by legions of fatwa-issuing mullahs craftily disguised as liberals. But the larger conservative propaganda line about "unelected" "activist" judges who try to write the law rather than just "enforce" it — which lies behind such fantasies — has been far more successful in penetrating the American consciousness, even though it constitutes just as great a distortion of legal and historical reality.

The fact is that from the founding of our nation, judges have always made the law and not merely enforced it; they have regularly cited foreign and international precedents; and they have frequently "considered or used" international law — also known as "the law of nations" — in deciding admiralty cases, disputes involving foreign countries or citizens of foreign countries, and the laws of war.

Once again, it is striking how those who talk the loudest about the good old days of the Founders and original constitutional principles are invariably the least informed about any actual history. During the debates over the Constitution, it was (interestingly) conservatives who were the most adamantly in favor of insulating judges from the passions of popular democracy, the most adamantly in favor of judge-made law, and the most adamant about curbing the power of state legislatures to interfere with the English common law that America's courts had inherited.

American and English common law was in its roots entirely judge-made law: it was an accumulation of basic legal precepts and precedents, and to James Madison and other authors of the Constitution it was one of the cornerstones of American liberty — and one that was suddenly being threatened by an "excess of democracy" as state legislatures passed a tidal wave of ill-considered statutes that abrogated contracts, catered to special interests, and created confusion and uncertainty as new laws were passed and repealed with dizzying rapidity.

Conservatives love to repeat the line that Chief Justice John Roberts used in his confirmation hearing  about the role of judges being properly confined to "calling balls and strikes," but that has never been the role of judges under the American-English system of common law (as Roberts, a graduate of Harvard Law School, surely knew). Virtually every court decision involves an examination of what is called "case law": previous precedents established by previous "activist" judges "making" the law.

When I was researching my new book on the War of 1812, I read a number of fascinating federal court rulings concerning trading with the enemy, the validity of seizures of merchant vessels, and other like issues raised by the war against Britain on the high seas. Many dealt with situations that had not before arisen in America, and all extensively examined not just English case law but the rulings of admiralty courts throughout the civilized world (even — horrors — France) in an attempt to ascertain the basic principles of the "law of nations" that should apply.

The claim that citing foreign law "undermines the Founding principles of America" would certainly be news to the Founders.

The Oklahoma initiative is remarkable in that it specifically defines international law as including treaties — which that Constitution that the tea partyniks like to wave around explicitly defines as the supreme law of the land (" . . . all Treaties made, or which shall be made, under the Authority of the United States, shall be the supreme Law of the Land; and the Judges in every State shall be bound thereby, any Thing in the Constitution or Laws of any State to the Contrary notwithstanding").

P.S. It is a historical fact that conservative alarm over "activist" "liberal" judges arose for the first time in the 1950s, when Federal courts began ordering the desegregation of schools.


---

Please check out my newly revamped author website, which along with information about my books includes a new NEWS AND REVIEWS section which I will keep updated with my published articles and reviews and other happenings.

Saturday, September 18, 2010

It's a smear

This year's political campaigns have produced a new definition of the word "smear." It now apparently means something true but unflattering. This definition seems particularly appealing to the crop of inexperienced "tea party" candidates who have suddenly found themselves in the political spotlight this year and are probably regretting ever having used that line about "we need to run our government the way we run our businesses and personal lives . . ."

Thus the campaign manager of the New York Republican candidate for governor, Carl Paladino, denounced it as a "smear" by the "liberal elite" when reports surfaced that he had a habit of sending revolting pornographic videos and racist "jokes" to a long list of e-mail friends; likewise the camps of other far-out GOP candidates have termed as "smears" the (completely true) facts that have emerged about their promotion of a pornographic website (Ben Quayle), their trail of unpaid bills, a mortgage default, and an IRS lien (Christine O'Donnell), and the various simply bizarre things they themselves were caught saying on camera just a few months earlier (Sharron Angle).

Once again Ambrose Bierce comes to the rescue to explain this:

Defame, v.t. To lie about another. To tell the truth about another.

Thursday, September 16, 2010

Nu, let us go then

A small detour on our trudge through life's vale of tears  . . .

I have been diverting myself lately from the travails of national politics and the legions of wrong-thing sayers abroad in our land with bits and pieces from the new Oxford Book of Parodies, a wonderful reminder of the devastating power of this ancient art form in the hands of a professional.

Great parodies are not just funny: they can kill. The best exemplars let their victims practically destroy themselves: one deft poker-faced flick of the wrist, and the target is left sprawling helplessly across his own pretentiousness, unoriginality, and pedantry. The more "high-minded," earnest, and self-regarding the original, the funnier and deadlier the resulting pratfall.

One of my favorite specimens (which Paul Fussell introduced me to in his superb book on World War II, Wartime) is Edmund Wilson's venomous demolition of the very self-regarding Archibald MacLeish. As Librarian of Congress, MacLeish was given to delivering lectures praising American literary earnestness and decrying the satire and wit of writers like H. L. Mencken (who served only "to poison the belief of the people in themselves"). MacLeish himself had written a very earnest and high-minded poem entitled The Hamlet of A. MacLeish, a monologue in which the author casts himself as a sensitive soul, spiritually akin to Shakespeare's tragic hero, full of pathos and inner torment. Executed slavishly in the style of Coleridge's The Rime of the Ancient Mariner, MacLeish's poem goes on for about 300 lines and even includes the italicized marginal glosses, à la Coleridge's original, that ostensibly explain the poem's meaning and moral ("He is reproved his melancholy by the uncle-father.")

In Wilson's version (The Omelet of A. MacLeish), MacLeish's high-sounding language is laid bare for the pompous, empty, copy-cat, middle-brow hackery it is:

                                        Anabase and The Waste Land:
These and the Cantos of Pound: O how they came pat!
Nimble at other men's arts how I picked up the trick of it:
Rode it reposed on it drifted on it: passing
Shores that lay dim in clear air: and the cries of affliction
Suave in somniferous rhythms: there was rain and there was moons:
Leaves falling and all of a flawless and hollow felicity . . .

The brilliant part, though, is Wilson's parodies of the marginal glosses:

MacLeish breaks 
an egg for his 
omelet.

He puts plovers' 
eggs and truffles 
into his omelet.

He is doomed to go 
on doctoring his 
omelet.

The omelet 
becomes a 
national institution . . .

Speaking of recipes, the real treasure in the Oxford Book is "Lamb with Dill Sauce à la Raymond Chandler," which comes from Mark Crick's Kafka's Soup: A Complete History of World Literature in 14 Recipes:
. . . I needed a table at Maxim's, a hundred bucks and a gorgeous blonde; what I had was a leg of lamb and no clues. I took hold of the joint. It felt cold and damp, like a coroner's handshake. I took out a knife and cut the lamb into pieces. Feeling the blade in my hand I sliced an onion, and before I knew what I was doing a carrot lay in pieces on the slab. None of them moved. . . .
     In this town the grease always rises to the top, so I strained the juice and skimmed off the fat. . . . I put the squeeze on a lemon and it soon juiced. It was easy. It was much too easy . . .
The Oxford Book contains but a passing mention of another work of parodical genius which I gather has been well known in certain circles for some time, but which was also news to me. I need to preface this by saying that "The Love Song of J. Alfred Prufrock" is one of two poems I ever committed completely to memory, something I did to distract myself during a less than completely happy year I spent in graduate school attempting to grasp quantum mechanics, differential equations, and girls (read that however you want); it's still one of my favorite poems of all time.

That said, I can well appreciate why Robert Pinsky would cite "Der shir hashirim fun Mendl Pumshtok" as the finest poem written by an American in the twentieth century. "Pumshtok" is a glorious Yiddish send-up of "Prufrock" composed in the 1930s by Isaac Rosenfeld and Saul Bellow. Though they never published it, the poem was passed on enthusiastically in a sort of oral tradition by friends who had memorized at least a part of it. I was able to find the Yiddish transliteration in several places on the Web. ("Shir hashirim" is the Hebrew for "The Song of Songs"):

Nu-zhe, kum-zhe, ikh un du,
Ven der ovnt shteyt uf kegn dem himl
Vi a leymener goylm af tishebov.
Lomir geyn zikh, durkh geselekh vos dreyen zikh
Vi di bord fun dem rov.

Oyf der vant fun dem koshern restorant
Hengt a shmutsiker betgevant
Un vantsn tantsn karahod. Es geht a geroykh
Fun gefiltefish un nase sokn.
Oy, Bashe freg nisht keyn kasha, a dayge dir
Lomir oyfenin di tir
In tsimer ve di vaybere senen
Redt men fun Marx un Lenin.

Ikh ver alt...ikh ver alt...
Un der pupik vert mir kalt.
Zol ikh oykemen di hor,
Meg ikh oyfesn a floym?
Ikh vel tskatsheven di hoyzn
Un shpatsirn bay dem yam.
Ikh vel hern di yam-moyden zingen khad gadyo
Ikh vel zey entfern, Borukh-habo.

An English translation was harder to come by, and of course nothing can quite do justice to the Yiddish "original," but here is an approximation, with some further explanatory notes below:

Nu, then, come, then, me and you,
When the evening stands beneath the sky
Like a clay golem on Tisha B'av.
Let us go, through streets that twist themselves
Like a rabbi's beard.

On the wall of the kosher restaurant
Hangs dirty bedding
And bedbugs dance in circles. There is a stink
Of gefilte fish and wet socks.
Oy, Bashe, don't ask questions, why bother?
Let me open the door
In the room where the wives are
Speaking of Marx and Lenin.

I grow old . . . I grow old . . .
And my navel grows cold.
Should I comb my hair?
May I eat a prune?
I will put on pants
And walk by the sea.
I will hear the sea-maidens sing Chad Gadya,
I shall answer them: Baruch Ha-ba.

The genius of this lies not just in Rosenfeld's and Bellow's cultural one-upmanship of producing deliriously Yiddish equivalents of Eliot's waspy images of tea and neckties and white flannel trousers, but in exploding the entire air of aestheticized pathos that surrounds "Prufrock" — exposing, thanks to their commonplace Yiddishisms, that there is absolutely nothing so special about Prufrock's sense of tragic, existential alienation other than the linguistic polish Eliot gives it.

Prufrock's ethereal "Let us go then, you and I" becomes simple nagging ("Nu, then, come, then"); Eliot's "patient etherised upon a table" becomes "a clay golem on Tisha B'Av" — which is actually about a triple-play, as a "leymener goylm" doesn't really refer to the golem that's the supernatural creature of Jewish folklore but is rather a vivid Yiddish metaphor for a clod, a person who's a hunk of wood, and Tisha B'Av is the Jewish day of mourning for the destruction of the Temple in Jerusalem; as Ruth Wisse observes, it's as perfect "an image of national impotence as one can imagine." It's Pumshtok saying to Prufrock: "You think you've got troubles?"

And of course, "May I eat a prune?" is absolutely no different in substance or meaning or intent from Eliot's famous "Do I dare to eat a peach?"; likewise there's nothing to "Oh, do not ask, 'What is it?" Let us go and make our visit" that isn't completely covered by the plain language of "Oy, don't ask questions, why bother?" In each case, as "Pumshtock" mercilessly reveals, Eliot's poem owes its glow of poetic profundity simply to a patina of refined language, which the Yiddish version unceremoniously strips bare, revealing its absurdly ordinary core. Baruch Ha-ba!





Wednesday, September 15, 2010

Tea Party primary victories in perspective

A picture is worth a thousand words of punditry . . .

Anti-incumbent fervor update

Only in America can a millionaire businessman invest $3 million of his personal fortune to win a primary election and claim with a straight face to be a "grassroots" candidate storming the citadel of the "ruling class" and the "elite."

Inevitably, the victories of two remarkably sketchy Tea Party–backed candidates in the Republican primaries in New York and Delaware yesterday are being cited as further proof of this year's "anti-incumbent" tide; what they are really proof of is how far to the fringe the Grand Old Party has shifted — and how much that is being exploited by the same old cut-the-taxes-on-the-wealthiest crowd, this time in the guise of a populist upwelling tricked out with lots of psychologically self-vindicating talk about "anger."

Polling data undeniably point to significant gains for Republicans in the upcoming midterm elections; the most recent forecast by FiveThirtyEight predicts the Republicans will pick up 57 seats in the House (to gain a 225–210 majority) and 7 in the Senate (cutting the Democratic majority to 52–48).

Such midterm gains by the party which does not hold the presidency are not exactly unprecedented. In 1994 Republicans gained 54 House seats; in 1974 Democrats gained 49; in 2006 Democrats gained 31.

There are many good explanations that can be offered for a GOP backlash this November. There is also one remarkably lazy explanation: that it reflects "an anti-incumbent sentiment," "an anti-incumbent fervor," "an anti-incumbent tide," an "anti-incumbent wave," an "anti-incumbent mood," an "anti-incumbent attitude," and "anti-incumbent disgust" . . . phrases that appeared (1140 times) in the last 30 days in the New York Times (chosen only because it is completely representative of this brand of analysis).

Here is the updated chart on the number of incumbents running for re-election who were defeated in primaries:


Stephen Budiansky; basic data: centerforpolitics.org and New York Times

Tuesday, September 14, 2010

Doesn't sound like a winner . . .

Why the Grand Old Party is determined to fall on the sword of tax cuts for the very rich might make an interesting subject for anthropologists engaged in the study and explanation of tribal beliefs.

A new poll out today from the Pew Research Center finds that only 29 percent of Americans favor keeping the Bush tax cuts for the wealthiest 2 percent. Although that's perhaps 27 percent more than one might expect, it's still not exactly the stuff majorities are made of. (Interestingly, an equal number favor repealing all the Bush tax cuts for all income brackets.)

Not even a majority of Republicans want to keep those tax cuts on the top 2 percent. Given that it would cost $700 billion over ten years to do so — and hand more than $100,000 a year apiece to everyone earning $1,000,000 or more — it's perhaps not surprising that this is not the most popular idea abroad in a land in the midst of the worst economic downturn since the Great Depression.

Another interesting finding from the poll: while a plurality (45–38) disapproves of the new health care law, only 32 percent favor repealing it.

Income distribution and economic growth

I was taken to task the other day by a reader for saying that the growing income inequality of the 1920s was the "overwhelming cause" of the Great Depression. I should have said "the fundamental cause" or the "single most important cause," but I otherwise stand by the point. The widening gap between what the economy could produce (as worker productivity soared) and what it could consume (as workers' wages stagnated) caused a cascading crash in demand when the crisis came to a head, signaled by the stock market crash of 1929.

The massive upward redistribution of wealth and income produced by the productivity–wage gap of the 1920s was further accelerated by tax policies that favored the rich. The marginal income tax rate on incomes of $1,000,000 or more was cut from 73% to 24%.

As Robert Reich notes in his new book Aftershock, too many of today's politicians have forgotten "the larger lesson of the 1930s: that when the distribution of income gets too far out of whack, the economy needs to be reorganized so the broad middle class has enough buying power to rejuvenate the economy over the longer term.” Fundamentally, that means ensuring that workers once again reap the benefits of growing productivity.

But tax policy matters, too: When wealth is increasingly shifted to the top brackets, the economy as a whole increasingly lacks the spending power needed to keep it on a healthy course. Middle class and poor earners always spend a much higher percentage of their incomes than do millionaires and billionaires. Taxing a higher proportion of top incomes to stimulate demand — through government purchases of goods and services (for example, infrastructure programs) and by increasing middle class purchasing power either directly by reducing taxes on lower brackets or indirectly through public goods that the middle classes could not otherwise afford to pay for themselves (universal education and health care, for example) — is one way to restore the balance in an increasingly unbalanced economy.

When the Great Depression hit in 1929, the Republican establishment and Wall Street leaders were unanimous both in pooh-poohing its seriousness and in insisting that the best policy was to "let nature take her course" (in the words of New York Stock Exchange President Richard Whitney). Treasury Secretary Andrew Mellon's prescription for recovery was, "Liquidate labor, liquidate stocks, liquidate the farmers, liquidate real estate." Mellon told Hoover that the Depression "was not altogether a bad thing . . . people will work harder, live a more moral life . . . and enterprising people will pick up the wrecks from less competent people." Hoover steadfastly opposed an increase in government spending as a response to the Depression, insisting (in denouncing a relief bill as "an unexampled raid on the public treasury") that "we cannot thus squander ourselves into prosperity."

It is scarcely short of incredible that Republicans in the Congress of the year 2010 are oblivious to this history, as they call for retaining tax cuts on the wealthiest 2 percent, as they denounce emergency government spending in a recession as "bailouts," as they oppose unemployment relief extensions, as they have tried even to make "stimulus" a dirty word.

But as Robert Reich points out, even Democrats are forgetting the "larger lesson" of the 1920s, which is that gross income inequalities threaten the long-term stability of the economy. It is not even a question of "economic justice" or social fairness that drives this consideration; it is, as I said the other day, basic economic realities.

Here by the way is a chart of the percentage of total personal income earned by the top 1 percent, courtesy of economist Emmanuel Saez (much more detailed data available on his website). It is unlikely to be a coincidence that the last time we saw such inequality was the last time our country was plunged into a serious depression:


Monday, September 13, 2010

Tyrannical health care

Virginia attorney general Kenneth "Ken" Cuccinelli took time out from his crime-fighting crusade against climate change scientists, nondiscrimination protections for homosexuals, and the bare-breasted Roman goddess on the state seal to attend a Tea Party rally in Washington yesterday. He told the crowd of a few thousand (brightened by the usual interesting costumes) that “King George III and the parliament of Great Britain that we rebelled against respected the liberty of the colonists of America more than the Congress and the president of the United States of America,” and called the Obama health care law “the greatest erosion of liberty" in his lifetime.

Although I am not the chief law enforcement officer of Virginia, and do not even hold a law degree from Pat Robertson's Regent University in Virginia Beach (dedicated to educating "Christian leaders who will change the world for Christ"), I can offhand think of a few other candidates for the "greatest erosions of liberty" in my lifetime:

The Mississippi State Sovereignty Commission, which from 1956 to 1977 employed private detectives and paid informers to spy on more than 80,000 citizens suspected of supporting integration; funneled state funds to a national campaign to prevent passage of the 1964 Civil Rights Act; and secretly instructed county election registrars on how to prevent African Americans from registering to vote.

Operation Chaos and Project Minaret, CIA and NSA operations in the 1960s and 1970s that illegally tapped the phones and opened the mail of thousands of United States citizens who opposed the Vietnam War or engaged in other "subversive" political activities.

The Plumbers Unit that, on direct orders of the President of the United States, wiretapped journalists, burglarized and planted listening devices in the offices of political opponents (including the headquarters of the Democratic National Committee), and ordered the IRS to harass and investigate political "enemies."

The Bush Administration's secretly ordered "terrorist surveillance program" that conducted mass surveillance of domestic communications in direct contravention of the Foreign Intelligence Surveillance Act, which makes it a felony offense for any official acting under the color of law to monitor domestic communications without a court-approved warrant.

Bush v. Gore, in which five unelected Federal judges determined the outcome of a democratic election by ordering a halt to a recount being conducted in accordance with state law — while ruling that their decision created no legal precedent in future cases.

Sunday, September 12, 2010

More GOP class notes

Recent activities of the Grand Old Party. Republican Party activist Steve May has continued his inspiring work among the homeless in Arizona, giving them a new start in life as Green Party candidates who, May says, will siphon votes away from the Democratic ticket in November. Congratulations to Christine O'Donnell! The Tea Party-endorsed challenger running in the Republican U.S. Senate contest in Delaware received her undergraduate degree from Fairleigh Dickinson University last week, 17 years after she said she had graduated. Bristol Palin has decided to drop out of the nursing assistant program she enrolled in to concentrate full-time on her career as an unwed teenage mother. This summer she played the part of an unwed teenage mother named Bristol on ABC's The Secret Life of the American Teenager and this past week she received $14,000 for a speech warning young women not to ruin their careers by becoming unwed teenage mothers. From Carl Paladino's friends comes the sad news that he's had to stop e-mailing his entertaining racial jokes and bestiality videos now that he's a candidate in the Republican primary for governor of New York. But the campaign hasn't completely stolen Carl's sense of wacky good fun! Just this weekend he sent voters a mailing impregnated with the odor of rotting garbage to make the point that state government "stinks." God, who has been funding Joe Miller's campaign in Alaska for U.S. Senate, could use some help. As Joe's campaign explained in an invitation to an open house to meet the candidate: "So far the Lord has always provided the money in this grass roots campaign, and this time God is going to use you to provide! . . .  Bring Check Book or Credit Card. Chips will be provided."

Friday, September 10, 2010

Great moments in bookburning history

June 16, 1953:  Secretary of State John Foster Dulles admitted that a small number of books had literally been burned by staff of American libraries overseas after they were ordered to rid their shelves of works by communist authors. An embarrassed President Eisenhower, speaking at the Dartmouth College commencement, urged the graduates not to "join the bookburners" but to read and learn what communism really is and fight it with full understanding. The banned authors included NAACP president Walter White, New York Herald Tribune chief Washington correspondent Bert Andrews, and detective novelist Dashiell Hammett.

November 16, 1973:  Residents of Drake, N.D. (pop. 650) were dumbfounded at the criticism leveled against their town after school employees burned 32 paperback copies of Slaughterhouse-Five on orders from the school board, which had concluded that the book was "unsuitable for 15-year-old minds." A school spokesman said the board was also examining books by John Steinbeck, Ernest Hemingway, and William Faulkner that had been assigned by the same teacher, and if they were found unsuitable they would be destroyed, too.

March 21, 1958:  Department of Agriculture officials were reported to have burned 2,500 copies of a recent farm census report that implicitly criticized Republican farm policies. Assistant Secretary of Agriculture Don Paarlberg admitted he had ordered the booklets destroyed because they were "deemed to be statistically not representative in certain respects."

November 24, 1952:  A preacher in Rocky Mount, N.C., announced he would burn a copy of the Revised Standard Version of the Bible to protest the substitution of "young woman" for "virgin" and other changes from the King James version. He also charged that the National Council of Churches of Christ was deriving an "unmoral profit" from royalties on the book. "I think their price is a little steep anyhow," he added.

Thursday, September 9, 2010

GOP revisionist history (cont.)

President Obama's declaration yesterday that he would not accept any extension of the Bush tax cuts for the top 2% of income earners was not just a bit of ceremonial populism: it was a step toward addressing the fundamental, structural precariousness in the economy that helped plunge us into the Bush recession.

Republicans have been coming up with some imaginative history lessons lately about the Great Depression and its causes — here's a howler from Rep. Michele Bachmann (R-Mars). Although many factors exacerbated the Great Depression — highly leveraged financial markets, overextended consumer credit (sound familiar?) — its overwhelming cause was the growing income inequality of the Roaring Twenties and the resulting imbalance between what the economy could produce and what wage earners could afford to buy. From 1923 to 1929, manufacturing output per worker hour — productivity — soared by 32 percent, but wages grew by only 8 percent over the same period. Put simply, the economy was producing more and more stuff that wage earners lacked the money to buy. (For an excellent historical account, see Robert S. McElvaine's The Great Depression.) The shift of income to the wealthy was accelerated by the Republican tax policies of the 1920s (sound familiar?) that slashed the taxes paid by millionaires by more than two-thirds.
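For the quantitatively inclined, here is that imbalance spelled out in a minimal Python sketch; it uses only the two figures just cited and holds prices constant, which is a simplifying assumption of mine rather than a claim about the underlying data:

```python
# Back-of-the-envelope illustration using only the two figures cited above:
# from 1923 to 1929, output per worker-hour rose 32 percent; wages rose 8 percent.

productivity_growth = 0.32   # gain in manufacturing output per worker-hour
wage_growth = 0.08           # gain in wages over the same period

output_index = 100 * (1 + productivity_growth)   # 1929 output, indexed to 1923 = 100
wage_index = 100 * (1 + wage_growth)             # 1929 wages, indexed to 1923 = 100

# Share of output that an hour's wages could buy in 1929, relative to 1923
# (prices held constant -- a simplifying assumption).
purchasing_share = wage_index / output_index
print(f"Wages bought about {purchasing_share:.0%} of what workers produced, "
      f"relative to 1923")
# -> roughly 82%, i.e. an 18 percent shortfall in wage-financed demand
```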

During the 1950s wages grew in step with productivity, and the result was one of the greatest sustained periods of prosperity and growth in U.S. history. But the gap has again been widening precipitously since the 1970s, as shown on this chart I made using data from the Bureau of Labor Statistics:

[Chart: productivity vs. real wages, index 1947 = 100 (BLS data)]
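For anyone curious how such a chart is put together, "index 1947=100" simply means that each annual series is rescaled so its 1947 value equals 100. Here is a minimal sketch of that rebasing step; the numbers are placeholders of my own, not the actual BLS series:

```python
# Minimal sketch of rebasing annual series to a common index (1947 = 100).
# The values below are placeholders for illustration, not the actual BLS data.

def rebase(series: dict[int, float], base_year: int = 1947) -> dict[int, float]:
    """Rescale a {year: value} series so that the base year equals 100."""
    base = series[base_year]
    return {year: 100 * value / base for year, value in series.items()}

productivity = rebase({1947: 21.3, 1973: 45.0, 2009: 96.4})   # placeholder values
real_wages   = rebase({1947: 45.1, 1973: 83.0, 2009: 89.5})   # placeholder values

for year in sorted(productivity):
    gap = productivity[year] - real_wages[year]
    print(f"{year}: productivity {productivity[year]:6.1f}, "
          f"real wages {real_wages[year]:6.1f}, gap {gap:+6.1f}")
```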

Some of the increased revenues that result from productivity gains justifiably go to R&D and capital investments in new technology. But as the following chart (which uses the BLS data plus additional data from the Commerce Department's Bureau of Economic Analysis) shows, more and more of the returns generated by increased productivity during the Bush years went simply to feed corporate profits, which shot ahead of wages at a rate scarcely equaled in the Coolidge and Hoover years:

[Chart: corporate profits and wages, BLS and BEA data]

Reversing this unprecedented upward redistribution of income isn't populist pandering nor is it "socialism": it's basic economics.

Tuesday, September 7, 2010

"You may be right"

Outraged readers who wrote to H. L. Mencken would receive in reply a preprinted card:

   Dear Sir or Madam:
   You may be right.
   Yours sincerely,
   H. L. Mencken

(I have also seen this story attributed to Mark Twain, Alexander Woollcott, Edward R. Murrow, and several other controversialists but it rings truest for Mencken.)

Mencken was serenely unconcerned by criticism and free of the thin-skinned compulsion to defend himself; after all, his whole raison d'être was to be provocative, and he was hardly going to take umbrage that he had succeeded in provoking people. As he wrote a friend, his brand of opinionated journalism "must be done boldly, and, in order to get a crowd, a bit cruelly."

But like most people who have had the experience of covering politics and public affairs up close, Mencken also had a finely honed sense of what his biographer William Manchester perfectly captured in the phrase "tolerant misanthropy." An early lesson I learned in journalism was the limitless capacity of the human mind for sincere self-delusion. There are of course totally cynical charlatans, frauds, demagogues, and quacks, but they are actually exceedingly rare. In my time as a science writer I became quite interested in the phenomenon of medical quackery, and I interviewed a good many quacks who peddled worthless, dangerous, and unconscionable "alternative" treatments in crassly lucrative schemes. But I never encountered one who I ever doubted was absolutely and sincerely convinced of the truth and virtuousness of his metier.

Being in a business whose job it is to crank out facile arguments on complex matters also tends to give one a certain detached view of the whole process of public controversy — and of how easy it is to convince oneself of just about anything. Leonard Woolf described this perfectly in a wonderful passage about Richard Crossman, whom he called "the best journalist I have ever known":
His mind was extraordinarily fertile of ideas; it teemed with them, and if you dipped into it, you brought up a shoal of brilliant, glittering ideas, like the shoal of shining fish that one sometimes sees in a net pulled out of the sea by a fisherman. It is true that Dick's ideas were almost as kaleidoscopic in colour and as slippery to keep a hold on as the mackerel for, having written a glittering and devastating article one week, he would turn up the following Monday with the most brilliant idea for the most brilliant article contradicting his most brilliant article of the previous Monday. And on each of the two Mondays, Dick, I am sure, believed passionately in each of the two ideas.
My three years at Nature left me painfully aware that scientists are about the worst people on earth when it comes to confusing their political inclinations with objective fact — and absolutely the worst in the concomitant certainty that one's opponents must be liars, frauds, or corruptly motivated, since (obviously) no honest person could possibly have reached a contrary conclusion through objective reasoning. As absurd and unwieldy as democracy is in handling scientific matters, I found myself constantly thankful that scientists weren't running things, mainly because of this supreme intolerance for differing political conclusions.

Many of the standard laments about the Internet are, I have always felt, overblown; I continue to be amazed not only that, thanks to its marvels, I now know for the first time in my life how to successfully slow-cook ribs over a charcoal fire, exactly where in the neck to give a horse an injection, and the complete lyrics to Spike Jones's "He Broke My Heart in Three Places (Seattle, Chicago and New York)," but also that far from isolating and separating people the Internet has made it possible to share information and make connections with people and ideas that never would have happened before. And as I mentioned the other day, there is much in the old business of print journalism that no sane person would ever regret.

And of course there have always been zealots; you can't blame the Internet for that. I have always loved the observation that Hugh Trenchard (the first Marshal of the Royal Air Force and not exactly a model of open-mindedness himself) made about Billy Mitchell, the great American evangelist for air power of the 1920s: "He tried to convert his enemies by killing them first."

But one thing I do lament about the Internet is the way it has tended to amplify self-righteousness. I am sure this is old hat to most readers, but the other day I came across for the first time this classic xkcd cartoon:

[xkcd cartoon]

The cartoon was cited the other day in a very interesting and moving article by Alan Jacobs about the vituperative Internet battles between liberals and conservatives in the Anglican church, and his own futile attempts to keep the discussion on substance rather than character assassination. The overdeveloped sense of moral certainty on each side — and the atrophied sense of charity and humility — left Jacobs saddened and frustrated.

His feelings echoed my reaction to the Internet-frothed bloodlust of those who have been cheering on the appalling and chilling efforts of my state's highly partisan attorney general to launch a criminal fraud investigation of the former University of Virginia climate researcher accused of cherrypicking data to make the case for global warming. The fact of the matter is that scientists, no less than lawyers, politicians, theologians, policemen, historians, criminologists, education theorists, and brilliant-opinion writing journalists, get things wrong all the time. And one lesson I took away along with my enduring gratitude that scientists aren't running things is the conviction that we need to be extraordinarily reluctant for the same reasons to enlist the power of the state to arbitrate, much less punish, scientific disagreements.

My comparison to Lomborg was precisely to make this point: his opponents were not satisfied with refuting him; they wanted to destroy him, and sought a quasi-legal process to do so. And that is precisely the same mindset at work in those who now want to turn a scientific dispute over climate evidence into criminal fraud. Fraud is taking the money and — instead of buying computers, writing software, searching the literature, hiring graduate students, and doing the work — spending it on a blonde and a trip to the Riviera. (Do people take blondes to the Riviera these days? I admit I'm not as up on this as I probably should be.) If fraud is overstating the evidence, using data selectively, or employing methodologies that don't stand up to later scrutiny, then half the scientists in the world will be in jail.

Nothing so aroused Mencken as the illiberalism (an old-fashioned word) of the nervous nellies who demanded the censorship, muzzling, or punishment of contrary views — who were so afraid of engaging the battle of ideas on its rightful and honorable ground that they had to call in the Truth Police to do their dirty work for them. And as Leonard Woolf observed, there are still two kinds of people in the world: those who in the "depths of their brain, heart, and intestines agree with Pericles" and those who "consciously or unconsciously accept the political postulates of Xerxes, Sparta, Louis XIV, Charles I, Queen Victoria, and all modern authoritarians." Woolf has always been one of my heroes for his ability to combine a fierce intellect and a passionate drive to carry on arguments over everything from politics to theology to literature to croquet with an unfailing humanity and liberalism. Even in the most heated disputes he could cite Voltaire's observation, "We may both be wrong"; he always knew that the real evil was the authoritarian mindset that viewed one's opponents' views as not merely mistaken, but impermissible.

Monday, September 6, 2010

Chimpanzees, cabbages, and DNA

For years I've been irked by the assertion that appears with tedious certainty in every popular article about animal intelligence: chimpanzees, we will be dramatically informed at some point, share 98 or 95 or 98.6 percent of their DNA with us — the implication being that they must therefore be 98 or 95 or 98.6 percent just like us in cognitive ability.

Recently re-reading Clive Wynne's book Do Animals Think?, I came across a good explanation of why this is a meaningless yardstick. By the same methodology, one would conclude that all humans are 99.9999 percent alike in brainpower — a conclusion that anyone who has tuned in to talk radio would find hard to swallow. And for that matter a completely random collection of DNA base pairs ought to possess 25 percent of our smarts.
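The 25 percent figure, by the way, is exactly what you would expect by chance: there are only four bases, so a purely random sequence matches any reference sequence at about one position in four. A toy simulation (purely illustrative; this is not how real genomic comparisons are done) makes the point:

```python
# Toy simulation: a completely random DNA sequence "shares" about 25 percent
# of its positions with any reference sequence, because there are only four bases.
import random

random.seed(1)
BASES = "ACGT"
N = 100_000

reference = [random.choice(BASES) for _ in range(N)]   # stand-in for "our" DNA
random_dna = [random.choice(BASES) for _ in range(N)]  # a completely random sequence

matches = sum(r == q for r, q in zip(reference, random_dna))
print(f"Identical positions: {matches / N:.1%}")   # close to 25%
```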

I hope this chart I made will clear things up:

[Chart by Stephen Budiansky]

Friday, September 3, 2010

Sustainable sentiments

Wherein your blogger, unwisely disregarding the epigram about angels and where they decline to tread, rushes in once more with a far-too-long essay on land, food, and local agriculture before putting that topic finally (he hopes) behind him, at least for a while  . . .

When I moved to a small farm 30 years ago I set out to be a practical romantic. I had all of the longing for a connection with nature and land and real things that many of the local-food advocates expressed in their numerous and often passionate responses to my recent New York Times Op-Ed. (See also the letters in last Saturday's Times.) The lessons my children learned from growing up on a farm and from the (still) wonderful 4-H program — knowing where food comes from, experiencing the rewards of hard work, caring for animals — were of incalculable worth. In that time our local sheep-producers organization has grown by leaps and bounds; when I joined it was a bunch of old farts, and now there are young families with small children and plenty of savvy people selling their locally raised lamb to nearby restaurants and farmers' markets. Even if this is a drop in the bucket of American agriculture, it's good for lots of reasons, not least that it gives the land around here another economic purpose than simply serving as the substrate for 6,000-square-foot houses.

But I was on guard from the very start against sentimentality, because I knew all too well that sentimentality is but a hair's breadth from arrogance and pretension. I had read enough of American social, geographic, and economic history to know that no matter how much Thomas Jefferson and Currier and Ives glorified and romanticized the American farmer, the people who actually grew up on farms couldn't wait to get away from them. The work was brutally hard, the isolation stultifying. We still like to imagine pastoral scenes of community barn-raisings and rows of canning jars brimming with peaches and mornings fresh with the smell of new-mown hay; then you read what life was really like. Animals and the land were abused in ways literally unthinkable today; diets were atrocious; diseases of man, animal, and plants devastating.

Some of my instinctive suspicion of the local-food movement as a movement — as opposed to just people offering a product to compete with other products in the marketplace — is the wariness I feel towards anyone who puts on the razzamatazz to push their wares. Whenever anyone starts telling me I need to hand over my money as a moral imperative, a moral virtue, or for the salvation of the planet, my first instinct is to check to make sure my wallet is still in my pocket. The language of the huckster pervades this business; to look at most of the websites and literature of local/organic/sustainable sellers you'd think they wouldn't dream of taking your money, so noble is their calling ("We are in the redemption business: healing the land, healing the food, healing the economy, and healing the culture," reads one typical specimen). Old rule of commercial interaction: when someone says it's not about money . . . it's about money.

But what I really object to is the failure of local and organic advocates to confront the true implications of the agenda they are promoting — which would quite simply be devastating for the global environment were we ever compelled to do what an increasing number of its acolytes say we must do. And all I will say to those who so indignantly deny that locavores are "doctrinaire" (I never said they were "loco" or "rabid") is: look at virtually any of the gazillions of local food websites and books, with their lists of arbitrary rules, their admonitions to limit consumption to a 100-mile radius, their "ten steps to becoming a locavore" (as if it were a religion or self-help program), their grandiose claims for what this will all accomplish.

Among the winners in the unintentionally ironic responses I received (the "if you call me a fanatic, I'll kill you" division) was a Huffington Post item entitled "Myth of the Rabid Locavore." After asserting that I must have made up all the figures I cited (since they did not square with her romantic world view), the author of this post called this statement of mine
eating food from a long way off is often the single best thing you can do for the environment, as counterintuitive as that sounds
"so ludicrous" that I had to publish it on my own website "because hey, the New York Times is only willing to go so far." But in fact I made exactly the same point in my Times article, when I pointed out that by using modern, high-yielding farm technologies and concentrating the production of crops where they grow best, we have
spared hundreds of millions of acres for nature preserves, forests and parks that otherwise would have come under the plow. . . . The best way to make the most of these truly precious resources of land, favorable climates and human labor is to grow lettuce, oranges, wheat, peppers, bananas, whatever, in the places where they grow best and with the most efficient technologies — and then pay the relatively tiny energy cost to get them to market, as we do with every other commodity in the economy. Sometimes that means growing vegetables in your backyard. Sometimes that means buying vegetables grown in California or Costa Rica
This is the crux of the entire matter. We can grow tomatoes and cucumbers and pumpkins and basil near cities both because these crops are practical for small acreages and because the economics work: they generate high returns that make it pay to grow them on expensive land. But fresh vegetables suitable for local production account for only about 5% of the land that directly feeds human beings (I'm leaving animal forage, exported grain, and the entire corn crop out of the equation altogether, so you don't need to start telling me again about the evils of corn-fed beef and high-fructose corn syrup).

Nearly all of the rest of the land that directly feeds people (see chart below) is taken up by basic staples like wheat, oilseeds, and dried beans. These are the crops that supply even vegetarians the bulk of their calories — and they are all fundamentally ill-suited for small or "local" cultivation. Never mind even the economics of it (these crops do not return enough per acre to make it pay to grow them on expensive land near cities): even if you were living off a trust fund, you couldn't find, near U.S. population centers, the 60 million acres of farmland needed to grow these staples.

But much more to the point: why on earth would you want to try to grow these staple crops "locally"? Wheat grows very well in the Midwest where the climate, soil, and natural rainfall are conducive; it grows extraordinarily well there in large stands that can be fertilized and harvested efficiently. Yields per acre, thanks to the development of advanced strains of wheat and the extensive use of synthetic nitrogen fertilizer, have more than tripled in the last century. Worldwide, hybrid varieties and synthetic nitrogen have generated even greater improvements in per-acre yields of rice and other staple food crops. Denounce big-ag all you want; buy local tomatoes all you want; the fact remains that chemical fertilizer, combine harvesters, hybrid crops, and modern transportation networks have done a few billion times more to save the planet than you ever will.

This is not just some theoretical argument about efficiency for efficiency's sake, nor is it an economic argument. You can grow staple food crops like wheat and rice and pulses and oilseeds the old way on 3 or 4 or 5 times as much land, or you can grow them on large plots using modern technology on a third to a fifth as much land. And we're talking about huge amounts of land, with huge environmental consequences. In India alone, improvements in wheat farming over the last 50 years have spared some 100 million acres of additional cropland that would otherwise have had to be slashed out of forests to produce the amount of wheat Indian farmers grow today. That's the equivalent of three Iowas or 50 Yellowstone National Parks. Without modern farming, we literally would have already cut down every acre of rainforest just to grow the staple food crops that feed the world. Would that be "sustainable"?
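A quick back-of-the-envelope check on those comparisons, using approximate land areas I have supplied myself (Iowa is roughly 36 million acres, Yellowstone roughly 2.2 million) rather than figures from any of the sources above:

```python
# Back-of-the-envelope check of the "three Iowas or 50 Yellowstones" comparison.
# The land areas are approximate figures of my own, not taken from the text above.

acres_spared = 100e6            # cropland spared in India, as cited above
iowa_acres = 36e6               # approximate land area of Iowa
yellowstone_acres = 2.2e6       # approximate area of Yellowstone National Park

# The relation doing the work: land required = production / yield, so tripling
# yields cuts the land needed for the same harvest to one-third.
print(f"{acres_spared / iowa_acres:.1f} Iowas")                # about 2.8
print(f"{acres_spared / yellowstone_acres:.0f} Yellowstones")  # about 45
```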

Many readers insisted that since we are (supposedly) about to run out of oil, we have to shift to local food production. As I just said, that is literally impossible — the extensive cropland required to raise the staple crops that provide the bulk of our subsistence, even if we're all vegetarians, does not exist around cities. (And by the way, it never has. Humans since Roman times have transported grain and cooking oil long distances to feed urban populations.) But it would also be just plain bad environmental policy even if we could do it, for the reasons I've just cited. And unless we contemplate a future with no energy use at all (unlikely), modern agriculture — including its relatively small transportation component — is among the most environmentally efficient energy expenditures we can make. There are plenty of frivolous energy uses we ought to eliminate before we start dismantling the essential energy uses that keep us fed.

Synthetic nitrogen fertilizer, by the way, is produced from natural gas, not oil, and for decades natural gas reserves have actually increased faster than we've been consuming them.

And why, but why, is it "unsustainable" to employ GM varieties? We've been breeding plants and practicing genetic selection for thousands of years; genetic modification is a way to modify one specific trait that we can accurately identify in order to increase yield, earliness, disease resistance — all improvements that reduce chemical and energy inputs. Isn't that a good thing?

And, O, for just a bit of scientific levelheadedness towards all of the ridiculously exaggerated claims about the nutritional and health superiority of "organic" produce (for which there is not a scrap of evidence). This is a matter that raises basic ethical questions, too: the way all too many "organic" and "natural" producers are trying to market themselves by running down their fellow farmers with scare stories and false claims of nutritional or health superiority. (I'm thinking in particular of the joker in Virginia who goes around trumpeting how his lambs are raised without hormones, and darkly implying that everyone else's are; in fact I've never heard of a single farm-flock producer using growth hormones in lambs.)

And while we're at it, why exactly is it so terrible to grow corn where it grows extraordinarily well and turn it into high-quality protein in the form of beef, pork, chickens, milk, cheese, and eggs? Of course this is a more energy-intensive product than flour or vegetable oil. But I keep coming back to the fact that on-farm energy inputs in the U.S. as a whole amount to 2 to 3 percent of U.S. fossil fuel consumption — corn, beef, and all. Households use ten times as much energy for heating, air-conditioning, and running appliances (including, in the average American home, powering your TV set six hours a day).
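Just to spell out that comparison with the figures cited in the paragraph above (and nothing else):

```python
# If on-farm energy use is 2 to 3 percent of U.S. fossil-fuel consumption, and
# households use ten times as much, then household heating, cooling, and
# appliances come to roughly 20 to 30 percent by the same yardstick.

on_farm_low, on_farm_high = 0.02, 0.03   # on-farm share, as cited above
household_multiple = 10                  # "ten times as much," as cited above

print(f"Household share: {on_farm_low * household_multiple:.0%} "
      f"to {on_farm_high * household_multiple:.0%}")   # 20% to 30%
```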

Finally, we're told that food security depends on local self-reliance. But the locavores have it exactly backwards on this point. Nothing is more vulnerable than self-reliance: one storm that destroys the crop one year, one local outbreak of an insect pest or blight — and if you have no other source to shift to the result is famine. This was the story throughout human history before modern transportation and commerce networks. Networks on the other hand are inherently resilient because a disruption in one spot will be easily compensated for by another. Everyone has been pointing to the recent incident of salmonella-infected eggs as proof of the dangers of globally interdependent "big ag." Yet half a billion eggs were recalled and I sure didn't notice any shortage of eggs in my store, nor even an increase in price. And don't think local means immunity from such problems: the devastating hoof-and-mouth epidemic in Britain a few years ago began on a small, local, traditional farm where pigs roamed around in the muck and ate natural, recycled food.

The growing urban faith that a return to “natural” or traditional farming methods can cure our many ills ultimately has its roots in the nineteenth-century Romantic reaction against industrialization and the loss of the bucolic, a history I explored in my book Nature's Keepers (available in fine second-hand book stores everywhere). But it’s a city-slicker’s attitude, like that of the tragic protagonist in Jean de Florette who wishes to return to the land to “cultivate the authentic,” to the derision of the locals and, ultimately, to his own fatal end. There are also echoes of David Brooks's Bobos in Paradise in all of this: the yuppie-generation compulsion to justify epicureanism as selfless virtue, personal indulgence as global good.

It's only the luxury of affluence that allows us to forget the misery, poverty, and precariousness of subsistence farming, still the lot of all too many in the world — and to overlook the fact that the modern agriculture we so often decry is what makes our modern, comfortable existence possible at all. It is, ironically, our modern distance from the realities of farming — a distance the local-food advocates, at least in their hearts, want to bridge — that allows us to believe that buying local is anything but a tiny factor in our global land use patterns and environmental impact, and, more dangerously, that a tiny bit of (in itself harmless) self-indulgence ought to be generalized into an imperative for the whole world: for which it would be a disaster.




[Chart: data from USDA Census of Agriculture and Steve Savage]