The graph above charts the chances of war with Iran as judged over the past few months by the wisdom of the crowd. More specifically, it reflects betting on the proposition “USA and/or Israel to execute an overt air strike against Iran before midnight ET 31 Dec 2012.”

As you can see, the collective wisdom of people who are willing to put their money on the line is that the chances of Iran getting bombed by the end of the year are hovering around 40 percent. And that number has dropped sharply in recent weeks.

This roughly reflects my own view. Certainly the last week, in particular, has reduced the chances of war happening this year, for reasons I’ll enumerate below. But, as I’ll also argue below, the price paid for a reduced chance of war in 2012 is an increased chance of war in 2013.

As for why chances of an airstrike during 2012 have dropped:

1) A few weeks ago, Iran offered to return to the bargaining table, increasing the chances of a negotiated solution.

2) Within the last week, we learned that President Obama had rebuffed Bibi Netanyahu’s request that the negotiations not start unless Iran first suspended its enrichment of uranium. With that obstacle cleared, the P5+1 (the permanent members of the UN Security Council plus Germany) has accepted the Iranian offer, and negotiations are expected to start in April.

3) Obama also did something that increased the chances of the negotiations succeeding. He had long held that Iran shouldn’t be allowed to develop a nuclear weapon. Netanyahu had long held that Iran shouldn’t be allowed to have even a nuclear weapons “capability”–that it shouldn’t be left with the technical wherewithal to produce a bomb should it decide to do that.

Netanyahu’s position, if adopted by Obama, could have impeded a deal with Iran. A ban on an Iranian nuclear “capability,” if interpreted broadly, would mean that Iran shouldn’t be allowed to enrich its own uranium for peaceful purposes, since even a modest enrichment infrastructure reduces the amount of time it would take to produce a weapon (if only reducing it to, say, two years). And pretty much nobody thinks Iran would agree to a deal that meant giving up its entire enrichment program; the hope had been to keep the enrichment modest and place it under intrusive monitoring that could detect any moves toward a weapons program.

Happily, Obama stood firm against Netanyahu. He signaled this in an interview late last week with The Atlantic’s Jeffrey Goldberg, when he repeated that what was unacceptable was Iran’s developing a nuclear weapon. He maintained that position through Netanyahu’s visit to Washington this week.

4) Standing firm against Netanyahu on this issue not only increased the chances that negotiations will succeed; it decreased the pressure Obama will feel to conduct or support air strikes during 2012 in the event that negotiations fail. Depending on how loosely you define “nuclear weapons capability,” Iran could have it well before the end of the year–in fact, if you define it loosely enough, Iran has it now. So if “capability” were the “red line” that Iran couldn’t cross, Netanyahu could argue that Obama is obliged to start bombing any moment now. But there’s pretty much no chance that Iran will have a nuclear weapon by the end of the year. So Obama can get through the November election and beyond without bombing Iran and without anyone claiming that he’s reneged on his promise to keep Iran from going nuclear.

OK, enough good news. Now for the argument that the chances of war during 2013 have risen, an argument outlined by former Israeli negotiator Daniel Levy of the New America Foundation.

Though Obama basically stood firm against Netanyahu, he did give the Israeli prime minister a consolation prize. Even as Obama refused to move the red line from “nuclear weapon” to “nuclear weapons capability,” he added an extra coat of paint to the red line, stating more emphatically than ever that he wouldn’t let Iran cross it.

Particularly significant was his declaration during the Goldberg interview that “I don’t bluff.” This vow made it into a New York Times headline, and you can rest assured that hawks will recite it via all known media should Iran get close to developing a nuclear weapon on Obama’s watch. And that could happen as early as 2013 should negotiations fail in 2012 (though a weapon developed in 2013 almost certainly wouldn’t be deliverable via missile in 2013). So Obama has managed to reduce the pressure for military action in 2012, but the price he’s paid may be increased pressure down the road.

According to Levy, this price may have been Netanyahu’s goal all along. After arguing that all the talk of an impending Israeli strike has basically been a bluff, Levy writes: “Perhaps this has been the Israeli intention all along: to checkmate the United States by locking it into a logic of confrontation down the road. Israel’s position has, after all, been relatively clear in preferring a ‘stars and stripes’ rather than a ‘blue and white’ label on the military taming of Iran.”

I’ll close with a couple of late-breaking reasons to hope that neither label will be required. (1) Today Iranian Supreme Leader Khamenei welcomed Obama’s diplomatic overtures. (2) Yesterday some senators tried to pressure Obama into insisting on the “sustained suspension” of uranium enrichment, and they could muster only 12 supporters out of 100 senators. This may be a secondary effect of all the talk about an impending Israeli strike; it has made American politicians seriously contemplate the consequences of war with Iran, and doing that can make a negotiated solution look pretty attractive.

[Update: Those of you who like thinking about war probabilistically are in luck! Though I won't be providing regular reassessments of the chances of war with Iran, The Atlantic has assembled a whole team of experts who will be doing exactly that as part of our new Iran War Clock project. The dream team currently puts the chances of war at 48 percent, but that's not really at odds with Intrade's estimate of around 40 percent, because the period covered by Intrade extends only until the end of the year, whereas the Iran War Clock will always estimate the chances of war in the subsequent 12 months. And a 48 percent chance of war before early March of 2013 is consistent with a 40 percent chance of war before the end of 2012.]
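For readers who want the probability logic in that last sentence spelled out, here is a minimal sketch (my gloss, not Intrade's or the Iran War Clock's): a strike before the end of 2012 is a special case of a strike before early March 2013, so the first probability can never exceed the second.

```latex
% A = overt strike before 31 Dec 2012, B = overt strike before early March 2013.
% A is contained in B, so by monotonicity of probability:
P(A) \le P(B), \qquad \text{and indeed } 0.40 \le 0.48 .
```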



For the first time on record, more young Greek workers are without a job than with one, Reuters reports.*

Official youth unemployment in Greece has crossed the 50 percent barrier and there’s very little reason to think that this is the ceiling. The economy is still shrinking. The latest round of austerity, which will punish wages and lead to more firings, has yet to set in. And, as economic pain tends to inflict itself disproportionately on the young — youth unemployment in the US is similarly twice the national rate — there’s good reason to expect that austerity will bite Greece’s young workers even more severely.

Here’s the graph from Eurostat data, via FT Alphaville. It only goes back to the end of 2011, but it gives you a sense of the scale of the crisis:

Putting the picture into words: The youth unemployment rate in Germany, the Netherlands, and Austria is between 8 and 9 percent. The youth unemployment rate in Spain and Greece is between 49 and (as we learned today) 51 percent.

Remember two things: (1) Things will get worse for Greece’s economy before they get better, and (2) Unemployment is a lagging indicator, which means that things will get worse for Greek unemployment even after the economy gets better, which is scheduled to happen after the economy gets worse.

*Update: Spain, too.



Happy International Women’s Day! Yesterday, we spanned the globe in 1,000 words on the remarkable ascent of women in the workplace.

Today, I present a picture, courtesy of Federal Reserve data, of the four-decade rise (and fall, in some cases) of female labor participation rates in some of the biggest, richest countries in the world.


The country with the most growth has been Canada, where female labor participation skyrocketed from a low 38 percent in 1970 to a world-class 62 percent in 2010. The only country to lose ground in the last 40 years has been Japan, which led the (non-Sweden) world in female participation in 1970 but has slipped 0.7 percent since then, falling below every country on this list but Italy.



If there’s anything new to be said about her flimsy candidacy, HBO’s Game Change can’t find it.


Julianne Moore as Sarah Palin in Game Change

The most puzzling thing about Game Change, HBO’s handsome and richly cast but strangely inert adaptation of the book of the same name by New York‘s John Heilemann and Time‘s Mark Halperin, is that it’s focused on the selection of Sarah Palin to be John McCain’s vice presidential candidate. Palin was supposed to be the game changer in the race between John McCain and Barack Obama, but she didn’t turn out to be much of one. Her influence was more cosmetic than results-altering.

The movie could have been about how antagonism between Barack Obama and Hillary Clinton gradually turned into a powerful partnership. But perhaps HBO felt like that would have amounted to a campaign ad in an election year; it reportedly rejected an initial script that focused on Clinton and Obama before turning to Palin. Game Change could have chronicled the fascinating, baroque recklessness of John and Elizabeth Edwards, who proceeded with the race in spite of her cancer diagnosis and the fact that he was carrying on an extraordinarily stupid affair with a campaign hanger-on. But then Game Change would have been competing with Aaron Sorkin’s adaptation of The Politician, the memoir of Andrew Young, the Edwards staffer who initially took the fall for the affair. And so we’re stuck with Palin, the least interesting figure of the lot. What’s worse is that Game Change opts to paint a familiar and unattractive portrait of Palin instead of asking hard questions of the people on both sides of the line she drew through American politics.

‘Game Change’ merely participates in the tired, self-satisfied celebration of the country’s collective decision not to let That Woman anywhere near the White House.

It’s difficult to remember now, but when McCain’s selection of Palin as his running mate launched her onto the national scene that summer, the initial predictions for her future career, no matter the results of that campaign, were astronomical. Her rejection of the so-called Bridge to Nowhere was supposed to make her that rarest of political creatures, a genuine anti-waste conservative—until she turned out to be for the bridge before she was against it. Her experience as governor of a major oil and gas-producing state was supposed to make her a major force in energy policy—but the analysis she had to offer rarely rose above the level of “Drill, baby, drill.” Perhaps her most interesting act as a public servant was her last. Palin’s resignation from the governorship of Alaska, unprompted by scandal or illness, was an almost unprecedented act, the rare admission by a politician that elected office is not the ultimate prize.

Instead of following up that surprising move with anything genuinely interesting or innovative, or even anything that had any sort of impact, Palin has pursued a subsequent career that is decidedly common—at least by the standards of the American media industry. She followed in the path of Republicans recently evicted from office everywhere and signed a contract to be a commentator for Fox News. She produced some not-particularly-highly-rated specials for the network. TLC created a series called Sarah Palin’s Alaska, which chronicled her shooting Fox segments in her at-home studio and going camping with the epitome of reality-television embarrassment, Kate Gosselin. Palin’s daughter Bristol made the standard Z-list celebrity stop on Dancing With the Stars and now is starring in her own reality series, Life’s a Tripp. It’s a long fall from would-be revolutionizer of the American political system to a sidebar slot on gossip magazine covers, but it’s hardly an unprecedented one.

The most interesting thing about Palin turns out to have been the way that we reacted to her: Why were people so eager to project hopes onto her flimsy candidacy? Now that she’s gone down to irrelevance, why does she inspire so much rage from the people who defeated her or who have come to regard her as unqualified? Game Change (premiering Saturday at 9 p.m.), even if it doesn’t realize it, has a partial answer to the second question: It’s a matter of shame for the people who presented her as a serious candidate in the first place. In her initial conversation with McCain’s aides on the way down from Alaska for vetting, Palin’s a bright, cheery blank when she tells them “I have a servant’s heart, and if you really think I can help this campaign, if you really think I can help this country, I’m with you.” The movie isn’t interested in exploring how her faith motivates her, or how evangelicals frame the world around them.

Instead, it’s all about how McCain’s advisers, set up as proxies for the audience, react to that line; or to Palin’s declaration that the basic world geography she’s just then learning is “flippin’ awesome”; or to the sight of Palin near-catatonic in a conference room, rocking back and forth, muttering “Fat. I’m so sick of looking fat.” Most of the time, the response is contempt or condescension. Steve Schmidt (Woody Harrelson), the McCain adviser who is the nominal main character of the movie, tells Palin, “This goofy diet is bad for you, and I’m alarmed by your weight loss,” before giving up on actually trying to teach her anything about the issues. In one scene, Palin flings a phone at adviser Nicolle Wallace (Sarah Paulson) in a stairwell, screaming “You have ruined me! You have ruined my reputation, I am ruined in Alaska!” By setting the outburst to a horror movie score, Game Change telegraphs only that Palin’s a freak—not that, in a tough campaign, she might have had legitimate concerns about her political future.

And perhaps there’s nothing there to contemplate. Maybe Palin was as difficult, and as mercurial, and as fundamentally empty as Game Change makes her out to be. But if that’s true, it’s a story we’ve heard retold hundreds of times, in hundreds of thousands of words, over the last four years. At the end of Game Change, when Nicolle Wallace’s dramatic admission that she didn’t vote is played as some sort of moral victory, it’s clear that the movie isn’t interested in figuring out what actually happened in 2008. Instead, it merely participates in the tired, self-satisfied celebration of the country’s collective decision not to let That Woman anywhere near the White House.



What the rest of the world can teach conservatives — and all Americans — about socialism, health care, and the path toward more affordable insurance


Avik S. A. Roy

Yesterday, Pascal-Emmanuel Gobry posted a stimulating comparison between the American and French health-care systems. “From my outlook,” he writes, “there’s something that I haven’t seen discussed and yet seems striking to me: how similar the French and U.S. healthcare systems are. On its face, this seems like a preposterous notion: whenever the two are mentioned together, it’s to say that they’re polar opposites.”

Indeed, there are a lot of misconceptions about how America’s health-care system compares to those of the other developed countries, including France. Both liberals and conservatives believe that the American system is a “free-market” or “capitalistic” one, and that European systems providing universal coverage are “socialized.” In this article, I’ll explain where both of these conceptions go wrong.


In reality, per-capita state-sponsored health expenditures in the United States are the third-highest in the world, only below Norway and Luxembourg. And this is before our new health law kicks in. (The U.S. appears second in the chart because we only have 2008 data for the Luxembourgers):

In 2009, according to these statistics, which come mostly from the OECD, U.S. government entities spent $3,795 per person on health care, compared to $3,100 per person in France. Note that these stats are for government expenditures; they exclude private-sector health spending.

If anything, the U.S. figures understate government health spending, because they exclude the $300 billion a year we “spend” through the tax code by making the purchase of employer-sponsored health insurance tax-exempt.
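To get a rough sense of how much that understates things, here is a back-of-the-envelope sketch; the $300 billion and $3,795 figures come from above, while the roughly 307 million figure for the 2009 U.S. population is my assumption, not the author's.

```latex
% Per-capita value of the employer-insurance tax exclusion
% (assumes a 2009 U.S. population of roughly 307 million):
\frac{\$300\ \text{billion}}{307\ \text{million people}} \approx \$977\ \text{per person}
% Added to the official government figure cited above:
\$3{,}795 + \$977 \approx \$4{,}772\ \text{per person}
```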

So: if we measure the relative freedom of health-care systems by the dollar amount of government involvement in health spending, the French system is actually meaningfully freer than America’s.

There are, of course, other important things to consider in terms of health-care freedom: do individuals have the freedom to choose their own doctor, their own insurance, their own treatments, and so on? On these bases, countries like the United Kingdom would fare very poorly. But very few people appreciate that the American government spends far more on health care than the governments of nearly every other country.

(For an excellent discussion of the ins and outs of European health-care systems, I highly recommend this 2008 paper by Michael Tanner.)

The thing to remember in America is that we have single-payer health care for the elderly and for the poor: the two costliest groups. In addition, the relatively healthy middle class has heavily subsidized private health insurance, in which few individuals have the freedom to choose the insurance plan they receive. Neither of these facts commends the American health-care system to devotees of the free market.


One of the most frequently made arguments in favor of socialized medicine is that it saves money, relative to the American system. And it is true that Europeans et al. spend less per capita, and as a percentage of GDP, than we do.

But the pro-socialism argument has a glaring weakness: it ignores the two most significant examples of market-oriented universal coverage in the developed world, Switzerland and Singapore, where state health spending is far lower than it is in other industrialized nations. Neither Switzerland nor Singapore could be described as libertarian utopias–both systems contain aspects that conservatives wouldn’t like–but they provide powerful examples of how market-oriented health care systems are more cost-efficient than socialized ones.

I’ve described Switzerland as having the world’s best health-care system. In Switzerland, there are no government-run insurance plans, no “public options.” Instead, the Swiss get subsidies, much like the “premium support” proposals for Medicare reform or the PPACA exchanges, with which they buy coverage from private insurers. The subsidies are scaled up or down based on income: poorer people get large subsidies; middle-income earners get small subsidies; upper-income earners get nothing.

The OECD puts Switzerland high on the league tables in terms of government health spending, but that is due to a statistical anomaly. Switzerland has an individual mandate; the OECD defines state health expenditures to include insurance premiums that the government requires individuals to pay, even if that spending is on private insurance. That is a debatable approach from the OECD, because the spending goes directly to the insurers, without the government as a redistributor. If you adjust for this anomaly, Swiss state health spending is $1,281 per person (which accounts for the taxpayer-financed premium support subsidies). I’ve listed both figures in the chart.

The premium support system allows the Swiss to shop for their own insurance plans, which gives them the opportunity to shop for value–something that almost no Americans do. As a result, about half of the Swiss have consumer-driven health plans, combining high-deductible insurance with health savings accounts for routine expenditures.


The other important counterexample is Singapore, which has, arguably, the most market-oriented system in the world. Singapore’s GDP per capita is about 20 percent higher than America’s, its health outcomes are comparable (if not better), and it spends an absurdly low amount on health care relative to the West. How do they do it?

The key to the Singapore system is mandatory health savings accounts: again, something that libertarians and many conservatives wouldn’t like. Matt Miller of the Center for American Progress describes Singapore as “further to the left and further to the right” than the American system–something that could also be said of Switzerland.

In a manner somewhat like our Social Security system, Singapore takes mandatory deductions from workers’ paychecks–around 20 percent of wages–and deposits them into health savings accounts called Medisave. Medisave accounts are used mostly for inpatient expenses, but also some outpatient ones. Singaporeans are expected to pay most of their outpatient expenses with non-Medisave cash.

On top of Medisave, Singapore has a government-run catastrophic insurance program called Medishield. Singaporeans can opt out of that plan and buy private catastrophic insurance. Premiums for Medishield can be paid for using the Medisave health savings accounts.

Then there is Medifund, a safety-net program for the bottom 10 percent of income earners, and Eldershield, a private insurance program for long-term care for those with old age-related disabilities. On top of these government-sponsored programs, Singaporeans can buy supplemental insurance for things like outpatient expenses.

Why does this system work so well? Because it incorporates the central idea behind free-market health care: that health-care spending is most efficient when that spending is executed by individual patients, rather than third parties. It’s easy to waste other people’s money. But if that money is your own, you are going to try your best to spend it wisely.

Singapore installed this system relatively recently. Prior to 1984, the former British colony had a system quite similar to that of Britain’s National Health Service. In that year, the government reversed course, with impressive results. Singapore, of course, isn’t a democracy–which allows the government to install sweeping changes that wouldn’t be realistic here. (And in no way should my praise of Singapore’s health-care system be interpreted as an endorsement of the country’s political system.)


The Swiss and Singaporean models wouldn’t be perfect models for America; we would want to replace the Swiss individual mandate, for example, with a more market-oriented approach like allowing people to opt out of buying health insurance if they also agree to forego subsidized care. But both Switzerland and Singapore embody the most important principle of all: shifting control of health dollars from governments to individuals.

How could something like this come about in the United States? One could imagine a scenario in which Medicare was converted into the premium-support model, such as one of the Paul Ryan plans, with far more aggressive means-testing such that upper-income seniors would no longer be eligible for the program. In addition, the tax exclusion for employer-sponsored health insurance would be phased out. The resultant savings could be used to offer subsidized private insurance to lower-income individuals, as a replacement for Medicaid. Obamacare’s exchanges, though seriously flawed in their implementation, have some similarities to this approach. As these programs converge, we could have something that starts to look a lot like Switzerland.

The Singaporean system dovetails with an idea put forth by John Goodman and others of a universal tax credit that Americans could use to buy health insurance, or possibly even Medisave-like HSAs.

My message to conservatives is: wake up. America’s health care system has many qualities, but it is far more socialized than you think, and we can learn from the experience of other countries to make it better. My message to liberals is: if universal coverage is your goal, the possibility for bipartisan compromise exists, if you’re open to considering market-oriented approaches like those in Switzerland and Singapore. Let’s put our heads together.

Follow Avik on Twitter at @aviksaroy.



Katherine Mangu-Ward

People love to freak out about incursions on their privacy. And by “people” I mean cable news shows. A week ago, Google implemented a plan to aggregate (most of) the data they collect from the many, many products they offer. Not to collect new data. Not to publish or disseminate that data in a new way. Just to put all the data in one pot and use it to tailor search results and ads across various platforms. Cue widespread panic about online privacy. Previously sacrosanct! Now utterly violated! Again!

As it happens, we know how much people value their privacy: They’ll sell information about every prescription they fill at CVS — or every pint of Haagen Dazs at Safeway — in exchange for a steady infusion of $1 coupons. They’ll hand off information about the timing of their daily commute in exchange for a couple of minutes saved at a toll booth every day. They’ll let Amazon track their diaper and book purchases because they would rather not re-enter their credit card number every time they want to buy something.

In contrast to those seemingly paltry payoffs, I think people get a pretty decent bargain when they hand over their personal browsing, search, and email data to Google: powerful tailored search results, an elegant, efficient email management system, photo and document storage space of science-fictional proportions, and instant access to every otter video (and/or TED talk) of all time.

The price? Google does its darnedest to sell you stuff you would probably like to buy.

To some extent this is just a two-kinds-of-people thing. When you watch a YouTube video about replacing a busted tire and Google then suggests a local repair shop with high Yelp ratings, do you respond “cool” or “creepy”? I’ve always dug the magical quality of advanced technology.

But if you’re more skeeved than pleased, consider letting your brain overpower your gut here. This is a fact you cannot change: All the free stuff on the Internet is possible because you slap your eyeballs on some ads from time to time. If Google and other companies can’t scrape and sort your data to offer a few well-targeted ads, there are two viable alternatives: 1) Less of the free stuff you like. Like this blog. It might stop being free. For instance. 2) More ads in the throw-spaghetti-at-the-wall-and-see-what-sticks school. Think: those annoying dancing silhouette gals selling cheap mortgages.

More upsides of life in the Googleverse: Google shrank and simplified their privacy policy, making some decent progress toward correcting this shocking semi-legit statistic from 2008 making the rounds this week: “Reading the privacy policies you encounter in a year would take 76 work days.”

In 1999, Sun Microsystems co-founder Scott McNealy said: “You have no privacy. Get over it.” He was right then. And he’s even more right today. But don’t just get over it. Learn to love it.

Bonus: For your daily dose of irony, check out this Washington Post piece, which uses the words “spooky,” “creepy,” and “Tom Cruise” in a discussion of Google’s new privacy policy. Now scan partway down the right sidebar, where readers are invited to “find the headlines that matter to you” on “Your Personal Post” — an offer that asks readers to log in in order to customize the site based on the same concepts that power Google.



The right’s latest bogeyman, Derrick Bell, once pondered what would happen if aliens offered gold in exchange for America’s black people.


Americans decided in 2008 that Barack Obama wasn’t a radical black leftist, despite attempts by some conservatives to exploit his relationship with Jeremiah Wright and contact with Bill Ayers. Predictably, President Obama has not governed as a radical black leftist, or adopted the controversial rhetoric of a fiery preacher, or embraced the tactics of the Weather Underground. There is nevertheless another conservative attempt to persuade Americans that he’s hiding something.

The evidence? A couple decades ago, Obama hugged the late law professor Derrick Bell after introducing him during a rally at Harvard Law School. You can read about Professor Bell here. Spoiler alert: there’s no shame in having hugged him. True to its genre, the controversy is a big nothing-burger. But it has renewed interest in a provocative short story that Professor Bell published in 1992. Perhaps the easiest way to characterize it is “critical race theory meets sci-fi.” It begins as alien ships unexpectedly arrive on earth. The aliens have a proposition for the United States.

Their offer:

Those mammoth vessels carried within their holds treasure of which the United States was in most desperate need: gold, to bail out the almost bankrupt federal, state, and local governments; special chemicals capable of unpolluting the environment, which was becoming daily more toxic, and restoring it to the pristine state it had been before Western explorers set foot on it; and a totally safe nuclear engine and fuel, to relieve the nation’s all-but-depleted supply of fossil fuel. In return, the visitors wanted only one thing-and that was to take back to their home star all the African Americans who lived in the United States.

The jaw of every one of the welcoming officials dropped, not a word of the many speeches they had prepared suitable for the occasion. As the Americans stood in stupefied silence, the visitors’ leader emphasized that the proposed trade was for the Americans freely to accept or not, that no force would be used. Neither then nor subsequently did the leader or any other of the visitors, whom anchorpersons on that evening’s news shows immediately labeled the “Space Traders,” reveal why they wanted only black people or what plans they had for them should the United States be prepared to part with that or any other group of its citizens. The leader only reiterated to his still-dumbfounded audience that, in exchange for the treasure they had brought, they wanted to take away every American categorized as black on birth certificate or other official identification. The Space Traders said they would wait sixteen days for a response to their offer.

Had white Americans swiftly repudiated the alien offer due to their basic decency and lack of racism, the story wouldn’t be controversial, so I trust I am not spoiling anything by revealing that isn’t how things go in the piece. Bell has a rather dark view of human nature — see The Lottery for another dark exploration of a similar theme — and can you blame him? He was born in 1930 and spent his early career helping to desegregate swimming pools and schools over the objections of racists who wanted to keep them segregated. And he was later forced to resign from the Justice Department for the transgression of refusing to give up his NAACP membership.

I don’t really understand what the conservatives who conclude from this story that he is a racist are talking about. In fact, he seemed to think, circa 1992, that a majority of whites still harbored complicated but ultimately racist attitudes toward blacks. If the aliens came to America today, their ships loaded with gold, I don’t think America would agree to sell them its black residents if it went to majority vote. I am less certain how a referendum on Muslims would go. America would certainly have sold aliens citizens of Japanese ancestry in 1941. And also Native Americans at various points throughout our history.

Those objecting to “The Space Traders” would do well to acknowledge that for many decades of American history, including years during Professor Bell’s life, a majority of Americans would have voted in favor of trading blacks for fantastic wealth, unlimited energy, and an end to pollutants. I wonder, if God could run the hypotheticals for us, and Americans were forced to wager $1,000 of their own money, what year they’d choose as the first when blacks would win the referendum. I’d be curious for an answer from Diane Ellis, who seems to think that the story is deplorable and evidence of Bell’s alleged racism. I’d say it’s evidence that his experiences made him understandably pessimistic about how racial majorities will treat racial minorities given the right circumstances. To label someone a racist for honestly conveying his dark view of human nature is the sort of politically correct, reductive stifling of speech that conservatives are supposed to stand against.



Good and bad, the controversial Web publisher had a major impact on media, politics, and our public discourse.


The day that Andrew Breitbart died, the short obituary I published in these pages urged everyone to reflect on the evident love he had for his family, the energy with which he conducted his work, and the personal generosity he showed friends. People possessed of those qualities die every day without mention, some readers noted, arguing that the deaths of public figures should occasion no more than narrow assessments of their professional legacy. But the double standard would be better resolved in the other direction. The journalist’s charge is to convey reality, and although the press treats politics and business as though they’re of unique importance, it isn’t so. We’d do well to reflect more on the private people who shape society. The significance of apolitical, non-economic acts is often overlooked and under-appreciated.

For staunch critics of Breitbart, it is especially important to acknowledge his best attributes. They help to explain the posthumous outpouring of support he has received. His personal friendships with public figures are distorting how they are judging his professional legacy, but people who behave badly in one sphere can set an example in others. Matt Yglesias, a model of online civility compared to Breitbart, controversially tweeted after his untimely death, “The world outlook is slightly improved with Andrew Breitbart dead.” Few who’ve pilloried Yglesias objected even once to the daily stream of ad hominem incivility coming from the Breitbart publishing empire or his personal Twitter stream, so the righteous outrage on display is a bit hollow. But there is a better objection to Yglesias’ tweet than calling it uncivil: the fact of the matter is that even a critic of Breitbart’s professional legacy has no reliable way of measuring it against his personal life and making a summary judgment about his overall impact on the world.

Bear that in mind in this assessment of Breitbart’s professional legacy, the aspect of his life I am most qualified to comment upon. It includes praiseworthy achievements. As Nick Gillespie noted at CNN, Breitbart played an important role in the creation or evolution of pioneering Web sites like The Drudge Report, The Huffington Post, and his “Big” sites — whatever one thinks about their content, they helped spur advances in the Web medium the fruits of which are now universally available. Said Gillespie, summing up who benefits, “It’s the conservatives at Drudge, the liberals at HuffPo, the leftists at DailyKos, the libertarians at Reason. It’s all of us and Breitbart helped create and grow a series of do-it-yourself demonstration projects through which we can all speak more loudly and more fully. Breitbart is dead, but the conversation pits he built will live on.” Perhaps they’ll even improve with time like The Huffington Post, which started out as a glorified online diary for celebrities. It’s now publishing Radley Balko investigations. Breitbart also deserves credit for speaking in favor of including gays in the conservative movement and against its idiotic Birther faction, which he helped to pillory and marginalize.  

It is too much to call Breitbart a visionary. The “flatter media” he helped advance, for better and worse, was inevitable once the Web came along. But when early Internet age publishing is chronicled, his name belongs on a list that includes Andrew Sullivan, Mickey Kaus, Glenn Reynolds, Megan McArdle, Josh Marshall, Matt Drudge, Eugene Volokh, Jonah Goldberg, Arianna Huffington, Markos Moulitsas, Jane Hamsher, Matthew Yglesias, Ezra Klein and others. For better and worse, they’ve all shaped the medium and the messages of our era.

Due to the untimeliness of Breitbart’s death, there has been an understandable reluctance to examine his achievements alongside his shortcomings, especially on right-leaning Web sites, for arguing about the man’s memory almost immediately turned into another skirmish between ideological tribes. But disagreeing about whether his professional legacy was a boon to the country, as many conservatives insist, or an overall detriment, as others claim, isn’t likely to get us anywhere. Suffice it to say that even history’s greatest heroes, beloved patriarchs, and loyal family dogs are imperfect. The most hard-core movement conservatives should be able to acknowledge that some aspects of Breitbart’s professional life would be better repudiated than celebrated or copied, even if their overall assessment of the man remains emphatically positive.

What follows isn’t an attempt to persuade you to share my conclusions about Breitbart’s overall impact on the world. The reader can draw that conclusion as well as I can. But having remarked on his innovator’s spirit, his contributions to the Web, his passion for his causes, his humor, and his loyalty to family and friends, it profits us to confront his flaws and transgressions forthrightly. Were he a monster, no one would be tempted to copy him. Precisely because he was a charismatic hero to many, avoiding his mistakes requires us to be unsentimental.    

Neither personal friends nor ideological allies are particularly good at that, so their obituaries, while very much worth reading, are insufficient. As someone who met Breitbart just a few times, an outsider rather than a member of the conservative movement, and a critical observer of his career who thought deeply about his impact in the course of tangling with him, sometimes bitterly, this is my attempt at an unsentimental critique. What follows is the part of the Breitbart legacy his fans haven’t confronted — and more reasons why they valued him too.   

A Movement Conservative For His Time       

In Decoded, Jay-Z’s autobiographical account of how and why he writes his rhymes, he describes the moment when the rap he was hearing on the streets of Brooklyn stopped being playful and started describing in graphic language the crack epidemic roiling urban America and the hustlers who were both its victims and its suppliers. “Hip-hop had described poverty in the ghetto and painted pictures of violence and thug life, but I was interested in something a little different: the interior space of a young kid’s head, his psychology,” he wrote. “Thirteen-year-old kids don’t just wake up one day and say, ‘Okay, I just wanna sell drugs on my mother’s stoop’… to tell the story of the kid with a gun without telling the story of why he has it is to tell a kind of lie… I wanted to tell stories and boast, to entertain and to dazzle with creative rhymes, but everything I said had to be rooted in the truth of that experience. I owed it to all the hustlers I met.”

It’s a passage I just happened upon, and reading it reminded me of Breitbart in this way: he saw conservatives as an invisibly victimized class, and although many before him had railed against the mainstream media, Hollywood, and other antagonists, he wanted to take us inside his own head, to explain the psychology of it, to tell us about his decadent time at Tulane, his squandered twenties as a default liberal, how the Clarence Thomas hearings radicalized him, and how his own biography helped him to see the master-narrative of the whole purportedly oppressive system. When he wielded a rhetorical flamethrower in the culture wars, he wanted us to know how his own observations led him to it, and made him feel self-righteous about spraying the flames. And yes, he wanted to entertain us, provoke us, dazzle us, and serve us Web ads. But he wanted it all to be true to the felt experience of aggrieved conservatives. He wanted to be their champion, to show them that someone was brazenly articulating their grievances. He felt he owed it to the nation’s Tea Partiers and denizens of flyover country. And his method was so hip-hop. Everything was filtered through the lens of Breitbart: his feuds, his put-downs, his crassness, the uncertain relationship between his public persona and what he was really like.

Were his grievances legitimate?

Breitbart’s critics saw an unaccountably angry man who grew up in a privileged West Los Angeles neighborhood, squandered his twenties in just the way the safety net of affluent parents permits, and ultimately had no good reason to be so angry. Wasn’t he socially popular in the ideologically diverse circles where he hung out? Wasn’t he welcomed without animus in the dread Hollywood haunts he continued to patronize even after he began to pillory them? Didn’t his interactions with actual liberals contradict his most sweeping generalizations about them?

But his anger resonated, especially among conservatives upset or threatened by their notion that the left is winning the culture war. These are people who earnestly defended the notion that “leftists are totalitarians,” as Breitbart once put it (referring not to Joseph Stalin, but to Hollywood producers, college professors, and New York journalists). They nodded along to his rants every bit as fanatically as poor teens in Bed Stuy listening to “99 Problems,” for hearing their long-held grievances unapologetically spun into a charismatic narrative was similarly gratifying.

To be sure, the Bed Stuy teens have a much better case for being aggrieved than Breitbart and his fans. And reveling in grievance is more cathartic than useful, as conservatives purport to understand. But Breitbart’s rise has coincided with a tendency in the conservative movement to indulge the notion that its problems stem from being treated unfairly. I wrote my college thesis on liberal media bias, gladly return to the subject periodically and still find myself annoyed and maddened by the frequency and hyperbole of conservative complaints. There is some liberal bias. It’s fine to call it out — but absurd to treat it as the very core of your worldview, the explanation for every ideological setback you suffer, or the main factor preventing a better society.*

Breitbart contributed to this counterproductive focus on the ways in which the world was being unfair to the right. The very names he gave his Web sites played into the conceit that a cabal of enemies was responsible for what ails America. His earliest target? “Big Hollywood.” Next up? “Big Journalism.” It’s a subsequent effort that most clearly shows the absurdity that ensued: “Big Peace” was launched amid two wars that rank as some of the longest in American history, an unprecedented number of American military bases around the globe, a military-industrial complex as powerful as it’s ever been, a transition to a Democratic president who himself didn’t even bother to get congressional approval before launching missile strikes on Libya, and whose undeclared drone war in an undisclosed number of countries is ongoing today. Breitbart acted as though a lot of his bogeymen wielded more power than was justified by reality, but the notion of a malign “big peace” lobby was surely his most bizarre unfounded conceit. Strange too is that neither his bellicose foreign policy instincts nor the platform he gave writers like Frank Gaffney ever seemed to bother his libertarian admirers. What was that about?

In the first piece I ever wrote about Breitbart, “At The Gates of the Fourth Estate,” I argued that his various bogeymen, and his hyperbolic notion of the left as “totalitarian,” were core flaws in his world view, and especially ruinous to his avowed project to rescue culture (not politics) from liberal domination.

His insistence that the left is “totalitarian,” I wrote,

…implies that the left is supreme, ruthless, and all-powerful. Pushing back from within existing cultural institutions is futile; conservatives might as well withdraw into an ideologically safe dugout, nurse their resentments, and pretend that the height of courage is picking off the least careful leftists with the rhetorical equivalent of sniper fire. This needless retreat is among the biggest obstacles the right faces as it attempts to engage American culture on a more equal footing. Reversing its course depends on providing young conservatives with a less hysterical, more accurate assessment of their prospects: Ignore Andrew Breitbart!

Should you pursue your living in entertainment or the press, you will be outnumbered ideologically. But so long as you conduct yourself professionally, possess talent commensurate with your peers, and produce good work, behaving as a professional rather than a propagandist, you’ll go far. You’ll also meet a lot of nice people, many of them liberals, who’ll help you along the way.

This ought to have been the career advice Breitbart offered to young conservatives, given how often he insisted that culture mattered more than politics. But look at where he actually spent his time. Arguing with members of the political press at CPAC. Proving that a Democratic congressman tweeted photos of his penis to women on the Internet. Trying in vain to prove racism at the NAACP and destroying an innocent woman’s career in the process. Ask conservatives to cite his accomplishments and it’s invariably tactical political victories that they laud. Most often cited: his role in Anthony Weiner’s resignation and congressional defunding of ACORN.

In other words, short term political victories. Did they matter? It depends on what you care about. If it’s winning a Twitter pissing contest, a news cycle or even a congressional seat for a single term — or if you get catharsis from discovering that someone on the left has done something corrupt — Breitbart delivered. But if your desired end was meaningfully smaller government or improved public policy, he had a negligible impact, if any. ACORN just reorganized under different names. The party that holds NY-9 in the long term isn’t determined by a sex scandal.         

Blessed with a keen understanding of the Web and a knack for getting attention, Breitbart limited his impact by using the platforms he constructed for destructive purposes only. NPR, ACORN, Media Matters for America: he picked targets and found himself less and less able to “destroy the institutional left” as those targets inevitably got wise to his tactics. Meanwhile, Columbia’s journalism school graduated 300 twenty-somethings who’ll staff America’s newsrooms and Web media start-ups. For another generation. USC and NYU turned out countless aspiring directors and screenwriters who’ll make whatever our kids watch on their iPad747s. For another generation. As far as I can tell, the grassroots conservative answer is James O’Keefe.

It would have been great if the Big sites aimed for higher quality journalism. Said libertarian press critic Jack Shafer in his obituary of Breitbart, “I liked the idea of Andrew Breitbart better than I liked any of his work at Big Government, Big Hollywood, Big Journalism, Big Peace, Breitbart or” And no wonder. What are the best 10 pieces published in the history of those sites? You’ll find more quality work in a single issue of City Journal than the sum total of everything Breitbart wrote or commissioned and published in his whole career. That magazine laid the intellectual foundation for a renaissance of conservative ideas, policy successes, and cultural transformation in New York City — as hostile a territory as there ever has been for the right.

You’d think that kind of success would inspire copycats. 

Yet Breitbart has persuaded a lot of people that his sites offer the best model for the future of right-leaning journalism, on the strength of forcing ACORN to reorganize and NPR to restaff. Who’ll even remember those much heralded victories in five years besides Ron and Vivian Schiller?

As Matt Welch wrote of Breitbart, “He didn’t actually have strong philosophical/policy beliefs — at all. An ideological movement that turns him into an icon isn’t taking itself seriously.” In the long run, it is doubtful that Breitbart will retain the reputation he currently enjoys among conservatives, because the ideological icons history celebrates, the William F. Buckleys and Ronald Reagans and the James Q. Wilsons, are remembered for their contributions to lasting victories. Breitbart was a leader in the conservative movement during the Bush Administration and the Obama Administration, a period that has been disastrous for its avowed goals. What enduring conservative victory came between the launch of The Drudge Report in 1996 and today?

Ends Justifying Means

Last July, I attended a documentary about Sarah Palin on the night it opened in Orange County, California. It was a 12:01 a.m. showing, which theaters tend to do only when they expect big crowds. That’s what I expected when I drove out to interview moviegoers, but whether due to the late hour, the fact that Harry Potter was also opening that night, or the dearth of interest in the film, which didn’t ultimately do very well, I was practically the only one in the theater. I got home late, did a quick write-up in these pages, and went to bed. I woke up to mayhem. As I later detailed, Palin fans bizarrely accused me of conspiring with the AMC theater chain to schedule a secret, unadvertised showing of the Palin documentary so that I could, for anti-Palin propaganda purposes, attend it and cast its debut as a miserable failure. Or something.

Understand that this film appeared in Google’s movie listings, where I saw it, and was advertised in the Los Angeles Times on the appropriate day — which isn’t a particularly difficult thing to verify if, for example, you’re a publisher who lives in Brentwood. Rather than dismiss the absurd conspiracy theory or open a newspaper to check its veracity, a bullying Breitbart tweeted this:

[Image: Breitbart’s first tweet]

And this:

[Image: Breitbart’s second tweet]

Understand that Andrew Breitbart had roughly 75,000 followers on Twitter. I probably had less than 4,000 back then. As I later put it, “in three Tweets, we’ve got a juvenile made up name, an erroneous fact — my screening was at 12:01 am, not 12:45 am — plus the false implication that the films were unadvertised, requiring some special knowledge to know about them, and the false notion that I committed an unnamed firing offense. Needless to say, Breitbart didn’t contact me prior to publishing that. Nor has he corrected any of his numerous errors. But he’s a crusader for truth.”

Every last working journalist in America hates the idea of being falsely accused of fabricating a story — and having the accusation spread to tens of thousands of people you have no way of reaching to correct the record. It does groundless damage to your reputation. But that is the impression that Breitbart spread, on a lark, without even a shred of evidence, and despite having as spectacularly easy a way to check the truth (open the previous day’s newspaper) as there ever was.

Breitbart’s behavior cost me two days fielding press requests, sending journalists a scanned image of the newspaper listing, receiving nasty emails and threats, and otherwise rebutting his lies. Writers from his sites piled on. So I ask Mickey Kaus, how does this square with your insistence that Breitbart “said what he thought was true, even when that hurt his side or put his own career at risk.” How does it square with the notion that Breitbart “had an instinctive honesty”? He didn’t with me. He never apologized or corrected the record. I doubt he even gave it much thought.

I am far from the only one he treated unethically. To cite just one more example, I’ve written before about Juan Carlos Vera, a lesser known victim of the Breitbart-O’Keefe partnership. I don’t mean to pick on Kaus. If anyone has an excuse for missing Breitbart’s flaws, it’s someone who knew him in Los Angeles for a decade before he launched his “Big” publishing career, saw the doubtlessly legion occasions when he did care about accuracy, and reacted as a mournful friend upon his death. But Breitbart’s journalist friends are unfair to his critics and those he wronged when they write against overwhelming evidence that he instinctively championed truth, even while writing, as Kaus does, “I don’t know the ins and outs of the Shirley Sherrod mess.”

I’ll say!

Kaus is a respected journalist whose personal archive is rife with solid and delightful pieces. He is, as well, a Web pioneer. He is one of several exceptional writers, all of whom hold themselves to journalistic standards far higher than Breitbart ever met, but who are seemingly able to see only the best things about his work, and are blind to its most conspicuous flaws. Again, they’re mostly people who liked Breitbart personally, and I am inclined to trust in their assurance that he was, despite the side of him I saw, a basically decent and well-intentioned person. In his work, he thought his ends sometimes justified bullying, cruelty, lying, negligence and intimidation. And not just to “destroy the institutional left.” Sometimes Breitbart sank to those depths because someone wrote negatively about a Sarah Palin movie produced by his personal friend. If the reverse had been true, if I’d publicly and falsely accused Breitbart of fabricating an event, how would he have reacted? What nasty things would his fans have thought about me?

‘Be the Change’ or ‘Turnabout Is Fair Play’?

Gandhi famously advised humans to “be the change you want to see in the world.” Breitbart fell short of that admittedly lofty standard. His ends-justify-the-means attitude at times caused him to perpetrate exactly the sort of behavior that he’d only recently railed against as vile and corrupt. Sometimes the transgressions were minor. In his book, he points out the idiocy of liberal celebrities who ignorantly spew stereotypes about the entire American right… and then proceeds to write, “I would not be in your life if the political left weren’t so joyless, humorless, intrusive, taxing, anarchistic, controlling, rudderless, chaos-prone, pedantic, unrealistic, hypocritical, clueless, politically correct, angry, cruel, sanctimonious, retributive, redistributive, intolerant.”

Sometimes he showed a lack of self-awareness. For example, Breitbart once complained to me that I quoted something he said at an event that was supposed to be off-the-record. Perhaps someone explained the event to him that way. Certainly no one ever told me it was off-the-record. He acted very aggrieved about the whole thing, but eventually conceded that it wasn’t a big deal when I showed him that he’d said exactly the same thing in a television appearance. I wouldn’t have remembered the exchange except for the fact that Breitbart is, after all, a guy who built his Web sites publishing hidden video stings of unwitting subjects. And he excoriated me for quoting a line from a panel discussion he gave before an audience of 100 people?

There were, finally, serious transgressions. Breitbart complained bitterly about charges of racism frivolously made against conservatives. He seemed earnest when he insisted that it is morally wrong to imply that someone is a racist without rock solid proof, and that the media was often unforgivably derelict in fact-checking such claims (like the assertion that rallying Tea Partiers shouted the n-word at passing congressmen). Yet this same man published the Shirley Sherrod video — an attempt, by his own admission, to prove that some of the people in it were racist — before he even saw the whole thing! In that case, due diligence would’ve been harder than opening the movie pages. But every Breitbart fan knows that if NBC or CBS or ABC had implied his racism by airing a selectively edited tape — and if the full tape later went public and cast him in a significantly better light — he’d condemn the whole mainstream media, jabbing a finger in the chest of any reporter in a 10-foot radius, and he wouldn’t accept it for a minute if the MSM producer explained, “Oh, our intention was to make the other people in the room with you look racist.”

If you’re someone who thinks that turnabout is fair play, that when conservatives do something immoral liberals are justified in behaving the same way and vice-versa, I am not going to persuade you otherwise. Suffice it to say that Breitbart sometimes subscribed to that flawed code.

The Tweets 

On Twitter, Breitbart deliberately stoked the worst impulses of his followers in a “look at what terrible people my critics are” project that was perhaps his most nakedly depraved. As Michelle Malkin said in her obit, “If he were here, he’d be retweeting all the insane tweets from the Left rejoicing over his death. Even in death, he succeeds in exposing the hate-filled intolerance of the tolerance poseurs.” She’s right. He would’ve been glad to have inspired the tweets, and he would’ve eagerly retweeted them. But surely that isn’t a quality to be celebrated. A religious person might counsel, “love thy enemy” or “turn the other cheek.” A Kantian would say, “Don’t treat people as a means to an end.” I’d say that Breitbart’s baiting and retweets had no lasting impact on politics in America, but caused thousands of people to feel more anger, angst, and hate in their hearts than they would have otherwise. It was, at bottom, a not very admirable self-indulgence that his fans bear partial blame for encouraging. I’ve certainly lost my temper on Twitter. And it’s sometimes tempting to provoke people in that forum. I get it. I too have sinned! But this is a failing, not a pursuit worthy of the countless hours Breitbart invested, never mind something to be extolled in obituaries.

What Was Missing

Among conservatives, it is the Founding Fathers, Abraham Lincoln, philosophers like John Locke and Edmund Burke, journalists like William F. Buckley and politicians like Ronald Reagan who are regarded as heroes. All of those figures had a capacity for persuasion and a demonstrated willingness to square off against contemporaries with the strongest ideas contrary to theirs. And the conservative movement’s most popular champions don’t have those qualities anymore.

Breitbart could energize a subset of the base. He could mastermind a tactical victory, winning a news cycle or three. As he did so, he became more loved by the people who agreed with him and more hated by the people who didn’t. And for all his purported “courage” and “fearlessness” he’d only debate a certain kind of person, as I can personally attest. When I first challenged him to a public debate, he told me, uncharacteristically, that he had no intention of letting me free-ride on his platform, and that when I had access to a substantial audience of my own he’d be game. Sometime later, when I was writing for The Daily Beast, my editor, Tom Watson, reached out and got him to agree to a written debate, but he postponed, then backed out for reasons he would never explain. Perhaps I shouldn’t have telegraphed my preparedness.

In all seriousness, the fact that he didn’t debate me personally doesn’t mean much, but it is telling that he never squared off, as William F. Buckley did on “Firing Line,” against some critical figure from “the other side.” In today’s America, I actually don’t think the pundit game requires much courage. The “happy warrior” talk always strikes me as overwrought and silly. But when Jon Stewart sat down with Chris Wallace, or when David Frum did a Bloggingheads with Jonah Goldberg, or when Christopher Hitchens debated Sam Harris, they were putting themselves out there in a way that Breitbart, who stuck to shouting matches and cable news spots, tended to avoid. Bluster, shamelessness, aggressiveness, a willingness to be confrontational — Breitbart had all those things. Lots of people don’t. Admire that or not, but courage it ain’t, and his fans acknowledge as much when they comment on leftists jabbing their fingers in the chests of CPAC attendees.

What To Copy, What To Avoid

Breitbart’s successors should channel his passion. They should learn from his determination. They should challenge wrongheaded narratives in the media, create platforms that expand the ability of Americans to engage in political discourse, and inject mischievous humor into their work.

They should celebrate the best of what he did.

Unlike Breitbart, they should appeal to the best in people rather than intentionally eliciting their worst; produce journalism that is ambitious in its quality, not just its short-term political utility; refrain from falsely implying terrible things about people based on made-up facts or misleadingly edited footage; show courage by exposing themselves to substantive debate with skilled antagonists; refrain from doing anything they regard as abhorrent when it’s done by other people; and grasp that focusing on the base’s sense of grievance hasn’t served conservatism well.

Image credit: Reuters

*If control of the American media is what matters most, if it is the main factor in deciding presidential elections, and controlling the media narrative through some means other than argument is the key to conservative success in the future, how do you explain 1980, 1984, and 2008? How is it that Ronald Reagan won the presidency and positively cruised to re-election, even though Rush Limbaugh was working for the Kansas City Royals at the time, cable news didn’t exist, there was no Drudge Report or blogosphere, and all three news networks took their cues from the front page of The New York Times? And then in 2008, when conservative media was reaching more people and making more money than ever before, the radio waves filled with Rush Limbaugh imitators, right-wing books topping the bestseller lists, Fox News the most popular cable news network in America, Red State up and running at full tilt… how is it that Barack Obama won? It’s almost as if the success of conservative media outlets and ideological entertainment isn’t the basic driver of American politics.


The answer is basically, ‘When they’re terrorists.’ Who gets to decide who qualifies? The president.

Image: Attorney General Eric Holder (Reuters/Tami Chappell)

Says Attorney General Eric Holder, explaining when the United States can engage in the extrajudicial assassination of an American citizen abroad:

Let me be clear: an operation using lethal force in a foreign country, targeted against a U.S. citizen who is a senior operational leader of al Qaeda or associated forces, and who is actively engaged in planning to kill Americans, would be lawful at least in the following circumstances: First, the U.S. government has determined, after a thorough and careful review, that the individual poses an imminent threat of violent attack against the United States; second, capture is not feasible; and third, the operation would be conducted in a manner consistent with applicable law of war principles.

I thought I’d rewrite the speech to better reflect reality:

An operation using lethal force in a foreign country, targeted against a U.S. citizen who intelligence agencies say is a senior operational leader of al Qaeda or associated forces, and who intelligence agencies say is actively engaged in planning to kill Americans, would be lawful at least in the following circumstances: First, the U.S. government claims it has determined, after a review that it says is thorough and careful, that the individual poses an imminent threat of violent attack against the United States; second, intelligence agencies say that capture is not feasible; and third, the operation would be declared to be conducted in a manner consistent with the presidential interpretation of applicable law of war principles.

No sane country would ever operate under the language in the second paragraph. Yet that is effectively what we’re doing if all these supposed legal standards aren’t actually upheld by anyone – if the executive branch can just decide what to do for itself without any checks or balances. Its essence is, “We’re only empowered to kill terrorists, but we decide who is a terrorist.”

Or as Glenn Greenwald sardonically puts it:

…the President and his underlings are your accuser, your judge, your jury and your executioner all wrapped up in one, acting in total secrecy and without your even knowing that he’s accused you and sentenced you to death, and you have no opportunity even to know about, let alone confront and address, his accusations; is that not enough due process for you?

Said Hina Shamsi, director of the American Civil Liberties Union’s National Security Project:

Few things are as dangerous to American liberty as the proposition that the government should be able to kill citizens anywhere in the world on the basis of legal standards and evidence that are never submitted to a court, either before or after the fact. Anyone willing to trust President Obama with the power to secretly declare an American citizen an enemy of the state and order his extrajudicial killing should ask whether they would be willing to trust the next president with that dangerous power.

Finally, I give you my colleague Andrew Cohen:

Anyone who cares about this issue at all understands that what matters first is the legal rationale for the administration’s drone-strike policy. We need to know what the legal arguments are for such proclamations by the executive branch that, for example, the due process clause of the Constitution does not guarantee “judicial process” when a citizen’s life is on the line. What Holder delivered instead was what we already know — the political rationale for the “targeted killing” program. The New York Times told us that years ago.

It’s just not good enough to offer general platitudes about adherence to the Constitution. Everybody says that. The scoundrels who drafted the “torture memos” said that. It means nothing without specifics. And the memo was short on legal specifics. For example, only two federal statutes were cited, both having nothing to do with the drone-strike program. On that we got from Holder phrases like this: “This is an indicator of our times — not a departure from our laws and our values.” Honestly, it’s both, right?


Image credit: Reuters

March 9, 2012

I awoke this morning (Friday a.m., on the other side of the Pacific) to find a slew of emails asking whether I had slacked off in proselytizing about my anti-”false equivalence” campaign to other members of the Atlantic’s staff. And apparently I have!

For those joining us late: in the five-plus years since they lost control of the Senate, Sen. Mitch McConnell and his Republican minority have dramatically ramped up the modern trend of subjecting almost everything the Senate does to the threat of a filibuster. Since it takes 60 votes to break a filibuster, versus only 51 to approve a bill or a nomination in the usual way; since the Democrats enjoyed a 60-vote coalition for only a few months in late 2009 and early 2010*; and since in modern practice the mere threat of a filibuster, rather than the full-blown Mr. Smith Goes to Washington-style speechathon, suffices to prevent a vote, through most of the Obama era the Republicans have been able to block unprecedented numbers of nominations and bills.

The “false equivalence” problem, as applied to the filibuster, is the media’s acquiescence in and routinization of the process. This happens when news accounts say that it takes 60 votes to “pass” or “approve” or “enact” a bill, rather than that we’re talking about the once-exceptional tool of the filibuster being applied day in and day out. After a while, people forget that it’s not so. The Washington Post has done this; so, occasionally, have NPR and the New York Times.

And so, today, has the Atlantic Wire. The analysis of a very interesting chart (below), showing the extreme polarization of the Senate, includes this line: “the Senate — with its reliance on supermajority procedural votes — was not designed to be a partisan, majority rules body like the House of Representatives.”


Nope! The Senate is indeed lumbered these days with supermajority requirements. But it was unambiguously designed to be a majority-rule body. You can look it up! The Constitution lays out a few narrow supermajority requirements: treaties, impeachment, etc. Otherwise, the majority rules, or is supposed to. The clearest evidence is the provision for the vice president to break a tie, if the two sides are “equally divided.”

What reassures me is knowing that the Republicans and then much of the press will remind us of the sacred importance of majority rule when control of the Senate changes again, as sooner or later it inevitably will. Meanwhile, we at the Atlantic will look, at least for a minute, at the beam in our eye rather than the mote that is anywhere else.

* The Democrats got their 60th vote in July 2009, when Al Franken was sworn in after the bitterly contested and frequently recounted Minnesota Senate race. That 60-vote “Democratic” bloc included two independents, Bernie Sanders of Vermont and Joe Lieberman of Connecticut, plus Arlen Specter of Pennsylvania, who had switched to the Dems a few months earlier. The Democrats fell back to 59-vote “minority” status early in 2010, after Edward Kennedy’s death the previous August and the capture of his Massachusetts seat by Republican Scott Brown, who beat Martha “Who is This Curt Schilling You Speak Of?” Coakley in a special election.

