
Sunday, January 23, 2011

The war that made Bill Clinton president


Remember when every major Democrat in America was scared to run against George H.W. Bush?

Where Do Bad Ideas Come From?

 

And why don't they go away?

BY STEPHEN M. WALT | JANUARY/FEBRUARY 2011

We would all like to think that humankind is getting smarter and wiser and that our past blunders won't be repeated. Bookshelves are filled with such reassuring pronouncements, from the sage advice offered by Richard Neustadt and Ernest May in Thinking in Time: The Uses of History for Decision Makers to the rosy forecasts of Matt Ridley's The Rational Optimist: How Prosperity Evolves, not to mention Francis Fukuyama's famously premature claim that humanity had reached "the end of history." Encouraging forecasts such as these rest in part on the belief that we can learn the right lessons from the past and cast discredited ideas onto the ash heap of history, where they belong.
Those who think that humanity is making steady if fitful progress might point to the gradual spread of more representative forms of government, the largely successful campaign to eradicate slavery, the dramatic improvements in public health over the past two centuries, the broad consensus that market systems outperform centrally planned economies, or the growing recognition that action must be taken to address humanity's impact on the environment. An optimist might also point to the gradual decline in global violence since the Cold War. In each case, one can plausibly argue that human welfare improved as new knowledge challenged and eventually overthrew popular dogmas, including cherished but wrongheaded ideas, from aristocracy to mercantilism, that had been around for centuries.
Yet this sadly turns out to be no universal law: There is no inexorable evolutionary march that replaces our bad, old ideas with smart, new ones. If anything, the story of the last few decades of international relations can just as easily be read as the maddening persistence of dubious thinking. Like crab grass and kudzu, misguided notions are frustratingly resilient, hard to stamp out no matter how much trouble they have caused in the past and no matter how many scholarly studies have undermined their basic claims.
Consider, for example, the infamous "domino theory," which has been kicking around in one form or another since President Dwight D. Eisenhower's 1954 "falling dominoes" speech. During the Vietnam War, plenty of serious people argued that a U.S. withdrawal from Vietnam would undermine America's credibility around the world and trigger a wave of pro-Soviet realignments. No significant dominoes fell after U.S. troops withdrew in 1975, however, and it was the Berlin Wall that eventually toppled instead. Various scholars examined the domino theory in detail and found little historical or contemporary evidence to support it.
Although the domino theory seemed to have been dealt a fatal blow in the wake of the Vietnam War, it has re-emerged, phoenix-like, in the current debate over Afghanistan. We are once again being told that if the United States withdraws from Afghanistan before achieving a clear victory, its credibility will be called into question, al Qaeda and Iran will be emboldened, Pakistan could be imperiled, and NATO's unity and resolve might be fatally compromised. Back in 2008, Secretary of State Condoleezza Rice called Afghanistan an "important test of the credibility of NATO," and President Barack Obama made the same claim in late 2009 when he announced his decision to send 30,000 more troops there. Obama also justified his decision by claiming that a Taliban victory in Afghanistan would spread instability to Pakistan. Despite a dearth of evidence to support these alarmist predictions, it's almost impossible to quash the fear that a single setback in a strategic backwater will unleash a cascade of falling dominoes.
There are other cases in which the lessons of the past -- sadly unlearned -- should have been even more obvious because they came in the form of truly devastating catastrophes. Germany's defeat in World War I, for example, should seemingly have seared into Germans' collective consciousness the lesson that trying to establish hegemony in Europe was almost certain to lead to disaster. Yet a mere 20 years later, Adolf Hitler led Germany into another world war to achieve that goal, only to suffer an even more devastating defeat.
Similarly, the French experience in Vietnam and Algeria should have taught American leaders to stay out of colonial independence struggles. In fact, French leaders warned Lyndon B. Johnson that the United States would lose in Vietnam, but the U.S. president ignored their advice and plunged into a losing war. The resulting disastrous experience in Vietnam presumably should have taught future presidents not to order the military to do "regime change" and "nation-building" in the developing world. Yet the United States has spent much of the past decade trying to do precisely that in Iraq and Afghanistan, at great cost and with scant success.
Why is it so hard for states to learn from history and, especially, from their own mistakes? And when they do learn, why are some of those lessons so easily forgotten? Moreover, why do discredited ideas come back into fashion when there is no good reason to resurrect them? Clearly, learning the right lessons -- and remembering them over time -- is a lot harder than it seems. But why?
THE LIMITS OF KNOWLEDGE
For starters, even smart people with good intentions have difficulty learning the right lessons from history because there are relatively few iron laws of foreign policy and the facts about each case are rarely incontrovertible.
And unfortunately, the theories that seek to explain what causes what are relatively crude. When a policy fails, reasonable people often disagree about why success proved elusive. Did the United States lose in Vietnam because the task was inherently too difficult, because it employed the wrong military strategy, or because media coverage undermined support back home? Interpreting an apparent success is no easier: Did violence in Iraq decline in 2007 because of the "surge" of U.S. troops, because al Qaeda affiliates there overplayed their hand, or because ethnic cleansing had created homogeneous neighborhoods that made it harder for Shiites and Sunnis to target each other? The implications for today depend on which of these interpretations you believe, which means that consensus about the "lessons" of these conflicts will be elusive and fragile.
What's more, even when past failures have discredited a policy, those who want to resurrect it can argue that new knowledge, new technology, or a clever new strategy will allow them to succeed where their predecessors failed. For more than 20 years, for example, a combination of academic economists and influential figures in the finance industry convinced many people that we had overcome the laws of economic gravity -- that sophisticated financial models and improved techniques of risk management like financial derivatives allowed governments to relax existing regulations on financial markets. This new knowledge, they argued, permitted a vast expansion of new credit with little risk of financial collapse. They were tragically wrong, of course, but a lot of smart people believed them.
Similarly, the Vietnam War did teach a generation of U.S. leaders to be wary of getting dragged into counterinsurgency wars. That cautious attitude was reflected in the so-called Powell doctrine, which dictated that the United States intervene only when its vital interests were at stake, rely on overwhelming force, and identify a clear exit strategy in advance. Yet after the U.S. military routed the Taliban in 2001, key figures in President George W. Bush's administration became convinced that the innovative use of special forces, precision munitions, and high-tech information management (together dubbed a "revolution in military affairs") would enable the United States to overthrow enemy governments quickly and cheaply and avoid lengthy occupations, in sharp contrast to past experience. The caution that inspired the Powell doctrine was cast aside, and the result was the war in Iraq, which dragged on for almost eight years, and the war in Afghanistan, where the United States seems mired in an endless occupation.
STRONG BUT FOOLISH STATES
All countries have obvious incentives to learn from past mistakes, but those that have successfully risen to the status of great powers may be less inclined to adapt quickly in the future. When it comes to learning the right lessons, paradoxically, nothing fails like prior success.
This wouldn't seem to make sense. After all, strong and wealthy states can afford to devote a lot of resources to analyzing important foreign-policy problems. But then again, when states are really powerful, the negative consequences of foolish behavior rarely prove fatal. Just as America's "Big Three" automakers were so large and dominant they could resist reform and innovation despite ample signs that foreign competition was rapidly overtaking them, strong and wealthy states can keep misguided policies in place and still manage to limp along for many years.
The history of the Soviet Union offers an apt example of this phenomenon. Soviet-style communism was woefully inefficient and brutally inhumane, and its Marxist-Leninist ideology both alarmed the capitalist world and created bitter schisms within the international communist movement. Yet the Soviet Union survived for almost 70 years and was one of the world's two superpowers for more than four decades. The United States has also suffered serious self-inflicted wounds on the foreign-policy front in recent decades, but the consequences have not been so severe as to compel a broader reassessment of the ideas and strategies that have underpinned many of these mistakes.
The tendency to cling to questionable ideas or failed practices will be particularly strong when a set of policy initiatives is bound up in a great power's ruling ideology or political culture. Soviet leaders could never quite abandon the idea of world revolution, and defenders of British and French colonialism continued to see it as the "white man's burden" or "la mission civilisatrice." Today, U.S. leaders remain stubbornly committed to the goals of nation-building and democracy promotion despite their discouraging track record with these endeavors.
Yet because the universal ideals of liberty and democracy are core American principles, it is hard for U.S. leaders to acknowledge that other societies cannot be readily remade in America's image. Even when U.S. leaders recognize that they cannot create "some sort of Central Asian Valhalla," as Defense Secretary Robert Gates acknowledged in 2009, they continue to spend billions of dollars trying to build democracy in Afghanistan, a largely traditional society that has never had a strong central state, let alone a democratic one.
CLOSED SOCIETIES AND CLOSED MINDS
In theory, democracies like the United States should have a built-in advantage. When governments stifle debate and restrict the public's access to information, bogus ideas and misguided policies are less likely to be exposed and either corrected or abandoned. In his masterful study of human-induced folly, Seeing Like a State: How Certain Schemes to Improve the Human Condition Have Failed, James Scott argues that great man-made disasters arise when authoritarian governments pursue radical social transformations that are based on supposedly "scientific" principles such as Marxism-Leninism or Swiss architect Le Corbusier's "urban modernism." Because such schemes epitomize a certain notion of "progress" and also enhance central control, ambitious political leaders are understandably drawn to them. But because authoritarian regimes routinely suppress dissent, these same leaders may not learn that their ambitious schemes are failing until it is too late to prevent catastrophe. In the same vein, Nobel-winning economist Amartya Sen famously argued that authoritarian regimes are more prone to mass famines because such regimes lack the accountability and feedback mechanisms that give rulers a strong incentive to identify and correct mistakes in a timely manner.
Yet democracies, though they conceal less information and hold leaders more accountable, are hardly immune to similar pathologies. We often assume that open discourse and democratic debate will winnow out foolish policies before they can do much damage, but the "marketplace of ideas" that is supposed to perform that function is far from perfect. In separate accounts of the Bush administration's successful campaign to sell the invasion of Iraq, political scientist Chaim Kaufmann and New York Times columnist Frank Rich showed how easily democratic leaders can convince skeptical publics to go to war. Bush and his colleagues built support for the invasion by framing options in deliberately biased ways, manipulating a highly deferential media, and exploiting their control over classified information. The result was a nearly nonexistent debate about the wisdom of the war, with the deck heavily stacked in the president's favor.
The same strategy works to shield leaders from accountability: By concealing information behind walls of secrecy and classification, democratic as well as nondemocratic governments can cover up embarrassing policy failures and make it difficult to learn the right lessons from past mistakes. If citizens and scholars do not know what government officials have done and what the subsequent consequences of their actions are, it is impossible for them to assess whether those hidden policies made sense. To take an obvious current example, it is impossible for outside observers to evaluate the merits of U.S. drone attacks on suspected al Qaeda leaders without detailed information about the actual success rate, including the number of missed targets and innocent civilians killed, as well as the effects of those deaths on terrorist recruitment and anti-American attitudes more generally. When Pentagon officials tell us that increased drone strikes are working, we just have to take them at their word. Maybe they're right, but if they aren't, we won't know until long after the damage has been done.
The same problem can also arise when information is widely available but a subject is considered taboo and thus outside the boundaries of acceptable public discourse. John Mearsheimer and I argue that U.S. policy in the Middle East suffers from this problem because it is nearly impossible for American policymakers and politicians to question Washington's "special relationship" with Israel or criticize Israeli policy without triggering a hostile reaction, including being smeared as an anti-Semite or a "self-hating Jew." Ironically, making it harder for U.S. officials to tell Israel when its actions are misguided is harmful to Israel; a more open discourse and a more normal relationship would be better for both countries.
A similar taboo seems to be emerging in the realm of civil-military relations. With the United States mired in two lengthy conflicts, American politicians feel a need to constantly reiterate their support for "the troops" and their respect for the generals who run our wars, especially media-savvy commanders like Gen. David Petraeus. Criticizing the military would invite others to question one's patriotism and therefore is out of bounds. This trend is not healthy because civilians who are overly deferential to the military are unlikely to question military advice, even when it might be bad for the troops as well as the country. But generals are as fallible as the rest of us and should not receive a free pass from their civilian counterparts.
In short, whenever it becomes politically dangerous to challenge prevailing orthodoxies, misplaced policies are more likely to go unquestioned and uncorrected. Wouldn't it have been better if more well-placed people had objected to the U.S. decision to build massive nuclear overkill (including 30,000-plus nuclear warheads) during the Cold War, questioned the enduring fears of "monolithic communism" and Soviet military superiority, or challenged the wisdom of three decades of financial deregulation? Some did express such qualms, of course, but doing so loudly and persistently was a good way to find oneself excluded from the political mainstream and certainly from the highest corridors of power.
CUI BONO? BAD IDEAS COME FROM SOMEWHERE
Perhaps the most obvious reason why foolish ideas persist is that someone has an interest in defending or promoting them. Although open debate is supposed to weed out dubious ideas and allow facts and logic to guide the policy process, it often doesn't work that way. Self-interested actors who are deeply committed to a particular agenda can distort the marketplace of ideas.
A case in point is the long-standing U.S. embargo on Cuba, imposed in 1960 with the purpose of toppling Fidel Castro. It is hard to think of a better example of a failed policy that has remained in place for decades despite clear evidence that it is not just a failure, but actively counterproductive. If the embargo were going to bring Castro down, it surely would have happened by now, yet it is kept alive by the political influence of the Cuban-American lobby. Protectionist tariffs and farm subsidies illustrate the same problem. Every undergraduate economics major knows that these programs waste money and reduce overall consumer welfare; yet farmers, factory owners, and labor unions threatened by foreign competition routinely demand subsidies or protection, and they too often receive it. The same thing is true for costly initiatives like ballistic-missile defense, which has been assiduously promoted by aerospace and defense contractors with an obvious interest in getting the Pentagon to fill their coffers at public expense -- never mind that it might not actually work.
Even in areas where there is a clear scientific consensus, like climate change, public discourse has been distorted by well-organized campaigns to discredit the evidence and deny that any problem exists. Not surprisingly, those whose economic interests would be hurt if we significantly reduced our reliance on fossil fuels have aggressively funded such campaigns.
In the United States, the problem of self-interested individuals and groups interfering in the policy process appears to be getting worse, in good part because of the growing number of think tanks and "research" organizations linked to special interests.
Organizations like the American Enterprise Institute, the Center for a New American Security, the Washington Institute for Near East Policy, and the Center for American Progress -- to name but a few -- are not politically neutral institutions, in that their ultimate purpose is to assemble and disseminate arguments that advance a particular worldview or a specific policy agenda. The people who work at these institutions no doubt see themselves as doing serious and objective analysis -- and many probably are -- but such organizations are unlikely to recruit or retain anyone whose research challenges the organization's central aims. Their raison d'être, after all, is the promotion of policies favored by their founders and sponsors.
In addition to advocating bad ideas even after they have been found wanting, many of these institutions also make it harder to hold public officials accountable for major policy blunders. For example, one would think that the disastrous war in Iraq would have discredited and sidelined the neoconservatives who dreamed up the idea and promoted it so assiduously. Once out of office, however, they returned to friendly think tanks and other inside-the-Beltway sinecures and resumed their efforts to promote the discredited policies they had favored when they were in government. When a country's foreign-policy elite is insulated from failure and hardly anyone is held accountable, it will be especially difficult to learn from the past and formulate wiser policies in the future.
The rise of the Internet and blogosphere may have facilitated more open and freewheeling public debate about controversial issues, and websites like YouTube and WikiLeaks have fostered greater transparency and made the marketplace of ideas somewhat more efficient. In the blogosphere, at least, it is no longer taboo to talk critically about the "special relationship" with Israel, even if politicians and mainstream media figures remain reticent.
Nevertheless, there is a downside to these encouraging developments. The proliferation of websites and cable news outlets encourages some people to consume only the news and analysis that reinforce their existing views. Thus, a 2010 survey by the Pew Research Center for the People and the Press found that 80 percent of those who regularly listen to radio host Rush Limbaugh or watch Fox News's Sean Hannity are conservatives, even though conservatives make up only 36 percent of the U.S. population. Similarly, the audience for MSNBC's Keith Olbermann and Rachel Maddow contains nearly twice the proportion of liberals found in the general public.
Moreover, competition between a growing number of news outlets seems to be fostering a media environment in which reasoned discourse matters less than entertainment value. Anyone who thinks that major issues of public policy should be dealt with on the basis of logic and evidence cannot help but be alarmed by the growing prominence of Glenn Beck and the know-nothing defiance of the Tea Party.
THE UNITED STATES OF AMNESIA
Last but not least, discredited ideas sometimes come back to life because societies simply forget important lessons about the past. Political psychologists generally agree that personal experiences have a disproportionate impact on our political beliefs, and lessons learned by older generations rarely resonate as strongly with their successors. And besides, as the years go by it becomes easier to argue that circumstances have changed and that "things are different now," encouraging the wrong-headed view that previous wisdoms about how to deal with particular problems might no longer hold. Of course, sometimes those arguments will be correct -- there are few timeless verities in political life -- and even seemingly unassailable truths might someday be seriously challenged if not discredited. All this just further complicates the problem of learning and retaining the right lessons from the past.
Regrettably, there is no hope of ever making the learning process work smoothly and flawlessly -- which is all the more reason why we have little choice but to be wary of firmly entrenched conventional wisdoms, wherever they're from, and relentlessly question our own judgments about the past as well.
For it just might turn out that a radically different version of events is the correct one, closer to the truth than our present reading of the past. Vigorous, unfettered, yet civil debate remains the most reliable mechanism for acquiring greater wisdom for the future. In the long run we are all dead, as John Maynard Keynes memorably quipped, but humanity could at least get something out of it.

Kristol, Kalecki, and a 19th Century Economist Defending Patriarchy all on Political Macroeconomics.

Rortybomb


Posted in Regulatory Reform by Mike on January 21, 2011
Gold! McKinley, campaigning on a Gold Standard.
Amateur political ideology speculating and ranting time. Paul Krugman has been wondering about monetary morality lately, and more generally what is causing conservatives, Republicans and indifferent elites to believe Dark Age things about the economy. Some are Austrian economists who have their own models and thoughts. But I think we see three general trends in thought, each captured by one of the quotes below: supply-side myopia, business leaders pleading for more control, and the 'Natural' part of money.
Irving Kristol, The Heat Death of Supply-Side Economics
Trying to explain to his fellow neoconservatives that the new Reagan-era supply-side economics wasn't voodoo, Irving Kristol wrote a 1981 article, "Ideology and Supply-Side Economics," in Commentary magazine. A quote:
…Clearly, a great many people are nervous about “supply-side” economics, and seem to have difficulty understanding its rationale….Indeed, the trouble with the thing we call supply-side economics is that it is just too simple, too easy to understand…
It originates in deliberate contrast to the prevailing Keynesian approach, which emphasizes the need for government to manage and manipulate, through fiscal and monetary policies, aggregate demand so as to maintain full employment.  Supply-side economists say government cannot really do this, no matter how many clever economists it hires, but that if business enterprise is permitted to function with a minimum of interference, it will invest and innovate, so as to create the requisite demand for the goods it produces.
Here Kristol cleverly flips “full employment” arguments on their head by insisting that the natural result of a free market when government doesn’t step in and regulate, adjust demand, etc. is full employment.  If we want to get unemployment down and create jobs, the best thing to do is for government to get out of the way.   This ideology has captured the minds of most of our elites.
I realized that economists, elites, opinion-leaders, etc. were going to have to make an about-face and concede that the ideology of supply-side economics isn't relevant for this crisis. Yet opinion-leaders breathe a deep sigh of relief when they learn of some contorted statistic that argues we have a structural problem. Thank God the government doesn't have to act!
The constant digging for something the government is doing that is causing 15 million people to be unemployed is not limited to opinion-leaders. In our Survey of 30 Conservative Economists (Part One, Part Two), besides the Gold Bug tendencies, all the policy prescriptions were for tax cuts for the rich, ending unemployment insurance, and stopping "policy uncertainty." "A tone and atmosphere of hostility from the government towards the general business community" scored high as a major factor in unemployment. Which leads me to…
Michal Kalecki, Our Narcissistic Business Elites
From time to time I get to meet people in what you would call the Professional Class. Lately I've noticed there's a common critique of President Obama. Are you ready for it? It goes something like "He's alienating business. No wonder employment is suffering if he's done such a terrible job of including the business community." I wish I could tell you I say something clever in response, or drop a neat factoid or statistic, but normally I am just concentrating on keeping my head from exploding like in that movie Scanners.
Which reminded me I wanted to blog about Michal Kalecki’s indictment of the business community during the Great Depression in his masterful 1943 essay Political Aspects of Full Employment. Why doesn’t the business community step up and demand more investment? Kalecki:
We shall deal first with the reluctance of the ‘captains of industry’ to accept government intervention in the matter of employment. Every widening of state activity is looked upon by business with suspicion, but the creation of employment by government spending has a special aspect which makes the opposition particularly intense. Under a laissez-faire system the level of employment depends to a great extent on the so-called state of confidence. If this deteriorates, private investment declines, which results in a fall of output and employment (both directly and through the secondary effect of the fall in incomes upon consumption and investment). This gives the capitalists a powerful indirect control over government policy: everything which may shake the state of confidence must be carefully avoided because it would cause an economic crisis. But once the government learns the trick of increasing employment by its own purchases, this powerful controlling device loses its effectiveness. Hence budget deficits necessary to carry out government intervention must be regarded as perilous. The social function of the doctrine of ‘sound finance’ is to make the level of employment dependent on the state of confidence.
Like Brad DeLong encountering Kalecki in another context, when I first read that passage I thought it was iffy. Now I don't.
It seems that there is an increasing sense among certain types of neoliberal business elites that Obama hasn't been favorable enough to the business community, and that he needs to reconcile this fast in order to fix the economy. Look at Peter Baker's article on Obama's jobs programs, which doesn't mention housing but does stop by the Chamber of Commerce to note: "But Obama's periodic forays into populism made it personal. He couldn't seem to decide whether he was going to take Wall Street to task for its irresponsible behavior or cajole it into freeing up money to get the economy moving. One day he derided "fat-cat bankers" who caused the recession; another day, he soothed them by saying that he and the American people "don't begrudge" multimillion-dollar bonuses."
Think about that. Why is the fact that Obama once called members of the financial sector "fat-cat bankers" – while they were rolling the taxpayer, paying huge bonuses out of TARP and the Federal Reserve – important for an article about jobs? Is the idea that business leaders feel sad and offended important for why demand is down and unemployment is high? The government can act, through monetary and fiscal policy, to get jobs going again, but that leaves the business elite not in charge of the economy.
Yet here we are, with the confidence of the business sector being the main anxiety of our recovery, reading article after article arguing that Obama needs to appoint more business leaders to key positions to make the business community feel included.  We’ll discuss it more in 3 months when unemployment is still high and elites argue that President Obama should immediately replace Biden with a Xerox CEO.
To bring it back to conservatives, if you are the type who worries about (yet secretly hopes for) the time in which the business community leaves our country to create a gulch utopia, Obama saying very polite things about business leaders while letting them write regulatory rules is much more important than whether or not he gets people appointed to the Federal Reserve and draws lines in the sand over government stimulus.
A 19th Century Economist Defending Patriarchy
In 1889, the economist Francis A. Walker wrote a book titled Money in Its Relation to Trade and Industry. Among many other things, he argued:
The social effects of a paper-money inflation are so fresh in the mind, through our recollections of our own Greenback Era, that I need not recall the wanton bravery of apparel and equipage; the creation of a countless host of artificial necessities in the family beyond the power of the husband and father to supply without a resort to questionable devices or reckless speculations, or to drafts on the proper business capital or the once sacred family reserve; the humiliating imitation of foreign habits of living, with but the faintest conception of the modes of thought and feeling and the customs of social intercourse which underlie them abroad; the loss of that fit and natural leadership of taste and fashion which is the best protection society can have against sordid material aims, and manners at once gross and effeminate, against democracy without equality or fraternity, and exclusiveness without nobility or pride of character.
Paper money decreases the power of the husband over his wife and the father over his family, loosens the natural leadership that serves as the best protection against “effeminate” manners, and gives us a democracy without nobility.
Which is to say, if you are a person who tends to use a capital N “Natural” to describe your political ideology (“I believe in a Natural Order with a Natural Hierarchy, which I get from my engagement with Natural Rights as observed through Natural Law….”), as many conservatives do, then you are going to be likely to think that the dollar is a Natural Thing too.  Like women wearing pants and voting, any attempt to disrupt the Natural Order is going to be dangerous.  That the value of a dollar is a social creation, and that if there is excessive demand for money the government should provide extra supply for money, isn’t going to be a convincing argument.
Michael O'Malley has written (h/t matthewstoller) some excellent stuff about this fight from a century ago ("Gold-standard arguments reflected a more generalized concern about and fascination with the insubstantiality of character, race, and value in labor…").
So if you are a type who believes the government can only do bad, who believes that prosperity flows from how appreciated the business community feels, and who believes strongly in the Natural Order, then you are not going to be in favor of activist monetary and fiscal policy to fix the economy.  You also won’t have any actual coherent view of what is wrong with the economy.

As State of the Union Nears, Congress Plays Musical Chairs


Photo: President Obama on Friday at the General Electric plant in Schenectady, N.Y., where he named an economic adviser. On Tuesday, he will deliver the annual State of the Union address to Congress. (Drew Angerer/The New York Times)
WASHINGTON — Mary from Louisiana asked Olympia from Maine because they are BFFs, but had a backup in Bob from Tennessee in case she was rebuffed. Kirsten from New York went the Sadie Hawkins route and asked John from South Dakota, and thus the deal between two members of the Senate with seriously good hair was sealed.
The talk in the West Wing may center on what President Obama plans to say on Tuesday in his State of the Union address to Congress about the still-ailing economy, or United States-China relations, or his education agenda. But here on Capitol Hill, the talk for the last few days has been all about the seating for the president’s speech and just who will be next to whom.
Ever since Senator Mark Udall, Democrat of Colorado, pushed for lawmakers of both parties to mix it up rather than sit among their own in the House chamber as if the other side has cooties, there has been a mad scramble among lawmakers for just the right partner.
Senator Charles E. Schumer, Democrat of New York, was early out of the box, saying he would sit next to his political antipode, Senator Tom Coburn, the conservative Republican gentleman from Oklahoma.
Others are doing it by delegation; for instance, Colorado's two Democratic senators and its four House Republicans will assemble as a group. Illinois's bipartisan Senate duo, Richard J. Durbin and Mark Steven Kirk, will be joined at the seat, as will the one from Pennsylvania, Bob Casey and Pat Toomey.
Sometimes the link is shared interests, which in Washington does not mean cooking or cycling but committee assignments.
“I asked one of my best girlfriends to be my date for the night,” Senator Mary L. Landrieu, Democrat of Louisiana, said of her choice, Senator Olympia J. Snowe, Republican of Maine. “Of course, we share the Small Business Committee.”
“I had backups in case she said no, like Corker or Isakson,” Ms. Landrieu said, referring to Senators Bob of Tennessee and Johnny of Georgia. “These are really great guys. So, we may do a triple date.”
Others who have paired off include Senators Kirsten Gillibrand, Democrat of New York, and John Thune, Republican of South Dakota, generally considered two of the more well-coiffed and attractive members of the Senate.
The idea of mixing and mingling was originally advocated by the centrist group Third Way after the Tucson shooting that left Representative Gabrielle Giffords, a moderate Democrat from Arizona, critically wounded and spurred calls for a more civilized political discourse.
Mr. Udall quickly embraced it as a way for lawmakers to create new signs of civility visible to the public. It would be a stark contrast from previous years when the two sides of the aisle appeared to be listening to different speeches from different presidents, with Republicans leaping to their feet at the mention of tax cuts, for example, and Democrats embracing pledges of support for social programs.
Since mere moments after the idea was broached, lawmakers have also found themselves under steady questioning from the news media — local and national — demanding to know just whom they plan to sit with. It has made for some pressure, perhaps even some sweaty palms, in finding an available partner.
"Steny Hoyer and I try to talk quite often," Representative Kevin McCarthy of California, the No. 3 House Republican, told reporters, making his availability quite clear. "I would enjoy sitting next to him."
Not everyone, though, is feeling the vibe.
“I already believe very firmly that it is a trap and a ruse that Democrats are proposing,” Representative Paul Broun, a conservative Republican from Georgia, said in a radio interview. Other Republicans have also scoffed at the idea as childish and irrelevant, calling it an effort to muzzle Republicans and prevent them from expressing reservations about Mr. Obama’s speech.
Asked whom the Republican Senate leader, Mitch McConnell of Kentucky, would sit with, his spokesman, Don Stewart, said, “Whoever sits next to him.”
President George Washington delivered his first regular Annual Message to a joint session of Congress in New York City on Jan. 8, 1790. Thomas Jefferson decided to put his message to Congress in writing in 1801, a practice that was followed by subsequent presidents until Woodrow Wilson traveled to the Capitol to deliver his personally to a joint session of Congress in 1913, restoring a tradition that has continued.
Both parties agree that in recent years the event has become something of a partisan pep rally as the strictly divided seating arrangement in the House, which originated in the mid-1800s between Whigs and Democrats, took hold.
Some forced bipartisan fellowship has taken place behind presidents as they gave their addresses, as Democratic speakers of the House and Republican vice presidents — and vice versa — have taken their seats. This year’s version will feature the bipartisan duo of Vice President Joseph R. Biden Jr. and Representative John A. Boehner of Ohio, the new Republican House speaker.
Despite his having no choice in his seatmate, Mr. Biden embraced the new gesture, telling House Democrats at their retreat in Maryland on Friday that while it was a small step, it might be an important one.
“Hopefully it has the effect of generating the beginnings of a slightly different atmosphere,” Mr. Biden said. “Because, folks, if we don’t change it, if we don’t begin to get some kind of cooperation among us, I’m not sure how we deal with the dilemma the American people insist and have a right to insist on us dealing with.”