Monday, January 26, 2009

Watch Out for Stimulus 'Leaks'

The architect of Kennedy's tax cuts would have been skeptical of the Obama plan.

In economics textbooks, the "leaky bucket" principle holds that when government transfers income or wealth from rich to poor, a lot leaks out and is wasted. Some estimates put the leakage at most of what is meant for the poor, especially if you measure the damage to economic efficiency from the reduced work incentives of both the donor and the recipient.

The leaky bucket metaphor was coined years ago by Arthur Okun, a Yale economist who advised John F. Kennedy and later became chief economic adviser to Lyndon B. Johnson. Okun was a leading architect of the Kennedy tax cuts, passed shortly after the president's death with salutary economic effects. Because those cuts sharply reduced the top marginal rates on income, Okun has been called the first supply-sider -- one who worked for Democratic presidents no less. In his day, he would have been called a classical economist.

Although Okun's rule focused specifically on "antipoverty" programs, it can be argued that nearly all government outlays are transfers. Money from taxpayers and the credit markets goes to uses chosen by Congress. Thus, the leaky bucket has special application now as Democrats grope for ways to use wealth transfers to revive the economy.

Even before Barack Obama's inauguration, the Democrat-controlled House crafted a record-breaking $825 billion program to "stimulate" the U.S. economy. One measure -- withholding less income tax from paychecks over the next two years -- is a Keynesian effort to restore consumer demand for goods and services by sweetening take-home pay. Its authors assume that this $140 billion tax break will work better than last spring's $152 billion tax rebate, which seems not to have worked at all, judging by the economic debacle in late 2008.

The central question, however, is not whether "stimulus" programs are ineffective. It is whether they are counterproductive. A case can be made that the bucket not only leaks but that the leaks tend to drown the chances for economic recovery.

Circumstantial evidence that "stimulus" packages actually delay recovery can be derived from the Keynes-guided New Deal of the 1930s, which put great faith in federal deficits as a Depression cure. Federal debt climbed to 43.86% of GDP in 1939 from 16.34% in 1929, with very little relief from hard times. Huge Keynesian deficits in the 1970s did not arrest stagflation. The misery index, the sum of the inflation and unemployment rates, soared above 20%.

Of the $825 billion package not slated for tax credits, the lion's share will go to states and cities, ostensibly for such things as infrastructure repairs. But in reality it is a partial federalization of state and city budgets -- of particular benefit to New York, New Jersey and California -- which are awash in red ink with no prospect of climbing out without federal help. The "stimulus" is partly a matter of the political class in Washington looking after its own in the nation's statehouses. One danger of federalization is that it will reduce competition among states to attract investment, thereby weakening another driver of economic efficiency.

The Troubled Asset Relief Program (TARP) was at least a government attempt to make amends for the damage Washington did to the financial sector through monetary mismanagement and the pollution of the mortgage market by Fannie Mae and Freddie Mac. The second tranche of the $700 billion TARP authorization has now been approved by the Senate. It, like the $825 billion stimulus plan, is attracting lobbyists like flies to honey.

The central defect of government bailouts and stimulus packages is that the money is allocated through a political process. It goes to recipients who have the most political influence. Private entrepreneurs and even big business, by contrast, employ investment to earn a profit. The record shows that the latter yields greater economic efficiency, and hence creates real jobs.

The new stimulus package pays lip service to aiding the private sector with various tax incentives for hiring and investing capital. It acknowledges, just barely, that the private sector will be the engine for recovery if recovery is to be had. But the record for such measures is about as dismal as the one for short-term supplements to consumer income. They do very little to change the decisions or behavior of the recipient. If the recipient is wary and uncertain about the future, he or she will probably remain so.

Socialist economies, where governments decide how to allocate resources, are notoriously less efficient than market-capitalist economies. As in Washington, every politician demands his share. The late Abram Bergson of Harvard, after prodigious research, concluded that the old Soviet Union -- the ultimate in socialism -- employed capital only about half as efficiently as the U.S. That is one reason the Soviets collapsed from economic exhaustion.

Democrats are putting a lot of faith, to the tune of over $1.5 trillion, in economic policies with dodgy track records. At this time of a new president and great expectations, one hopes the political class will succeed better with massive spending than it has in the past. But don't bet the farm.

Geithner Is Exactly Wrong on China Trade

The dollar-yuan link has been a great boon to world prosperity.

Treasury Secretary-designate Tim Geithner's charge that China "manipulates" its currency proves only one thing. Three decades after Deng Xiaoping's capitalist rise, America's misunderstanding of China remains a key source of our own crisis and socialist tilt.

The new consensus is that America failed to react to the building trade deficit with China and the global "savings glut," which fueled our housing boom. A "passive" America allowed China to steal jobs from the U.S. while Americans binged with undervalued Chinese funny money.

This diagnosis is backwards. America did not underreact to the supposed Chinese threat. It overreacted. The problem wasn't "global imbalances" but a purposeful dollar imbalance. Our weak-dollar policy, intended to pump up U.S. manufacturing and close the trade gap, backfired. Currency chaos led to a $30 trillion global crash, an energy shock, bank and auto failures, and possibly a new big government era. For globalization and American innovation to survive, we must first understand the Chinese story and our own monetary mistakes.

We've heard the refrain: China's rapid growth was a mirage. China was stealing wealth by "manipulating" its currency. But in fact China's rise was based on dramatic decentralization and sound money.

After 500 years of inward-looking stagnation, Deng opened 1979 with a bang. He freed 600 million peasants with history's largest tax cut. He emulated Hong Kong and Taiwan by establishing four Special Economic Zones on the sleepy southern coast. Before Beijing hard-liners knew it, mayors across China were demanding similar low-tax, local-control freedoms. By 1993, 8,000 of these entrepreneurial free trade zones had swept the nation. Two hundred fifty million people migrated to this "new China," where tax rates were low and regulations few. Capital poured in from China and the world.

Township and Village Enterprises (TVEs) were an unexpected but powerful innovation. Fiercely competitive and locally owned, these quasi-governmental entities escaped Beijing taxation. Propelled by local knowledge and a zero corporate tax rate, the TVEs accounted for half of China's output by 2000.

China needed an anchor for its complex transformation and in 1994 linked its currency, the yuan, to the U.S. dollar. The dollar-yuan link allowed a real price system to arise in China and created a single economic fabric stretching across the Pacific. Before long, the whole region had adopted what Stanford economist Ronald McKinnon calls the East Asian Dollar Standard.

The opposite of currency "manipulation," this dollar standard was a victory for free trade and global growth. But U.S. economists missed its portent. The Fed and Treasury of the late 1990s did not supply sufficient dollars to match rapidly growing global demand. A scarce dollar shot higher, and hard assets fell. Oil plummeted to $10 a barrel, gold fell to $250 from $400, credit shriveled, and dollar debtors across Asia went bankrupt. With an appreciating dollar and a world in turmoil, capital flooded into the U.S. and especially our soft, intellectual assets -- Cisco, Microsoft and dot-coms. The technology boom and bust was not a function of easy money but of a scarce dollar.

In 2003, Alan Greenspan and Ben Bernanke identified an exotic threat: deflation. The Fed was seven years late. Mr. Greenspan's post-9/11 liquidity had already ended the 1997-2001 deflation. Yet the Fed persisted with 1% interest rates through 2003-04 and easy money thereafter. Meanwhile, Treasury Secretary John Snow targeted China and its trade surplus as a big threat. He and his successor Hank Paulson agitated for a stronger yuan and thus a weaker dollar.

Treasury's trade-deficit mania encouraged anti-China politicians. Messrs. Snow, Greenspan, Paulson and Bernanke several times talked Sens. Chuck Schumer and Lindsey Graham off the protectionist precipice. But the administration did not realize that the weak-dollar policy was itself protectionism.

China was working deep changes on the world economy. Yet in 2003 U.S. manufacturing was 50% larger than in 1994. U.S. knowledge industries were generating most of the world's profits and wealth. American consumers were benefiting from low-cost imports. Meanwhile, many Asian goods were rerouted through China for final assembly. The U.S.-China trade deficit thus grew even as the share of total U.S. imports coming from East Asia fell to below 35% from 40% in 1990.

The real threat was a devalued dollar. In mid-2005, we finally forced China to delink from the dollar and mildly appreciate the yuan. Nevertheless, the trade deficit accelerated. Robert Mundell -- Nobel laureate, China expert, father of the euro and supply-side economics -- continued to warn that the trade deficit was perfectly natural. Worry about currency instability instead.

But other eminent economists urged a "more competitive dollar." On May 13, 2006, this newspaper headlined: "U.S. Goes Along With Dollar's Fall to Ease Trade Gap." All these "more competitive" dollars had to go somewhere, and with amazing efficiency found their way into oil and subprime mortgages.

The weak dollar had the opposite of its intended effect. Cheap-dollar commodities exploded the trade gap. Conceived to make the U.S. "more competitive," the policy channeled money away from technology innovators and into home-building and home-equity consumption. Inflation for a time does pump up demand, and so U.S. consumers bought, and Chinese growth shot even higher. Chinese, Russian and Middle Eastern foreign reserves grew, further depressing the yields of U.S. Treasurys.

Some credit indicators are now improving, but the Fed's past destabilization policy will reverberate. The weak-dollar blunder helped scuttle the Doha Round of trade talks and will make the successful Bush tax cuts difficult to preserve. American interventionism could absolve Europe's anti-innovation "antitrust" policy and excuse China's worst intellectual property violations and "national champion" subsidies.

And yet, with sound-money advocate Paul Volcker in the Obama White House and Mr. Mundell plugged into Beijing, the monetary mayhem of the last decade could give way to a worldwide, sound-money revival in 2009 and beyond.

Mr. Swanson is a senior fellow and director of the Center for Global Innovation at the Progress & Freedom Foundation.

Drug Gangs Have Mexico on the Ropes

Law enforcement south of the border is badly outgunned.

A murder in the Mexican state of Chihuahua last week horrified even hardened crime stoppers. Police Commander Martin Castro's head was severed and left in an ice cooler in front of the police station in the town of Praxedis with a calling card from the Sinaloa drug cartel.

According to Mexico's attorney general, 6,616 people died in drug-trafficking violence in Mexico last year. A high percentage of those killed were themselves criminals, but many law enforcement agents battling organized crime were also murdered. The carnage continues. For the first 22 days of this year the body count is 354.

President Felipe Calderón began an assault on organized crime shortly after he took office in December 2006. It soon became apparent that the cartels would stop at nothing to preserve their operations, and that a state commitment to confrontation meant that violence would escalate.

As bad as the violence is, it could get worse, and it is becoming clear that the U.S. faces contagion. In recent months, several important American voices have raised concerns about the risks north of the border. This means there is hope that the U.S. may begin to recognize the connection between American demand for prohibited substances and the rising instability in Mexico.

The brutality of the traffickers is imponderable for most Americans. Commander Castro was not the first Mexican to be beheaded. It is an increasingly popular terror tactic. Last month, eight soldiers and a state police chief were found decapitated in the state of Guerrero.


There is also plenty of old-fashioned mob violence. As Agence France-Presse reported on Jan. 19 from Chihuahua, 16 others -- besides Commander Castro -- died in suspected drug-related violence across the state the same night. Six bodies were found, with bullet wounds and evidence of torture, in the state capital. Five of the dead were police officers. On the same day, Reuters reported that Mexican vigilante groups appear to be striking back at the cartels.

Tally all this up and what you get is Mexico on the edge of chaos, and a mess that could easily bleed across the border. The U.S. Joint Forces Command in Norfolk, Va., warned recently that an unstable Mexico "could represent a homeland security problem of immense proportions to the United States." In a report titled "Joint Operating Environment 2008," the Command singles out Mexico and Pakistan as potentially failing states. Both "bear consideration for a rapid and sudden collapse . . . . The Mexican possibility may seem less likely, but the government, its politicians, police, and judicial infrastructure are all under sustained assault and pressure by criminal gangs and drug cartels."

The National Drug Threat Assessment for 2009 says that Mexican drug-trafficking organizations now "control most of the U.S. drug market," with distribution capabilities in 230 U.S. cities. The cartels also "maintain cross border communication centers" that use "voice over Internet Protocol, satellite technology (broadband satellite instant messaging), encrypted messaging, cell phone technology, two-way radios, scanner devices, and text messaging, to communicate with members" and even "high-frequency radios with encryption and rolling codes to communicate during cross-border operations."

A report by retired Gen. Barry McCaffrey, the former drug czar, makes similar observations. "The malignancy of drug criminality," he writes, "stretches throughout the U.S. in more than 295 cities." Gen. McCaffrey visited Mexico in December.

Here is how he sees the fight: "The outgunned Mexican law enforcement authorities face armed criminal attacks from platoon-sized units employing night vision goggles, electronic intercept collection, encrypted communications, fairly sophisticated information operations, sea-going submersibles, helicopters and modern transport aviation, automatic weapons, RPG's, Anti-Tank 66 mm rockets, mines and booby traps, heavy machine guns, 50 cal sniper rifles, massive use of military hand grenades, and the most modern models of 40mm grenade machine guns."

How is it that these gangsters are so powerful? Easy. As Gen. McCaffrey notes, Mexico produces an estimated eight metric tons of heroin a year and 10,000 metric tons of marijuana. He also points out that "90% of all U.S. cocaine transits Mexico" and Mexico is "the dominant source of methamphetamine production for the U.S." The drug cartels earn more than $25 billion a year and "repatriate more than $10 billion a year in bulk cash into Mexico from the U.S."

To put it another way, if Mexico is at risk of becoming a failed state, look no further than the large price premium the cartels get for peddling prohibited substances to Americans.

How Modern Law Makes Us Powerless

The real barrier to Barack Obama's 'responsibility' era.

Calling for a "new era of responsibility" in his inaugural address, President Barack Obama reminded us that there are no limits to "what free men and women can achieve." Indeed. America achieved greatness as the can-do society. This is, after all, the country of Thomas Paine and barn raisings, of Grange halls and Google. Other countries shared, at least in part, our political freedoms, but America had something different -- a belief in the power of each individual. President Obama's clarion call of self-determination -- "Yes We Can" -- hearkens back to the core of our culture.


But there's a threshold problem for our new president. Americans don't feel free to reach inside themselves and make a difference. The growth of litigation and regulation has injected a paralyzing uncertainty into everyday choices. All around us are warnings and legal risks. The modern credo is not "Yes We Can" but "No You Can't." Our sense of powerlessness is pervasive. Those who deal with the public are the most discouraged. Most doctors say they wouldn't advise their children to go into medicine. Government service is seen as a bureaucratic morass, not a noble calling. Make a difference? You can't even show basic human kindness for fear of legal action. Teachers across America are instructed never to put an arm around a crying child.

The idea of freedom as personal power got pushed aside in recent decades by a new idea of freedom -- where the focus is on the rights of whoever might disagree. Daily life in America has been transformed. Ordinary choices -- by teachers, doctors, officials, managers, even volunteers -- are paralyzed by legal self-consciousness. Did you check the rules? Who will be responsible if there's an accident? A pediatrician in North Carolina noted that "I don't deal with patients the same way any more. You wouldn't want to say something off the cuff that might be used against you."

Here we stand, facing the worst economy since the Great Depression, and Americans no longer feel free to do anything about it. We have lost the idea, at every level of social life, that people can grab hold of a problem and fix it. Defensiveness has swept across the country like a cold wave. We have become a culture of rule followers, trained to frame every solution in terms of existing law or possible legal risk. The person of responsibility is replaced by the person of caution. When in doubt, don't.

All this law, we're told, is just the price of making sure society is in working order. But society is not working. Disorder disrupts learning all day long in many public schools -- the result in part, studies by NYU Professor Richard Arum found, of the rise of student rights. Health care is like a nervous breakdown in slow motion. Costs are out of control, yet the incentive for doctors is to order whatever tests the insurance will pay for. Taking risks is no longer the badge of courage, but reason enough to get sued. There's an epidemic of child obesity, but kids aren't allowed to take the normal risks of childhood. Broward County, Fla., has even banned running at recess.

The flaw, and the cure, lie in our conception of freedom. We think of freedom as political freedom. We're certainly free to live and work where we want, and to pull the lever in the ballot box. But freedom should also include the power of personal conviction and the authority to use your common sense. Analyzing the American character, Alexis de Tocqueville considered "freedom less necessary in great things than in little ones. . . . Subjection in minor affairs does not drive men to resistance, but it crosses them at every turn, till they are led to sacrifice their own will. Thus their spirit is gradually broken and their character enervated."

This is not an ideological point. Freedom in daily choices is essential for practical reasons -- necessary for government officials and judges as well as for teachers, doctors and entrepreneurs. The new legal order doesn't honor the individuality of human accomplishment. People accomplish things by focusing on the goal, and letting their instincts, mainly subconscious, try to get them there. "Amazingly few people," management guru Peter Drucker observed, "know how they get things done." Most things happen, the philosopher Michael Polanyi wrote, through "the usual process of trial and error by which we feel our way to success." Thomas Edison put it this way: "Nothing that's any good works by itself. You got to make the damn thing work."

Modern law pulls the rug out from under all those human powers and substitutes instead a debilitating self-consciousness. Teachers lose their authority, Prof. Arum found, because the overhang of law causes "hesitation, doubt and weakening of conviction." Skyrocketing health-care costs are impossible to contain as long as doctors go through the day thinking about how they will defend themselves if a sick person sues.

The overlay of law on daily choices destroys the human instinct needed to get things done. Bureaucracy can't teach. Rules don't make things happen. Accomplishment is personal. Anyone who has felt the pride of a job well done knows this.

How do we restore Americans' freedom in daily choices? Freedom is notoriously malleable towards self-interest. "We all declare for liberty," Abraham Lincoln observed, "but in using the same word we do not all mean the same thing."

Freedom, however, is not just a shoving match. Freedom has a formal structure. It has two components:

1) Law sets boundaries that prescribe what we must do or proscribe what we can't do -- you must not steal, you must pay taxes.

2) Those same legal boundaries protect an open field of free choice in all other matters.

The forgotten idea is the second component -- that law must affirmatively define an area free from legal interference. Law must provide "frontiers, not artificially drawn," as philosopher Isaiah Berlin put it, "within which men should be inviolable."

This idea has been lost to our age. When advancing the cause of freedom, law today is all proscription and no protection. There are no boundaries, just a moving mudbank composed of accumulating bureaucracy and whatever claims people unilaterally choose to assert. People wade through law all day long. Any disagreement in the workplace, any accident, any incidental touching of a child, any sick person who gets sicker, any bad grade in school -- you name it. Law has poured into daily life.

The solution is not just to start paring back all the law -- that would take 10 lifetimes, like trying to prune the jungle. We need to abandon the idea that freedom is a legal maze, where each daily choice is like picking the right answer on a multiple-choice test. We need to set a new goal for law -- to define an open area of free choice. This requires judges and legislatures to affirmatively assert social norms of what's reasonable and what's not. "The first requirement of a sound body of law," Justice Oliver Wendell Holmes Jr. wrote, "is that it should correspond with the actual feelings and demands of the community."

The profile of authority structures needed to defend daily freedoms is not hard to imagine. Judges would aspire to keep lawsuits reasonable, understanding that what people sue for ends up defining the boundaries of free interaction. Schools would be run by the instincts and values of the humans in charge -- not by bureaucratic micromanagement -- and be held accountable for how they do. Government officials would have flexibility to meet public goals, also with accountability. Public choices would aspire to balance for the common good, not, generally, to appease someone's rights.

Reviving the can-do spirit that made America great requires a legal overhaul of historic dimension. We must scrape away decades of accumulated legal sediment and replace it with coherent legal goals and authority mechanisms, designed to affirmatively protect individual freedom in daily choices. "A little rebellion now and then is a good thing," Thomas Jefferson wrote to James Madison, "and as necessary in the political world as storms are in the physical . . . ." The goal is not to change our public goals. The goal is to make it possible for free citizens to achieve them.

Mr. Howard, a lawyer, is chair of Common Good (www.commongood.org), and author of the new book "Life Without Lawyers," published this month by W.W. Norton & Co.

Sunday, January 25, 2009

Bad News Is Better Than No News

We need to know what's mucking up the financial system.

We're now more than $1 trillion in taxpayer bailouts into the credit crisis, and the one enduring certainty is uncertainty. There is uncertainty about what caused the problem, uncertainty that either Wall Street or Washington knows what to do, and uncertainty about financial models that measured risk until they didn't. Markets thrive when information flows freely, and they seize up when uncertainty replaces understanding.


So we should cheer a growing consensus that it's time to address the information gaps that caused the financial mess. The best-known unknown is the continuing mystery of the true value of the bad mortgage-backed and other assets held by banks whose collapse sparked the credit crisis. Addressing this basic issue was the original purpose last fall of the $700 billion government bailout program, but the Troubled Asset Relief Program didn't live up to its name, leaving the size of toxic debts unquantified.

Plan B is to go back to Plan A. Regulators urge using new bailout funds to return to the original goal of discovering the true value of these securities. "A continuing barrier to private investment in financial institutions is the large quantity of troubled, hard-to-value assets that remain on institutions' balance sheets," Federal Reserve Chairman Ben Bernanke said in a London speech earlier this month. "The presence of these assets significantly increases uncertainty about the underlying value of these institutions and may inhibit both new private investment and new lending."

Banks can't resume lending because they don't know how unsound they are. Private investors can't know how bad bank debt is, so they hesitate to invest in banks. There are echoes of Japan's experience, where the collapse of the 1980s real-estate bubble became a drag on the economy for years as regulators put off the day of reckoning on the full losses.

Former Federal Reserve Chairman Paul Volcker suggests an updated version of the Resolution Trust Corp., which forced the bad savings-and-loan debts to be marked to market in the late 1980s and early 1990s. Other ideas include creating a "bad bank" supported by the government to aggregate bad debt. Precisely how the bad debt is to be isolated is less important than the implied commitment finally to assess the value of houses, credit-card loans and other debt.

Achieving this price discovery is hard, but it is only half the battle. Before banks can get back on their feet and start lending again, financial professionals need more confidence that they know what went wrong and how to avoid more mispricing of risk.

Here, too, there is reason for optimism. Wall Street has been asking itself: Were its financial models fundamentally flawed, or did flawed financial professionals misuse the models? Take the example of the metric banks used to manage their risk, or so they thought. Value at Risk calculations were developed in the early 1990s at J.P. Morgan to measure different kinds of financial risk with a single number. Banks analyzed historical data to understand the relative riskiness of a $50 million investment in three-year Treasurys versus 30-year Treasurys, or even a $50 million investment in Japanese yen versus 1,000 barrels of oil.

Bankers now recall the fine print of VaR analysis, which is that it always includes a low but real risk that some new element could make the historical data a poor measure of the future. The late Dennis Weatherstone, the J.P. Morgan chairman who led this initiative, used to remind his team that the math of VaR alone could not measure risk in the outlying parts of the bell curve of probabilities. "The reason we pay as much as we do to traders is to manage the risk in the tails of the distribution," derivatives expert Mark Brickell, a former J.P. Morgan managing director, quotes Weatherstone as saying. "That's the hard part. For events inside the tails, it is not so difficult nor so remunerative."
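To make the mechanics concrete, here is a minimal sketch of one common variant, historical-simulation VaR, in Python. The function, the made-up "calm" return series and the $50 million portfolio (echoing the examples above) are illustrative assumptions, not J.P. Morgan's actual models; the point is simply that the resulting number can only reflect whatever history it is fed.

```python
import numpy as np

def historical_var(daily_returns, portfolio_value, confidence=0.99):
    """One-day Value at Risk by historical simulation: rank past daily
    returns and report the loss exceeded only (1 - confidence) of the
    time in that sample."""
    returns = np.asarray(daily_returns)
    # Cutoff return at the chosen confidence level, e.g. the 1st
    # percentile of daily returns for 99% VaR.
    cutoff = np.percentile(returns, 100 * (1 - confidence))
    return -cutoff * portfolio_value

# Illustration with simulated data: a placid history produces a small
# VaR figure, regardless of what tomorrow actually brings.
rng = np.random.default_rng(seed=0)
calm_history = rng.normal(loc=0.0, scale=0.005, size=1000)  # quiet years
var_99 = historical_var(calm_history, portfolio_value=50_000_000)
print(f"99% one-day VaR on a $50 million book: ${var_99:,.0f}")
```

A calculation like this says nothing about the size of losses beyond the cutoff, which is Weatherstone's point about the tails.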

It's now clear that the data that banks used were distorted by years of government initiatives to promote homeownership. Government-mandated loans led house prices ever higher and house-price volatility ever lower. When the VaR models looked back, they wrongly modeled a low risk of default. Wall Street shouldn't make the mistake again of ignoring the impact of politics on economics -- and politicians should find ways to achieve social goals without undermining the integrity of markets.

We've had credit crises before, but as this column has recalled, a century ago J.P. Morgan was able to resolve the Panic of 1907 simply by gathering all the key bankers in his home library and forcing them to measure and accept their losses. As with all credit crises, the sooner ours is resolved, the sooner we can learn its lessons, take our losses, and move on. Tough as this process of discovering the full losses will be on shareholders and taxpayers, the alternative of continued uncertainty and market paralysis is even worse.

The Stimulus Time Machine

That $355 billion in spending isn't about the economy.

The stimulus bill currently steaming through Congress looks like a legislative freight train, but given last week's analysis by the Congressional Budget Office, it is more accurate to think of it as a time machine. That may be the only way to explain how spending on public works in 2011 and beyond will help the economy today.

According to Congressional Budget Office estimates, a mere $26 billion of the House stimulus bill's $355 billion in new spending would actually be spent in the current fiscal year, and just $110 billion would be spent by the end of 2010. This is highly embarrassing given that Congress's justification for passing this bill so urgently is to help the economy right now, if not sooner.

And Congressional faces must be very red indeed, because CBO's analysis has since vanished into thin air after being posted early last week on the Appropriations Committee Web site. Officially, the committee says this is because the estimates have been superseded as the legislation has moved through committee. No doubt.


In addition to suppressing the CBO analysis, Democrats have derided it. Appropriations Chairman David Obey (D., Wis.) called it "off the wall," never mind that CBO is now run by Democrats. Mr. Obey also suggested that it would be a mistake to debate the stimulus "until the cows come home." We'd settle for a month or two, so at least the voters can inspect the various Congressional cattle they're buying with that $355 billion.

The stimulus bill is also a time machine in the sense that it's based on an old, and largely discredited, economic theory. As Harvard economist Robert Barro pointed out on these pages last Thursday, the "stimulus" claim is based on something called the Keynesian "multiplier," which is that each $1 of spending the government "injects" into the economy yields 1.5 times that in greater output. There's little evidence to support this theory, but you have to admire its beauty because it assumes the government can create wealth out of thin air. If it were true, the government should spend $10 trillion and we'd all live in paradise.
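For readers wondering where a figure like 1.5 comes from, here is the textbook arithmetic, offered as a stylized illustration rather than the model the bill's authors actually used. In the simplest Keynesian setup, the spending multiplier depends only on the marginal propensity to consume (MPC):

\[
\text{multiplier} = \frac{1}{1 - \text{MPC}}, \qquad \text{e.g.} \quad \frac{1}{1 - \tfrac{1}{3}} = 1.5 .
\]

On that arithmetic, each $1 of government spending is credited with $1.50 of total output, with nothing subtracted for where the $1 came from.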

The problem is that the money for this spending boom has to come from somewhere, which means it is removed from the private sector as higher taxes or borrowing. For every $1 the government "injects," it must take $1 away from someone else -- either in taxes or by issuing a bond. In either case this leaves $1 less available for private investment or consumption. Mr. Barro wrote about this way back in 1974 in his classic article, "Are Government Bonds Net Wealth?", in the Journal of Political Economy. Larry Summers and Paul Krugman must have missed it.

The government spending will be a net stimulus only if its $1 goes to more productive purposes than those to which private investors would have put that same $1. There are some ways we may want the government to spend money -- on national defense, say -- but that doesn't mean it's a stimulus.

A similar analysis applies to the tax cuts that are part of President Obama's proposal. In contrast to the spending, at least the tax cuts will take effect immediately. But the problem is that Mr. Obama wants them to be temporary, which means taxpayers realize they will see no permanent increase in their after-tax incomes. Not being fools, Americans may either save or spend the money but they aren't likely to change their behavior in ways that will spur growth. For Exhibit A, consider the failure of last February's tax rebate stimulus, which was a bipartisan production of George W. Bush and Mr. Summers, who is now advising Mr. Obama.

To be genuinely stimulating, tax cuts need to be immediate, permanent and on the "margin," meaning that they apply to the next dollar of income that an individual or business earns. This was the principle behind the Kennedy tax cuts of 1964, as well as the Reagan tax cuts of 1981, which finally took full effect on January 1, 1983.

If the Obama Democrats can't abide this because it's a "tax cut for the rich," as an alternative they could slash the corporate tax to spur business incentives. The revenue cost of eliminating the corporate tax wouldn't be any more than their proposed $355 billion in new spending, and we guarantee its "multiplier" effects on growth would be far greater. Research by Mr. Obama's own White House chief economist, Christina Romer, has shown that every $1 in tax cuts can increase output by as much as $3.

As for all of that new spending, CBO will release an updated analysis this week. And we anticipate that the budget analysts will in the interim have discovered that much more of that $355 billion will somehow find its way to "shovel-ready" projects that the Obama Administration can start building before the crocuses bloom. But in the real world, the CBO's first estimate is likely to prove closer to the truth.

The spending portion of the stimulus, in short, isn't really about the economy. It's about promoting long-time Democratic policy goals, such as subsidizing health care for the middle class and promoting alternative energy. The "stimulus" is merely the mother of all political excuses to pack as much of this spending agenda as possible into a single bill when Mr. Obama is at his political zenith.

Apart from the inevitable waste, the Democrats are taking a big political gamble here. Congress and Mr. Obama are promoting this stimulus as the key to economic revival. Americans who know nothing about multipliers or neo-Keynesians expect it to work. The Federal Reserve is pushing trillions of dollars of monetary stimulus into the economy, and perhaps that along with a better bank rescue strategy will make the difference. But if spring and then summer arrive, and the economy is still in recession, Americans are going to start asking what they bought for that $355 billion.

Lincoln's Memo to Obama
by Ronald C. White Jr.

A distinguished Lincoln biographer imagines what advice the 16th president would offer the 44th as he takes office.

Illinois senator Everett Dirksen observed 50 years ago, “The first task of every politician is to get right with Lincoln.” As the inauguration of President Barack Obama converges with the beginning of the Abraham Lincoln Bicentennial, it is intriguing to think about what Lincoln might say across the years to the new president. In recent election campaigns many politicians, both Republicans and Democrats, have tried to associate themselves with Lincoln. President Obama has moved far beyond the invocation of Lincoln’s words to patterning his political spirit after his 19th-century model. Again and again Obama has buttressed his vision for America by beginning, “As Lincoln said. . . .”

Nearly 150 years after his assassination Lincoln continues to captivate us because he eludes our simple definitions and final judgments. Lincoln endured critics who libeled him as “the Black Republican,” “the original gorilla,” and “the dictator.” Obama is rapidly picking up his own libels—Rush Limbaugh has called him “The Messiah” and National Review labeled him “Our Memoirist in Chief.” Pundits always want to apply the conservative/liberal grid to politicians, but these political labels could not define Lincoln, nor can they confine Obama.

I believe Lincoln would begin by offering his own “Yes we can” to the election of America’s first African-American president. Lincoln, the homely westerner with less than one year of formal education, was surprised by his nomination and election as president in 1860. Four years later, when he had become convinced he could not be reelected, he told the men of the 166th Ohio Regiment, “I happen temporarily to occupy this big White House.” He said to the soldiers, “I am a living witness that any one of your children may look to come here as my father’s child has.” How like Lincoln to speak of himself as “my father’s child.” How like Obama to say on the eve of his victory, “If there is anyone out there who still doubts that America is a place where all things are possible . . . tonight is your answer.” In a world of “I,” both leaders pointed beyond themselves to the larger truth of the American “we.”

Lincoln would especially encourage Obama to use his public speeches as a key to his political leadership as president. Our most eloquent president would be distressed to hear the modern shibboleth, "It’s only words." Lincoln, mindful of the role his speeches and public statements would play, did much of his writing in Springfield at a desk on the third floor of his brother-in-law’s office building. In the White House, quite accessible to visitors, he often found time to write very early in the morning in his office (what is now the Lincoln Bedroom). He would write either at the large walnut table in the middle of the room where he convened cabinet meetings, or at an old mahogany writing desk with pigeonholes. Sometimes he would rise and ponder what to write as he gazed out the window at the unfinished Washington Monument.

The private Lincoln might offer some advice to the private Obama. Lincoln generated a running intellectual conversation with himself by developing the habit of writing down his ideas on little slips of paper or on the backs of envelopes. He stored these notes either in his tall silk hat or in the bottom drawer of his desk, ready to be retrieved to serve as the foundations of his finest speeches. Perhaps Obama already does something similar with his ever-present BlackBerry, and if, as reports suggest, he will have to give up this 21st-century technology in the White House, he could do worse than take up Lincoln’s old-fashioned pen and paper.

And what about speechwriters? Lincoln would not understand this modern phenomenon that probably began with FDR but has now become a full-time occupation, with a phalanx of writers backed up by even more researchers. Lincoln would advise Obama to write his own speeches, or at least the major ones.

Lincoln’s renown for compelling oratory has obscured the story of how much of his eloquence was the product of hard editing and rewriting. He might tell Obama the surprising story of his own first inaugural. As he worked on the speech, he showed it to a few Illinois friends who made but one significant suggestion. Arriving in Washington, he decided to give a copy to a new colleague who was not yet a friend: William Seward, the New York senator who had been his chief rival for the Republican nomination and would now be his secretary of state. Lincoln was surely surprised when Seward responded with six pages of suggestions. Seward, who fancied himself a great speaker, told Lincoln to throw out his last paragraph. He offered the president-elect two possible replacements. Lincoln demonstrated his brilliance by editing Seward’s words to make them his own. We know this memorable closing paragraph, with its appeal to “the mystic chords of memory” and “the better angels of our nature,” by the words Lincoln revised to make it read like poetry.

Lincoln might also offer his counsel to President Obama on integrity and ambition.

Lincoln’s moral integrity was the strong trunk from which all the branches of his life grew. His integrity had many roots, including his intimate knowledge of the Bible, the Declaration of Independence, and the Constitution. He may not have read Aristotle’s Treatise on Rhetoric, but he embodied the ancient Greek philosopher’s conviction that persuasive speech is rooted in ethos, or integrity. Lincoln would advise contemporary politicians that the American public knows when they are acting out a political role and when they are speaking with integrity, or what people now call authenticity.

Lincoln wrote candidly of his “peculiar ambition” in his first announcement for public office, in 1832. Barely 23, he offered a definition of ambition worth passing on: “that of being truly esteemed by my fellow men, by rendering myself worthy of their esteem.” Over the years, Lincoln learned to prune the strong branch of personal ambition so that it did not grow out of proportion to his service to others. The biting satire the young Lincoln occasionally dispensed gave way over time to the magnanimity he expressed in the closing benediction of his second inaugural address: “With malice toward none, with charity for all.”

The 16th president would counsel Obama to resist the growing demands to act quickly in response to the admittedly dire crises facing the nation in 2009. During the long interregnum between his election and his inauguration on March 4, 1861, Lincoln found himself under tremendous pressure to declare his policies on the growing Southern secession movement. The pressure only increased when he embarked on a 12-day train trip from Springfield to Washington in February 1861, which allowed him to speak to far more Americans than any previous president. And they expected to hear answers from him.

Lincoln would probably tell Obama that he too had been accused of being distant in the face of pressing political problems. As president, Lincoln emerged as a leader who kept his own counsel. Members of his own party accused him of neither convening nor consulting his cabinet enough.

I think Lincoln might offer a word of caution as President Obama puts in place several layers of economic and national security advisers in today’s admittedly more complex administrative structure. On the one hand, Lincoln would applaud Obama for emulating what he did—surround himself with strong leaders who would provide differing points of view. On the other hand, Lincoln might offer a gentle warning that Obama has appointed far more cooks than he did in the White House kitchen, which could end up spoiling his recipes for change.

With historical imagination, I can envision Lincoln putting his arm around Obama when offering this advice: Be comfortable with ambiguity. On a blue state/red state map, too often the question becomes, Are you for it or against it—gun control, abortion, immigration reform? Ambiguity is too often seen as a weakness, an inability to decide. Not so for Lincoln. Ambiguity became for him the capacity to look at all sides of a problem. Ideologues are the persons who lack the capacity to see complexity in difficult issues. Lincoln voiced this ambiguity in a private memo to himself that was found only after his death. As he pondered the meaning and action of God in the Civil War, he wrote, “I am almost ready to say this is probably true—that God wills this contest, and wills that it shall not end yet.” At the very moment that Lincoln, in private, offered the affirmation that God willed this ongoing war, he did so by admitting the partiality of his vision—“almost” and “probably.” Ambiguity is the mark of humility, not weakness. The question for the next four or eight years will be whether the American public can appreciate a president whose political autobiography, The Audacity of Hope, is filled with self-deprecating stories of his partial vision and even conflicting viewpoints.

Finally, Lincoln might have a heart-to-heart talk with Obama about the role of faith in politics. Lincoln, who never wore his faith on his sleeve, who did not formally join a church, has left us in his second inaugural address the most profound speech combining politics and religion ever delivered to the American public. In only 701 words, the second shortest inaugural address (George Washington delivered a second inaugural of only 134 words), Lincoln mentions God 14 times, quotes the Bible four times, and invokes prayer three times. Today, what the public may remember most about candidate Obama’s religion is his painful distancing of himself from his former pastor and congregation during the 2008 campaign. What the American public needs to know is in his thoughtful discussion of faith in The Audacity of Hope. If the Bill of Rights codifies the separation of church and state, Obama affirms that Americans, “as a religious people,” have never divided politics and religion. He couples the story of his own journey from skepticism to “embrace the Christian faith” with his admonition “to acknowledge the power of faith in the lives of the American people.” Obama says that part of the magnetism of the Christian faith that attracted him was the power of the African-American religious tradition to minister to the whole person and be an advocate for social justice.

Lincoln, in his second inaugural address, used inclusive language—“Both read the same Bible, and pray to the same God”—to appeal to his entire audience, North and South. He would commend Obama’s intention, in our increasingly multicultural and multireligious nation, to make his case for the religious and moral values that are the historical foundation of our society in order “to engage all persons of faith in the larger project of American renewal.”

At the end of a compelling discussion of the Constitution in The Audacity of Hope, Obama exclaims, “I am left then with Lincoln.” The remarkable tether between Lincoln and Obama, suddenly in such plain view in recent months, is not an end but a beginning. For many Americans, Lincoln, however appreciated before, has at the outset of a new presidency moved from there and then to here and now. He has become strangely contemporary. Obama, at the beginning of the Abraham Lincoln Bicentennial, reminds us that whenever contemporary Americans try to trace an idea or truth about our national identity, we will find Lincoln’s initials—AL—carved on some tree, for he was there before us.
