by Richard W. Rahn
Many in the establishment media and, in particular, some of the commentators on MSNBC have referred to members of the tea party and their supporters in Congress who did not vote for the debt ceiling compromise as delusional or worse.
A delusion is a false belief strongly held in spite of invalidating evidence. The tea party crowd argued that the compromise would do almost nothing to stop the United States from going over the budget deficit cliff and that the continuing, irresponsible deficits required stronger medicine, such as a balanced budget amendment to the Constitution. The idea of a constitutional amendment to balance the budget or to limit taxes and spending was mocked by smug, leftist commentators as both unneeded and unrealistic.
Those who look at numbers, in contrast to those who just express opinions, clearly thought that the budget compromise was insufficient, to put it mildly. The stock markets have plunged, Standard & Poor's downgraded U.S. debt, and the U.S. debt-to-GDP ratio reached 100 percent. Those who disparage the tea party ignore the fact that an S&P managing director said the agency probably would not have downgraded if the United States had a credible balanced budget amendment. The anti-tea party crowd has been trying to explain the market reaction as due to the absence of tax increases in the compromise.
If you were canoeing with a friend and you believed, based upon reading the maps, that you were within five minutes of going over a massive and perhaps lethal waterfall, while your math- and geography-challenged friend believed that you had another 15 minutes before going over the falls, would you compromise by agreeing to go another 10 minutes before returning to shore? It is delusional to agree to compromise in such a situation. Sometimes it makes sense to compromise — other times it can be fatal.
President Obama claims that the Republicans are being irresponsible in not wanting to close "tax loopholes" for private jet owners and oil companies. These loopholes turn out to be nothing more than amortizations of capital expenses (i.e., being allowed to recover your investment before being taxed on it), which is necessary for any business investment. There is no more expensive aircraft in the world than Air Force One (the president's Boeing 747). Can you imagine the howls if Congress insisted that the full cost of using Air Force One — including depreciation and all of the associated personnel — be charged to the president for any political or other non-essential use of the aircraft, rather than reimbursement at the rate of equivalent airfares, which is the current practice?
As long as the government continues to grow faster than the economy — as it has been doing — there is no tax increase that can solve the problem. It is delusional — or irresponsible — to think that the kind of tax increases the president has talked about would do anything serious to solve the debt situation — and would, instead, make the unemployment problem worse.
Steven Rattner, a former Obama Treasury official and one of the key architects of the current economic mess who is now a TV commentator on MSNBC, called the tea party folks "terrorists," as have other leading Democrats. Despite the empirical evidence and his own forecasts, Mr. Rattner continues to exhibit a certain form of delusion called "cognitive dissonance," by insisting that higher levels of spending and taxes will increase employment. Who is more of an economic "terrorist" — a person who insists on not increasing an unsustainable debt without a credible plan to bring it down, or someone who insists on continuing to follow policies that have only led to more unemployment and almost non-existent growth?
You may have heard comments by some Democratic elected officials or seen some of the TV ads that demand that we make no changes in Social Security, Medicare, Medicaid and the other entitlements. Government actuaries show that without reform, the entitlements will eat up the entire federal budget and then the economy. One of the ads is sponsored by a labor-union-affiliated group. I called the group to ask for its alternative to any proposed changes to these entitlements. I spoke to a fellow in its policy shop who did not dispute my numbers, but who told me that his organization was just demanding no cutbacks and it was up to others to come up with a solution. Hmmm, if someone ran ads demanding that the length of daylight be the same in January as in June, but had no proposal for accomplishing the impossible, you would correctly think they were delusional.
It is amazing the way the establishment media has largely overlooked the fact that more than half of the Democrats voted against the debt-ceiling compromise, for the most part because they did not like even the small cutbacks in spending. A smaller percentage of Republicans, by contrast, voted against the bill because they correctly saw it as insufficient to deal with the budget problem, and yet they are the ones called "delusional." Who are the ones really denying reality?
We know from studies of cognitive dissonance that many who have invested a great deal in bad ideas, like those who predict with certainty a date when the world will come to an end, will continue to defend these false beliefs — with some lame excuse — when the date comes and goes. The people who claim that more and more government spending, taxing and regulation will bring economic salvation in higher growth and more jobs are equally false and destructive prophets.
Protecting Economic Liberty: The Essential Freedom
by Doug Bandow
"I'm from the government and I'm here to help you" has become a standard punch line. There is no greater joke when public officials limit competition in the name of protecting consumers. Such as Louisiana's now-defunct casket monopoly.
Professional licensing is routine across America. You want to be a lawyer or hairdresser? You want to be a doctor or manicurist? Get a license — from a government-backed panel dominated by your established competitors.
No one wants to be served by an incompetent, but in most cases, health and safety are not at issue. If a hair stylist gives you a bad haircut, you'll be embarrassed, nothing more.
Even for services with greater impact, the licensing process is designed to protect existing practitioners rather than consumers. Plenty of non-lawyers, such as paralegals and even legal secretaries, are capable of doing work now reserved for attorneys. Yet state bar associations fiercely police the "unauthorized practice of law," which is not the same as the incompetent practice of law.
Doctors similarly create arbitrary barriers against other medical professionals caring for patients. The government should combat fraud and malpractice, not decide which provider can do which procedure.
But casket-making is a far easier case. Obviously, making a substandard coffin isn't going to hurt the corpse, let alone kill anyone. (Indeed, the state neither sets standards for casket construction nor even requires use of one for burial.) Yet Louisiana only allows licensed funeral directors to sell "funeral merchandise," including caskets.
It's not hard to see who benefits from this restriction. It certainly isn't the dead or the bereaved families of the dead. It's not the producers of "funeral merchandise." And it isn't the public.
In Louisiana the casket typically accounts for nearly a third of the cost of a funeral. It is known in the trade as a "high margin" item. In fact, some people buy coffins from Wal-Mart or even online to save money.
The winners from the casket monopoly obviously are the funeral directors.
Everyone paying for a funeral is a victim of Louisiana's rule. So is the Saint Joseph Abbey of Saint Benedict, established in 1889.
For years the Benedictine monks made simple caskets for their own members. Over the years they received numerous requests from others to buy similar coffins.
Monks typically support themselves through common trades. Saint Joseph Abbey harvested timber for income, but Hurricane Katrina badly damaged the abbey's pine forest. Starting in 2007, Saint Joseph's 36 monks followed the example of monasteries in Illinois, Indiana, Iowa, and Minnesota and began making handcrafted caskets. The coffins are both unique and less expensive than those offered by funeral homes. Moreover, the abbey stored caskets purchased in advance at no charge.
No customer complained. But a local competitor, Mothe Funeral Homes, went to the Board of Embalmers and Funeral Directors. Leonard Dunn, the operator of Serenity Funeral Home, another nearby business, explained: "They're cutting into our profit." So the state board, run by funeral directors, issued a "cease-and-desist" order even before the abbey sold its first casket. The board also employed an investigator to confirm that the monastery did, indeed, do what it claimed to do: sell caskets. It threatened the monks with fines and imprisonment — up to six months in jail.
The abbey urged the state legislature to change the law, but individual funeral home directors joined with industry lobbyists to defend the coffin cartel. (Not every funeral director was on board. Darin Bordelon of LaVille Funeral Home complained that Saint Joseph's opponents were "making us all look greedy.")
The only way for the monks to avoid punishment was to abandon their religious routine, serve a year as apprentices in a licensed funeral home, and turn their monastery into a formal "funeral establishment" with embalming equipment. Just to make and sell caskets.
So the monks went to court. Taking their case was the Institute for Justice, a public interest legal organization dedicated to defending economic liberty. IJ filed suit contending that Louisiana's casket monopoly violated the 14th Amendment.
The monks had the facts on their side. An earlier Federal Trade Commission investigation found that state-protected monopolies do not protect consumers. The agency specifically criticized states which used licensing to restrict competition in the sale of caskets. Notably, Louisiana made no effort to prevent funeral directors from selling overpriced junk coffins. It just wanted to make sure that only funeral directors could sell overpriced junk coffins.
The case also demonstrated the importance of economic liberty. Those who promote individual liberty tend to favor freedom of speech and assembly. These liberties represent freedom of conscience and promote political liberty, and are critical to the development of the human person.
However, economic liberty is no less important. Freedom is indivisible. Freedom of expression and speech is the freedom to buy a printing press, create a website, and build a television studio. Without access to the practical economic tools of liberty it is difficult to exercise the political forms of liberty.
More broadly, economic development helps create an environment and ethos more conducive to the development of democracy. People who no longer have to worry about feeding their families are more likely to acquire the instruments of liberty. They also are more likely to use them.
But there is something even more basic. Economic freedom is about earning a living and supporting oneself and one's family. Few human duties are more important.
Moreover, earning an income enables one to seek life's transcendent values. For the monks, casket-making is a means to an important end. Blocking the means interferes with the end.
Finally, for many people work is a critical aspect of their development and happiness as human beings. Obviously, some individuals allow their job to become an idol, taking over their lives. But the opportunity to freely choose one's vocation is more than just a matter of dollars and cents.
The law doesn't always reflect good policy. But in this case justice triumphed. U.S. District Court Judge Stanwood Duval ruled for the monks, explaining: "The Court finds no rational relationship between the Act and 'public health and safety.' No evidence was presented to demonstrate that requiring the purchase of caskets from licensed funeral directors aids the public welfare."
No evidence was presented because none exists.
The judge added: "Simply put, there is nothing in the licensing procedures that bestows any benefit to the public in the context of the retail sales of caskets. The license has no bearing on the manufacturing and sale of coffins. It appears that the sole reason for these laws is the economic protection of the funeral industry which reason the Court has previously found not to be a valid government interest standing alone to provide a constitutionally valid reason for these provisions."
Unfortunately, the problem of "economic protection" is far broader than just Louisiana's protection of the funeral industry. However, Judge Duval's ruling is a good start. No government should misuse its power to sacrifice everyone's economic freedom for the enrichment of a few.
It doesn't happen often, but in this case someone came from the government and actually did help people. Judge Duval recognized that respecting people's liberty is the best form of assistance. We can only hope that he is not the last government official to do so.
Yes, It Is a Ponzi Scheme
by Michael D. Tanner
Texas governor Rick Perry is being criticized for calling Social Security a "Ponzi scheme." Even Mitt Romney is reportedly preparing to attack him for holding such a radical view. But if anything, Perry was being too kind.
The original Ponzi scheme was the brainchild of Charles Ponzi. Starting in 1919, the poor but enterprising Italian immigrant convinced people to allow him to invest their money. However, Ponzi never actually made any investments. He simply took the money he was given by later investors and gave it to his early investors, providing those early investors with a handsome profit. He then used these satisfied early investors as advertisements to get more investors. But in order to keep paying previous investors, Ponzi had to continue finding more and more new investors. Eventually, he couldn't expand the number of new investors fast enough, and the scheme collapsed. Ponzi was convicted of fraud and sent to prison.
Social Security, on the other hand, forces people to invest in it through a mandatory payroll tax. A small portion of that money is used to buy special-issue Treasury bonds that the government will eventually have to repay, but the vast majority of the money you pay in Social Security taxes is not invested in anything. Instead, the money you pay into the system is used to pay benefits to those "early investors" who are retired today. When you retire, you will have to rely on the next generation of workers behind you to pay the taxes that will finance your benefits.
As with Ponzi's scheme, this turns out to be a very good deal for those who got in early. The very first Social Security recipient, Ida May Fuller of Vermont, paid just $44 in Social Security taxes, but the long-lived Mrs. Fuller collected $20,993 in benefits. Such high returns were possible because there were many workers paying into the system and only a few retirees taking benefits out of it. In 1950, for instance, there were 16 workers supporting every retiree. Today, there are just over three. By around 2030, we will be down to just two.
As with Ponzi's scheme, when the number of new contributors dries up, it will become impossible to continue to pay the promised benefits. Those early windfall returns are long gone. When today's young workers retire, they will receive returns far below what private investments could provide. Many will be lucky to break even.
Eventually the pyramid crumbles.
Of course, Social Security and Ponzi schemes are not perfectly analogous. Ponzi, after all, had to rely on what people were willing to voluntarily invest with him. Once he couldn't convince enough new investors to join his scheme, it collapsed. Social Security, on the other hand, can rely on the power of the government to tax. As the shrinking number of workers paying into the system makes it harder to continue to sustain benefits, the government can just force young people to pay even more into the system.
In fact, Social Security taxes have been raised some 40 times since the program began. The initial Social Security tax was 2 percent (split between the employer and employee), capped at $3,000 of earnings. That made for a maximum tax of $60. Today, the tax is 12.4 percent, capped at $106,800, for a maximum tax of $13,243. Even adjusting for inflation, that represents more than an 800 percent increase.
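For readers who want to check that arithmetic, here is a minimal sketch in Python. The tax figures are the ones cited above; the roughly 15.6-fold consumer-price increase from 1937 to 2011 is an assumed approximation, not a number from the article.

    # Back-of-the-envelope check of the maximum-tax comparison above.
    RATE_1937, CAP_1937 = 0.02, 3_000       # combined rate and wage cap, 1937
    RATE_2011, CAP_2011 = 0.124, 106_800    # combined rate and wage cap, 2011
    CPI_MULTIPLIER = 15.6                   # assumed 1937 -> 2011 price-level change

    max_tax_1937 = RATE_1937 * CAP_1937             # $60
    max_tax_2011 = RATE_2011 * CAP_2011             # $13,243.20
    real_max_1937 = max_tax_1937 * CPI_MULTIPLIER   # about $936 in 2011 dollars

    increase = (max_tax_2011 / real_max_1937 - 1) * 100
    print(f"inflation-adjusted increase: {increase:.0f}%")  # ~1315%, well above 800%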
In addition, at least until the final collapse of his scheme, Ponzi was more or less obligated to pay his early investors what he promised them. With Social Security, on the other hand, Congress is always able to change or cut those benefits in order to keep the scheme going.
Social Security is facing more than $20 trillion in unfunded future liabilities. Raising taxes and cutting benefits enough to keep the program limping along will obviously mean an ever-worsening deal for younger workers. They will be forced to pay more and get less.
Rick Perry got this one right.
Soaking the Rich Is Not Fair
by Jeffrey A. Miron
What is the "fair" amount of taxation on high-income taxpayers?
To liberals, the answer is always "more." Liberals view high income — meaning any income that exceeds their own — as the result of luck or anti-social behavior. Hence liberals believe "fairness" justifies government-imposed transfers from the rich to everyone else. Many conservatives accept this view implicitly. They oppose soak-the-rich policies because of concern over growth, but they do not dispute whether such policies are fair.
But high tax rates on the rich are not fair or desirable for any other reason; they are an expression of America's worst instincts, and their adverse consequences go beyond their negatives for economic growth.
Consider first the view that differences in income result from luck rather than hard work: some people are born with big trust funds or innate skill and talent, and these fortuitous differences explain much of why some people have higher incomes than others.
Never mind that such a characterization is grossly incomplete. Luck undoubtedly explains some income differences, but this is not the whole story. Many trust fund babies have squandered their wealth, and inborn skill or talent means little unless combined with hard work.
But even if all income differences reflect luck, why are government-imposed "corrections" fair? The fact that liberals assert this does not make it true, any more than assertions to the contrary make it false. Fairness is an ill-defined, infinitely malleable concept, readily tailored to suit the ends of those asserting fairness, independent of facts or reason.
Worse, if liberals can assert a right to the wealth of the rich, why cannot others assert the right to similar transfers, such as from blacks to whites, Catholics to Protestants, or Sunni to Shia? Government coercion based on one group's view of fairness is a first step toward arbitrary transfers of all kinds.
Now consider the claim that income differences result from illegal, unethical, or otherwise inappropriate behavior. This claim has an element of truth: some wealth results from illegal acts, and policies that punish such acts are appropriate.
But most inappropriate wealth accumulation results from bad government policies: those that restrict competition, enable crony capitalism, and hand large tax breaks to politically connected interest groups. These differences in wealth are a social ill, but the right response is removing the policies that promote them, not targeting everyone with high income.
The claim that soaking the rich is fair, therefore, has no basis in logic or in generating desirable outcomes; instead, it represents envy and hatred.
Why do liberals hate the rich? Perhaps because liberals were the "smart" but nerdy and socially awkward kids in high school, the ones who aced the SATs but did not excel at sports and rarely got asked to the prom. Some of their "dumber" classmates, meanwhile, went on to make more money, marry better-looking spouses, and have more fun.
Liberals find all this unjust because it rekindles their emotional insecurities from long ago. They do not have the honesty to accept that those with less SAT smarts might have other skills that the marketplace values. Instead, they resent wealth and convince themselves that large financial gains are ill-gotten.
The liberal views on fairness and redistribution are far more defensible, of course, when it comes to providing for the truly needy. Reasonable people can criticize the structure of current anti-poverty programs, argue that the system is overly generous, or suggest that private charity would be more effective at caring for the most vulnerable.
The desire to help the poor, however, represents a generous instinct: giving to those in desperate situations, where bad luck undoubtedly plays a major role. Soaking the rich is a selfish instinct, one that undermines good will generally.
And most Americans share this perspective. They are enthusiastic about public and private attempts to help the poor, but they do not agree that soaking the rich is fair. That is why U.S. policy has rarely embraced punitive income taxation or an aggressive estate tax. Instead, Americans are happy to celebrate well-earned success. The liberal hatred of the rich is a minority view, not a widely shared American value.
For America to restore its economic greatness, it must put aside the liberal hatred of the rich and embrace anew its deeply held respect for success. If it does, America will have enough for everyone.
Obama Must Pay for His Illegal War
by Doug Bandow
The rebels' looming triumph over Libyan dictator Muammar Qaddafi has caused supporters of the Obama administration to do a victory lap, but the conflict is not yet over — and the impact on America has just begun.
However, the imminent end of U.S. military action provides Congress with an opportunity to confront the president's apparent predilection to conduct illegal wars. When President Barack Obama took the United States into Libya's civil war in March, it was yet another war of choice that served no American security interests. To the contrary, bombing a government that had abandoned its nuclear program and dropped plans for long-range missiles made peaceful denuclearization of other nations, such as Iran and North Korea, well nigh impossible.
At the same time, Washington unleashed unpredictable political forces in Libya. Qaddafi's imminent demise is welcome, but a liberal, democratic future for the North African nation is not certain, and perhaps not even likely. Events in Egypt next door show the many barriers to creating a genuinely free society.
Nor did the administration succeed in its alleged humanitarian mission to protect the Libyan people. The initial claims of prospective massacres were propaganda, a la George W. Bush's WMDs in Iraq. In fact, Qaddafi had slaughtered no civilians in any of the cities he had earlier retaken from the rebels, and his incendiary rhetoric was directed against armed insurgents.
Worse, by adopting a minimalist military policy, the administration prolonged the conflict, resulting in far more deaths. Low-tech civil wars are usually bloody: the administration turned a potentially quick victory into more than five months of arduous fighting. Having allegedly gotten involved to save lives, President Obama prosecuted the war in a manner almost designed to maximize civilian casualties.
Still, unwisely going to war is hardly unique to this president. The good news, so far at least, is that Libya is far less consequential than Iraq. President Bush's war blunder was catastrophic. President Obama's has been modest.
Where this administration outshone its predecessor was in ostentatiously conducting an illegal war, treating the U.S. Congress and, more important, the American people as idiots. At least President Bush sought congressional authorization, if not a formal declaration, and never denied that he was fighting a war. President Obama played George Orwell and claimed that no hostilities were occurring even as American planes, missiles, and drones killed Libyan military personnel and destroyed Libyan military materiel.
For a brief moment Congress flared in indignation, but it quickly retreated, in part cowed by the claim that it would be irresponsible to undercut the administration's ongoing non-hostile hostilities.
Now that U.S. and NATO participation is largely over — with limited strikes backing rebel advances on the final Qaddafi strongholds in the south — Congress should revisit the issue and stand by the Constitution. President Obama once taught constitutional law, but he evidently needs a refresher.
The Founders wrote the Constitution as they did to stop precisely such unilateral wars. Indeed, one of their greatest fears was that America's president would act like the British king, launching unnecessary wars on no one's authority but his own. The early Americans consciously tried to make U.S. participation in war less likely. Article I, Section 8, Clause 11 states that "Congress shall have the power ... to declare war." As James Madison observed, it is the "fundamental doctrine of the Constitution that the power to declare war is fully and exclusively vested in the legislature."
In fact, the Founders gave other important war-making powers to Congress as well, including raising an army, approving military expenditures, ratifying treaties, setting rules of war and issuing letters of marque. The Constitution made the president only the commander-in-chief of the military, which primarily empowered him to manage wars authorized by Congress.
Alexander Hamilton, perhaps the constitutional convention's strongest fan of executive power, nevertheless called the commander-in-chief the "first general and admiral" of the armed services. The president's authority, said Hamilton, was "in substance much inferior to" that of Britain's king, the model against which the convention delegates were reacting. The president's power "would amount to nothing more than the supreme command and direction of the land and naval forces ... while that of the British king extends to the declaring of war."
The Framers did change "make" to "declare" to highlight the fact that the president could respond to foreign attack. Contrary to the claims of today's fans of presidential war making, however, the Founders did not intend to limit the legislature's power to noting that the president had started a war.
Support for presidential war making has been minimal throughout American history. George Mason bluntly charged that the president "is not safely to be entrusted with" the authority to start wars. Thus, Mason favored "clogging rather than facilitating war." Pierce Butler reassured skeptical citizens of South Carolina that the convention rejected giving the president authority to start wars "as throwing into his hands the influence of a monarch, having an opportunity of involving his country in a war whenever he wished to promote her destruction." James Wilson was equally blunt. The new Constitution, he explained, "will not hurry us into war." Rather, the provision "is calculated to guard against it. It will not be in the power of a single man, or a single body of men, to involve us in such distress; for the important power of declaring war is in the legislature at large." Off in Paris, Thomas Jefferson approved of the "effectual check to the dog of war by transferring the power of letting him loose." Decades later, Abraham Lincoln praised the Framers for recognizing war "to be the most oppressive of all Kingly oppressions; and they resolved to so frame the Constitution that no one man should hold the power of bringing this oppression upon us."
To his credit, President Obama did not make any of these discredited arguments. When candidate Obama was asked whether he could bomb Iran, he answered: "The president does not have power under the Constitution to unilaterally authorize a military attack in a situation that does not involve stopping an actual or imminent threat to the nation."
Candidate, now Secretary of State, Hillary Clinton took a similar position: "I do not believe that the president can take military action — including any kind of strategic bombing — against Iran without congressional authorization."
Then-Senator, now Vice President Joseph Biden even advocated impeachment of President Bush if the latter bombed Iran without congressional authority. The convention delegates, said Biden, "were determined to deny the president" the "unfettered power to start wars." Indeed, the "Framers intended to grant to Congress the power to initiate all hostilities, even limited wars."
Yet on the latter point President Obama apparently disagrees with his vice president — and with many of his own legal advisers. When it came to Libya, the president announced that little wars don't count under the Constitution. If the adversaries aren't downing any of our airplanes, then it doesn't count as a war.
Advocates of investing the president with monarchical war-making power have long pointed to prior chief executives who deployed the military without congressional authority. The list of examples is long, but none of them offers a precedent for attempting to oust the internationally recognized government of another nation that has neither attacked nor threatened the United States or any American ally. Most were limited actions, many were carried out under colorable legal authority, some were undertaken for arguably defensive reasons, and others were initiated by local commanders without Washington's authorization. Even if these operations were acts of war, they did not amount to waging war.
Equally important, the fact that other presidents acted unlawfully does not give President Obama authority to do the same. Presumably he recognizes this fact. So he came up with an entirely different argument: in March, he notified Congress pursuant to the War Powers Resolution that he had sent U.S. forces into combat. Two months later, when the WPR required that he withdraw the forces or seek congressional approval, he said: Never mind. The American military was not engaged in hostilities. Rather, U.S. personnel apparently were doing something else — perhaps vacationing in the Mediterranean.
President Obama explained that Congress need not concern itself with Washington's participation in Libya's war since America's role was "non-kinetic," "more limited" and "in support." An anonymous administration official declared that "the kinetic pieces of that are intermittent."
The argument was not just nonsense, but nonsense on stilts, as philosopher Jeremy Bentham once characterized an opposing claim.
The Justice Department and the Pentagon's general counsel concluded that the war was illegal. Even then-defense secretary Robert Gates admitted that if he were "in Qaddafi's shoes," he would think America was at war.
The administration's decision to scale back U.S. participation did not change the character of America's participation in the Libyan civil war. Compare President Dwight Eisenhower, who would not ignore the Constitution. He announced that "I am not going to order any troops into anything that can be interpreted as war, until Congress directs it."
While President Bush never let the Constitution get in the way of his preferred policies, even he always made a colorable claim to be living up to America's fundamental law. President Obama's argument was risible.
Moreover, noted Jack Goldsmith, who for a time headed the Bush Justice Department's Office of Legal Counsel: "this appears to be the first time that a president has violated the War Powers Resolution's requirement either to terminate the use of armed forces within 60 days after the initiation of hostilities or get Congress's support."
Columbia law professor John Bassett Moore was equally dismissive of such presidential claims:
There can hardly be room for doubt that the Framers of the Constitution when they vested in Congress the power to declare war, never imagined that they were leaving it to the executive to use the military and naval forces of the United States all over the world for the purpose of actually coercing other nations, occupying their territory, and killing their soldiers and citizens, all according to his own notions of the fitness of things, as long as he refrained from calling his action war or persisted in calling it peace.
Although Congress did nothing earlier in response to President Obama's contemptuous dismissal of its role under the Constitution, legislators now can act without fear of compromising ongoing military operations. Congress should bar use of any federal funds for future military action in Libya. Moreover, it should exact a price for presidential lawlessness. Impeachment, as then-Sen. Biden recognized, would be the appropriate legal remedy, even if today politically inconceivable. At the very least the legislative branch should retaliate against the administration: one possibility would be to defund positions held by officials, such as Harold Koh at the State Department, who advanced the president's dishonest legal claim.
Candidate Obama promised the American people: "No more ignoring the law when it's inconvenient." Alas, Congress obviously must insist that he obey the law. The war in Libya is no great administration victory. Even if it were, it would be a prize too dearly bought. For the Constitution limits presidential authority to protect our liberty. And every time a president ignores a constitutional restraint, another legal guarantee for our freedom is effectively erased from America's governing document.
China: American Financial Colony or Mercantilist Predator?
by Lewis E. Lehrman
China is an important trading partner of America. But it may also be a mortal threat, and not for the conventional reasons usually cited in the press. Ironically, it is a threat because China is in fact a financial colony of the United States, a colony subsidized and sustained by the pegged, undervalued yuan-dollar exchange rate. Neither the United States nor its economic colony seems to understand the long-term destructive consequences of the dollarization not only of the Chinese economy but also of the world monetary system. While the Chinese financial system has been corrupted primarily by tyranny, deceit, and reckless expansionism, it is also destabilized by the workings of the world dollar standard. Neither the United States nor China has come to grips with the perverse effects of the world dollar standard.
The social and economic pathology of 19th century colonialism is well studied, but the monetary pathology of its successor, the neo-colonial reserve currency system of the dollar, is less transparent. In order to remedy this pathological defect, the United States must rid itself of its enormous Chinese financial colony, whose exports are subsidized by the undervalued yuan in return for Chinese financing of the U.S. twin deficits. Both China and the United States must also free themselves from the increasing malignancy of the dollar reserve currency system, the primary cause of inflation in both China and the United States.
In the end, only monetary reform, including an end to the reserve currency system, can permanently separate the dollar host from its yuan colony. Without monetary reform, the perverse effects of the dollar reserve currency system will surely metastasize into one financial and political crisis after another—even on the scale of the 2007–2009 crisis.
It is, of course, a counterintuitive fact that China has been financially colonized by the United States. But why is this a fact? Simply because China has chained itself to the world dollar standard at a pegged undervalued exchange rate, choosing therefore to hold the exchange value of its trade surplus—that is, its official national savings—in U.S. dollar securities. It is true that the dollar-yuan strategy of America’s Chinese colony has helped to finance a generation of extraordinary Chinese growth. But China now holds more than 3 trillion dollars of official reserves, including more than a trillion dollars in U.S. government securities. These Chinese dollar reserves directly finance the deficits of the American colonial center. This arrangement clearly resembles the imperial system of the late 19th century: the reserves of a British colony were often held in the currency of the imperial center, then invested in the London money market. Thus, the colony’s reserves were entirely dependent on the stability of the currency of the colonial center. While China is America’s largest financial colony, most other developing countries are also bound to neo-colonial status within the reserve currency hegemony of the dollarized world trading system.
China’s dollarized monetary system reminds us of nothing so much as the historic colonial financial arrangements imposed by the later British Empire on India before World War I—India actually remaining a financial colony of England long after its independence in 1947. How did the sterling financial empire work? The imperial colony of India, beginning in the late 19th century, held its official Indian currency reserves (savings) in British pounds deposited in the English money market; independent developed nations at that time, like France and Germany, held their reserves in gold. That is, France, Germany, and the United States settled their international payment imbalances in gold—a non-national, common, monetary standard—holding their official reserves, too, in gold. But the London-based reserves of colonial India were held not primarily in gold, but in British currency, helping to finance not only the imperial economic system, but also the imperial banking system, imperial debts, imperial wars, and British welfare programs. Eventually, as we know, both the debt-burdened British Empire and its official reserve currency system collapsed.
For more than a generation now, a similar process has been at work in China. China is America’s chief colonial appendage. The Chinese work hard and produce goods. Subsidized by an undervalued yuan, they export much of their surplus production to America. But, like the Indians who were paid in sterling, the exports of Chinese colonials are substantially paid in dollars, not yuan—because bilateral and world trade, and the world commodities market, have been dollarized. And thus it may be said that the world financial system is today an unstable neocolonial appendage of the unstable dollar.
China, like its predecessor the British colony of India, has chosen to hold a significant fraction of what it is paid in the form of official dollar reserves (or savings). These dollars are promptly redeposited in the U.S. dollar market, where they are used to finance U.S. deficits. Every Thursday night, the Federal Reserve publishes its balance sheet, and there we now read that more than $2.5 trillion of U.S. government securities are held in custody for foreign monetary authorities, 40 percent of which is held for the account of America’s chief financial colony, Communist China. It is clear that without financial colonies to finance and sustain the immense U.S. balance of payments and budget deficits, the U.S. paper dollar standard and the growth of U.S. government spending would be unsustainable.
It is often overlooked that these enormous official dollar reserves held by China are a massive mortgage on the work and income of present and future American private citizens. This Chinese mortgage on the American economy has grown rapidly since the suspension of dollar convertibility to gold in 1971. China—poor and undeveloped in 1971—was at that time very jealous of its sovereign independence, sufficiently so to reject its alliance with the Soviet Union—even earlier to attack U.S. armies on the Chinese border during the Korean War. In an ironic twist of fate, China surrendered its former independence and, as a U.S. financial colony, joined the dollar-dominated world financial system.

China’s monetary policy is anything but independent. It is determined primarily by the Federal Reserve Board in America, the pegged yuan-dollar exchange rate serving as the transmission mechanism of Fed-created excess dollars pouring into the Chinese economic system. Perennial U.S. balance of payments deficits send the dollar flood not only into China but also into all emerging countries. The Chinese central bank buys up these excess dollars by issuing new yuan, thereby holding up the overvalued dollar and holding down the undervalued yuan. Much of these Chinese official dollar purchases are then invested in U.S. government debt securities. So even though America exports excess dollars to China, China sends them back to finance the U.S. budget deficit—much like marionettes walking off one side of the stage, merely to reappear unchanged on the other side.
This is the little-understood arbitrage mechanism of the pegged exchange rate system by which Fed-created excess dollars are bought and held as reserves by the Chinese central bank, in exchange for which newly created yuan are issued, thereby supercharging inflation in China. The Chinese dollar reserves, which are reinvested in the United States, help to ignite inflation in the United States. It is clear that the workings of the official dollar reserve currency system cause purchasing power to be multiplied, or at least doubled, in both countries. But these central bank issues of new money are unassociated with the production of new goods and services during the same market period. Thus total spending, or purchasing power, exceeds the total value of goods and services at prevailing prices. When total demand exceeds total supply, the price level must rise.
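That recycling loop can be made concrete with a toy sketch. This is an illustration of the flow of funds only, not a model of actual central-bank accounting; the peg rate and money-stock figures are arbitrary assumptions.

    # Toy illustration: dollars earned by Chinese exporters are swapped by the
    # central bank for newly issued yuan, and the dollars are then parked in
    # U.S. Treasury debt. All numbers are arbitrary assumptions.
    PEG = 6.5                  # assumed yuan per dollar under the peg

    cn_money = 5_000.0         # yuan in circulation (toy starting stock)
    pboc_treasuries = 0.0      # central-bank holdings of U.S. government debt

    def recycle(trade_surplus_usd: float) -> None:
        """One round: new yuan issued against surplus dollars, which are
        then reinvested in U.S. government securities."""
        global cn_money, pboc_treasuries
        cn_money += trade_surplus_usd * PEG    # monetary expansion in China
        pboc_treasuries += trade_surplus_usd   # dollars flow back to the U.S.

    for _ in range(3):
        recycle(100.0)         # three rounds of a $100 trade surplus

    # The yuan stock grows (inflationary pressure in China) while the same
    # dollars re-enter the U.S. credit market (financing the deficit).
    print(f"new yuan issued: {cn_money - 5_000:,.0f}")
    print(f"Treasuries held for China: ${pboc_treasuries:,.0f}")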
But just as the subservient, colonial Indians were constrained not to sell their sterling reserves too quickly, so the Chinese are constrained—by politics, diplomacy, and self-interest—not to dump their depreciating American dollars. The Indians had to consult their imperial bankers, even though the English were debtors to their Indian colony, because the Indians did not wish to anger the colonial center, nor to precipitate a sterling crisis. From time immemorial, creditors with too large a stake in an over-sized debtor often beg leave of their debtor to get their money back.
China is frustrated by circumstances similar to those of a colony of imperial Britain. Hostility has arisen in the debtor—the United States. Fear of setting off a dollar slide haunts the hostile creditor, China. The difficulty of finding a suitable portfolio of alternatives for a trillion dollars in U.S. government debt annoys the outspoken Chinese financial colony, as it calls for a new world monetary system. But there seems to be no genuine alternative to the very liquid dollar market. De facto illiquidity of official Chinese dollar reserves is enforced by political sensitivities, not by market salability. The debtor, as the saying goes, is “too big to fail.” Thus arises an unstable stalemate, a yuan-dollar pegged exchange rate regime constantly on the edge of a crisis.
The “exorbitant privilege” of the dollar is matched by the insupportable burden of America’s overvalued reserve currency role, which has tended to deindustrialize the colonizer, gradually increasing social inequality by reducing the standard of living of lower- and middle-income American families. The reserve currency country then feels compelled, as the Fed does today, to depreciate the dollar in the vain hope of eliminating the trade deficit and the balance of payments deficit—by becoming more competitive abroad as it becomes poorer at home.
The perversity of the official reserve currency system is endless as China now endures high inflation engendered by its colonial status in the world dollar system.
The dollar-based system of pegged and floating exchange rates has been slowly decaying since the end of World War II. But the dollar-based reserve currency system, because of the unmatched scale and liquidity of the dollar markets, could last another generation. When it will collapse cannot be predicted. That it will collapse, without systemic reform, is, I think, inevitable. Few predicted the timing of the collapse of the pegged dollar system of Bretton Woods. But it did collapse in August of 1971, followed by America's worst decade since the Great Depression.
Ultimately America, the leader of the unstable world financial system, must choose between two options.
1. The United States can wait for the eventual demise of the world dollar standard under chaotic conditions, similar to the final sterling collapse and the subsequent collapse of Bretton Woods in 1971. This option is analogous to the intrepid daredevil who leaps from his 10th floor window, secure in the fact that he is still unhurt two floors from the street level.
2. Or, America could take the lead in reforming the official reserve currency system based on the dollar. Such a monetary reform program would entail a careful windup, by agreement, of the world dollar standard. At the same time America would reestablish by statute a dollar convertible to gold, i.e., a dollar defined in law as a weight unit of gold. Gold would replace the dollar as the world’s reserve currency.
The reform would, first and foremost, establish a tested, non-national, neutral monetary standard as the basis of a stable dollar—one which reasonable sovereign trading partners could accept. Gold would become the international settlements currency and thus would replace the dollar as the basis of world trade and finance. Inasmuch as monetary history shows that no unstable national currency can permanently serve as the crucial world reserve currency, it follows that neither can an unstable basket of national currencies, nor can a fiction such as the SDR—the reserve asset created by the International Monetary Fund to supplement member countries’ reserves.
But we are left with the question: what does the evidence of American history suggest as the basis for a stable dollar?
The stability of the U.S. dollar has varied widely in its history. This variation is explained by two factors: the monetary standard chosen for the dollar, and whether other countries have simultaneously used cash and securities payable in dollars as their own reserves, even as their monetary standard itself (i.e., official reserve currencies in place of gold).
The United States has alternated between two kinds of standard money: inconvertible paper money and some precious metal (first silver, then gold). The dollar was an inconvertible paper money during and after the Revolutionary War (1776–92), the War of 1812 (1812–17), the Civil War and Reconstruction (1862–79), and again from 1971 to present. The dollar was effectively defined as a weight of silver (and gold) in 1792–1812 and 1817–34, and as a weight of gold in 1834–61 and 1879–1971. The minted gold eagle, set equal to 10 dollars, and subsidiaries thereof, was provided for in the Coinage Act of 1792. The dollar was not used by foreign monetary authorities as an official monetary reserve asset before 1913, but the dollar has been an official “reserve currency” for many countries since World War I (along with the pound sterling). The dollar has been the primary official reserve currency for most countries since 1944.
Applying two criteria divides the monetary history of the United States into distinct phases. We can compare the stability of these monetary regimes by examining the variation in the Consumer Price Index (as reconstructed back to 1800) by two simple measures: long-term CPI stability (measured by the annual average change from beginning to end of the period of each monetary standard) and short-term CPI volatility (measured by the standard deviation of annual CPI changes during the period).
Weighting these criteria equally, the classical gold standard from 1879–1914 was the most stable of all U.S. monetary regimes (as the table shows).
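A minimal sketch of those two measures, with "annual average change" read here as a compound annual rate. The regime dates follow the text, but the index numbers below are placeholders for illustration, not the reconstructed historical data.

    # Computes the two stability measures defined above for one monetary regime.
    import statistics

    def regime_stability(cpi):
        """cpi: annual CPI index levels over a single monetary regime."""
        years = len(cpi) - 1
        # Long-term stability: average annual change from start to end of period.
        long_term = (cpi[-1] / cpi[0]) ** (1 / years) - 1
        # Short-term volatility: standard deviation of year-over-year changes.
        changes = [cpi[i + 1] / cpi[i] - 1 for i in range(years)]
        return long_term, statistics.stdev(changes)

    # Placeholder series standing in for, say, 1879-1914 (classical gold standard).
    gold_era = [10.0, 9.9, 10.1, 10.0, 10.2, 10.1, 10.3]
    trend, vol = regime_stability(gold_era)
    print(f"average annual change: {trend:.2%}, volatility: {vol:.2%}")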
After the failures of several generations of unhinged paper currencies, pegged and floating exchange rates, America should embrace a stable monetary system tested in the laboratory of human history—the cornerstone of which the elites have rejected for a century. It is now time to restore that cornerstone—the true gold standard, shorn of the economic pathology of official reserve currencies. Now is the time to restore the American monetary standard authorized by the Founders in the Constitution—Article I, Sections 8 and 10. Now is the historical moment for America to take the lead and again give the world a real money, the Founders’ gold dollar of the Coinage Act of 1792. What the Founders learned from the paper money inflation of the Revolution, the recent past has taught us again. America and the world need a monetary standard which, unlike the paper-credit dollar, cannot be created at zero marginal cost with which to dispossess the prudent and to subsidize the U.S. government and insolvent financial institutions at near zero interest rates.
For America to establish the gold standard would provide the least imperfect monetary solution to the problems of a century of financial disorder—engendered over and over by central bank-manipulated paper money, official reserve currencies, and floating and pegged exchange rates. Only a stable dollar, a dollar defined by statute as a weight unit of gold, can pin down the long-term price level, restoring the incentive to save and ruling out extreme inflation and deflation. Such a dollar convertible to gold would reopen the road to confidence in the long-term value of the U.S. monetary standard. This is the durable road to economic growth and prosperity—financed by increased long-term savings, increased long-term investment, and rising demand for labor at rising real wages.
* Lewis E. Lehrman is a senior partner at L. E. Lehrman & Co. and chairman of the Lehrman Institute.
US: Climate prostitutes, charlatans and comedians
by Paul Driessen
Their “research” would be hilarious if it didn’t cost taxpayers and consumers so dearly.
Put these guys on Comedy Central. Put ‘em in an asylum … a mandatory restitution program … jail perhaps … or a witness protection program, if they turn state’s evidence on other perpetrators. But keep them away from our money – and our energy, economic, healthcare and education policies.
Climate prostitutes, parasites, charlatans and comedians have been devouring billions in US taxpayer dollars, year after year, plus billions more in corporate shareholder cash, activist foundation funds and state government grants. The laws, mandates, subsidies and regulations they advance have cost taxpayers and consumers still more billions for “alternative” energy and other schemes that send prices skyrocketing, kill jobs, and reduce health and living standards.
It’s time to end this destructive saga and, while we’re at it, pink-slip the politicians and bureaucrats who pour billions of hard-earned tax dollars into perpetual climate “research,” “education” and “environmental” programs. They’re actively complicit or have completely failed to perform proper due diligence.
Global cooling has morphed into global warming, climate change, global climate disruption, climate “weirding” and every extreme weather event – always manmade, always imminently catastrophic, always requiring eternal research and wrenching societal transformation, to “save the planet.”
The endless absurdity oozing out of the climate change cesspool would be hilarious if it weren’t so costly.
“Global warming: Is weight loss a solution?” the “peer-reviewed” International Journal of Obesity breathlessly wondered a few weeks ago. Most definitely. Fat people breathe more and thus emit more carbon dioxide. If the world’s 1.5 billion obese and overweight adults all lost 22 pounds apiece and kept it off for a year, the reduction in CO2 would equal 0.2% of global emissions from burning fossil fuels and manufacturing cement. (Translation: “health professionals” deserve more climate research loot.)
If you need more proof that “obesity and climate change are linked,” simply consider how awful life is now in Mexico, the same authors argued in an article for their Climate and Health Council. One in four Mexicans is now obese. “The planet is getting hotter, its people are getting fatter, and the use of fossil fuel energy is the cause of both. Large increases in motor vehicle traffic have decimated levels of physical activity. This, combined with increased availability of energy-dense food, has propelled the body mass index in the entire [Mexican] population upward.”
“Moving to a low-carbon economy could be the next great public health advance,” the C&HC “experts” suggested. But even eating less meat won’t be enough, nor reducing dependence on dairy products, nor even vegetarianism, pal reviewers intoned. “We have to be vegans,” get rid of cars – and reduce human populations, perhaps with “China’s one-child policy (entailing elements of compulsion)” as the model.
Didn’t we try that low-carb, low-carbon stuff for most of human history? Aren’t they still trying it in Sub-Saharan Africa? Do we want dictatorial one-child policies in an era of “choice” and aging pensioners?
Some aren’t sure this meatless diet craze is crazy. They claim the link between climate change and raising animals for meat is borne out by Earth history. According to a Texas paleontologist, dung and flatulence from herds of hadrosaurs, the Cretaceous equivalent to modern cattle, could have contributed to Arctic warming 70 million years ago. Other scientists say the hypothesis is a load of coprolite.
Nearly 2,000 animal species “are fleeing global warming by heading north much faster than they were less than a decade ago,” asserts new “research” just published in the once-credible journal Science. The opportunistic species are moving at the breakneck speed of “about a mile a year,” intrepid climate-chaos promoter Seth Borenstein anxiously noted in his AP wire story.
The situation could quickly reverse if reduced solar activity and the past two years’ frigid Northern Hemisphere winters become the new norm. But neither Science nor the AP mentioned that or explained how the current migrations of opportunistic plants and critters differ from what’s been happening since the last Pleistocene glaciers retreated and the Little Ice Age ended.
Instead, we’ve been repeatedly treated to amusingly convoluted backpedaling from earlier pronouncements that ski resorts will be a thing of the past and “children just aren’t going to remember what snow is.” Now we’re told that global warming can worsen winters and increase snowfalls. In fact, as one Greenpeace activist explained, “Global warming can mean colder. It can mean wetter. It can mean drier. That’s what we’re talking about.”
Actually, what we’re talking about is Earth’s constantly changing weather and climate caused – not by hydrocarbon use – but by complex, chaotic, unpredictable atmospheric, oceanic, solar, planetary and other forces whose interactions and effects scientists are only beginning to understand. To respond adequately to them, we need building, heating, air conditioning and other technologies to adapt to, cope with, and protect our lives and property against those forces – and the prosperity to afford those technologies.
Unfortunately, anti-hydrocarbon policies, laws and regulations (often driven by alarmist climate “research” and horror stories) are making it increasingly difficult to address those needs. Rather than developing our nation’s own vast natural and human resources, America is wasting billions on politically correct technologies and companies, like Evergreen Solar, which got $486 million in taxpayer subsidies before going belly-up this month. As Al Gore likes to say, that is unsustainable.
Meanwhile, a steady stream of headline-grabbing “studies” continues to power the climate scare and renewable energy gravy train. Retired professor John Brignell’s website (http://www.numberwatch.co.uk/) catalogues hundreds of things that have supposedly been caused by global warming.
A new taxpayer-funded NASA/Penn State “scientific” study warns that “ecosystem-valuing universalist” (really “green”) aliens might realize that we have been altering “the chemical composition of Earth’s atmosphere,” conclude that we have “ecological destructive tendencies,” and “wipe humanity out in order to preserve the Earth system as a whole.” (And you thought James Hansen and Michael Mann were the only loons collecting big bucks at these institutions of “vital research” and “higher education.”)
This interminable pessimism undoubtedly prompted climate activist Danny Bloom to marry his “longtime companion and love of his life: Mother Earth” – in a charming ceremony officiated by an online justice of the peace. Perhaps he can consummate his marriage, using one of “the first-ever eco-friendly luxury condoms,” which were developed by “two French aristocrats” and introduced in the USA just in time for Valentine’s Day 2011. Unlike other condom manufacturers, the Original Condom Company is “extremely eco aware and makes every effort to cover [its] carbon footprint.”
These attention-getting stunts may not save the planet. But responsible citizens may be able to save the republic, by helping Congress, the White House and their “debt committee” find a few places where tens of billions are being wasted on excess bureaucrats, bogus research, useless reports and destructive policies.
President Reagan once observed that, if politics is the second oldest profession, it bears a striking resemblance to the first. A corollary might be that, even if the perpetrators are wearing eco-friendly luxury condoms, most citizens don’t like getting screwed by elected officials and unelected bureaucrats.
With Congress home for more fact-finding meetings with constituents, citizens have a perfect opportunity to send a powerful message. Let’s make the most of it.
* Paul Driessen is senior policy advisor for the Committee For A Constructive Tomorrow and Congress of Racial Equality, and author of Eco-Imperialism: Green power – Black death.
US: Looking Forward to Gold – The New York Sun
As President Obama was getting set to address the Congress in respect of jobs, our attention was on James Grant. We’re adding the editor of Grant’s Interest Rate Observer to our list of sages who can articulate the case for monetary reform in the spirit of Charles de Gaulle. Mr. Grant didn’t summon 1,000 journalists to a press conference in the salle des fêtes the way the president of the Fifth French Republic did in 1965. He went him one better, appearing this week on an edition of CNBC’s Squawk Box devoted to the gold standard. The result is a memorable piece of journalism and yet another instance of an important institution — CNBC — taking a new and suddenly more respectful look at what has in recent decades been set down as, to use Keynes’ phrase, a barbarous relic.
The show starts off with a piece by the CNBC graphics department that traces the history of the gold standard. It is followed by the host, Steve Liesman, senior economics correspondent of CNBC, who tells the audience that he “came into this with the economic orthodoxy that a gold standard is a stupid thing.” Now, however, he is “a little more in the middle on this.” This reflects a newsworthy phenomenon: people are rapidly getting ahead of the government on this question. We have this sense not only from our own interviews but from the astonishing poll rankings of, say, Congressman Ron Paul, whose entire career has been centered on a campaign for sound money.
This in and of itself is not surprising. The dollar, after all, is in a historic collapse, its value having plunged to well less than an 1,800th of an ounce of gold. What is surprising — or, if not that, gratifying — is the raptness of the attention now being paid to a journalist like Mr. Grant, who offers the gold standard not in absolute terms but as “the least imperfect monetary arrangement available.”
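The scale of that collapse is easy to rough out; the two prices below are round numbers we supply for illustration, the old $35 Bretton Woods peg and roughly what gold fetched as this was written:

```python
# Rough sketch of the dollar's decline measured in gold.
# Round illustrative prices: the old $35 Bretton Woods peg and
# gold's approximate market price in late 2011.

peg_1971 = 35.0      # dollars per troy ounce before the gold window closed
price_2011 = 1800.0  # approximate dollars per troy ounce in 2011

gold_then = 1 / peg_1971    # ounces one dollar bought in 1971 (~0.0286 oz)
gold_now = 1 / price_2011   # ounces one dollar buys now (~0.00056 oz)

print(f"Decline in the dollar's gold value: {1 - gold_now / gold_then:.1%}")
# -> Decline in the dollar's gold value: 98.1%
```

By that yardstick the greenback retains less than a fiftieth of its Bretton Woods gold weight.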
Quoth Mr. Grant: “If one were interested in the following properties for a monetary standard, one might look to a gold standard. For example, you’d want something that is synchronous, that is reciprocal, that links . . .” “All nations together,” Mr. Liesman interjected. “Rather than what is happening now, which is that the G20 seems to be really fraying really at the center . . .”
“You’d want a monetary system that is objective, in which there were certain known rules. Now we have an improvisational one, with the mandarins at our Federal Reserve making stuff up as they go along … perfumed clouds of algebra and differential calculus … what are they talking about?”
Here Mr. Liesman interjects again: “Right now, when you read the Fed minutes, when you listen to all the Fed talkers, it is something again that recommends the gold standard. Because what is the monetary policy? It is the policy that is the compromise of the Fed officials and all their disparate views.”
“It is a compromise,” Mr. Grant responded, “among people who really seem to have no first or fixed principles. If you read Bernanke’s speech at Jackson Hole, he begins to tell us what the Treasury ought to do about the deficit, he begins to tell us what Congress might do to improve the process of budgeting, he tells us that we are deficient in K through 12 educational standards. What does the secretary of education think about QE3? I want to know.”
Mr. Liesman greets this point with an appreciative guffaw, then a cackle, which strikes us as a sobering moment for Messrs. Bernanke and Co. We are at a point where a major speech by a Fed chairman is met with merriment and laughter by one of our most serious journalists. Mr. Grant returned the laughter with this straightforward point: “So what we want it seems to me is a monetary system that is objective, that we can understand, that has something at its bottom, as its root something that we can recognize as money. Gold is recognized as money most places on the planet.”
* * *
We wouldn’t want to get so far up on our high horse that we fail to acknowledge that Mr. Bernanke has some things he could do to cause value to flow back into the Federal Reserve notes he has been issuing. Chairman Volcker showed us that in the 1980s. But it’s going to be hard for Mr. Bernanke absent a president to work with who has the vision of the Reagan who stood in the White House for much of the period when Mr. Volcker ran the Fed. The irony is that President Obama will shortly be giving a speech on jobs. We are at a moment in which, as Ralph Benko, among others, has been pointing out, sound money is the most important plank of a real jobs program. This has so far eluded the Obama administration, and the chorus grows for the kind of profound, era-changing reform that is being called for by Mr. Grant and those who are prepared to move forward to a gold standard.