Don’t go to Grad School

Graduate School in the Humanities: Just Don’t Go

By Thomas H. Benton

Nearly six years ago, I wrote a column called “So You Want to Go to Grad School?” (The Chronicle, June 6, 2003). My purpose was to warn undergraduates away from pursuing Ph.D.’s in the humanities by telling them what I had learned about the academic labor system from personal observation and experience.

It was a message many prospective graduate students were not getting from their professors, who were generally too eager to clone themselves. Having heard rumors about unemployed Ph.D.’s, some undergraduates would ask about job prospects in academe, only to be told, “There are always jobs for good people.” If the students happened to notice the increasing numbers of well-published, highly credentialed adjuncts teaching part time with no benefits, they would be told, “Don’t worry, massive retirements are coming soon, and then there will be plenty of positions available.” The encouragement they received from mostly well-meaning but ill-informed professors was bolstered by the message in our culture that education always leads to opportunity.

All these years later, I still get letters from undergraduates who stumble onto that column. They tell me about their interests and accomplishments and ask whether they should go to graduate school, somehow expecting me to encourage them. I usually write back, explaining that in this era of grade inflation (and recommendation inflation), there’s an almost unlimited supply of students with perfect grades and glowing letters. Of course, some doctoral program may admit them with full financing, but that doesn’t mean they are going to find work as professors when it’s all over. The reality is that fewer than half of all doctorate holders — after nearly a decade of preparation, on average — will ever find tenure-track positions.

The follow-up letters I receive from those prospective Ph.D.’s are often quite angry and incoherent; they’ve been praised their whole lives, and no one has ever told them that they may not become what they want to be, that higher education is a business that does not necessarily have their best interests at heart. Sometimes they accuse me of being threatened by their obvious talent. I assume they go on to find someone who will tell them what they want to hear: “Yes, my child, you are the one we’ve been waiting for all our lives.” It can be painful, but it is better that undergraduates considering graduate school in the humanities should know the truth now, instead of when they are 30 and unemployed, or worse, working as adjuncts at less than the minimum wage under the misguided belief that more teaching experience and more glowing recommendations will somehow open the door to a real position.

Most undergraduates don’t realize that there is a shrinking percentage of positions in the humanities that offer job security, benefits, and a livable salary (though it is generally much lower than salaries in other fields requiring as many years of training). They don’t know that you probably will have to accept living almost anywhere, and that you must also go through a six-year probationary period at the end of which you may be fired for any number of reasons and find yourself exiled from the profession. They seem to think becoming a humanities professor is a reliable prospect — a more responsible and secure choice than, say, attempting to make it as a freelance writer, or an actor, or a professional athlete — and, as a result, they don’t make any fallback plans until it is too late.

I have found that most prospective graduate students have given little thought to what will happen to them after they complete their doctorates. They assume that everyone finds a decent position somewhere, even if it’s “only” at a community college (expressed with a shudder). Besides, the completion of graduate school seems impossibly far away, so their concerns are mostly focused on the present. Their motives are usually some combination of the following:

  • They are excited by some subject and believe they have a deep, sustainable interest in it. (But ask follow-up questions and you find that it is only deep in relation to their undergraduate peers — not in relation to the kind of serious dedication you need in graduate programs.)
  • They received high grades and a lot of praise from their professors, and they are not finding similar encouragement outside of an academic environment. They want to return to a context in which they feel validated.
  • They are emerging from 16 years of institutional living: a clear, step-by-step process of advancement toward a goal, with measured outcomes, constant reinforcement and support, and clearly defined hierarchies. The world outside school seems so unstructured, ambiguous, difficult to navigate, and frightening.
  • With the prospect of an unappealing, entry-level job on the horizon, life in college becomes increasingly idealized. They think graduate school will continue that romantic experience and enable them to stay in college forever as teacher-scholars.
  • They can’t find a position anywhere that uses the skills on which they most prided themselves in college. They are forced to learn about new things that don’t interest them nearly as much. No one is impressed by their knowledge of Jane Austen. There are no mentors to guide and protect them, and they turn to former teachers for help.
  • They think that graduate school is a good place to hide from the recession. They’ll spend a few years studying literature, preferably on a fellowship, and then, if academe doesn’t seem appealing or open to them, they will simply look for a job when the market has improved. And, you know, all those baby boomers have to retire someday, and when that happens, there will be jobs available in academe.

I know I experienced all of those motivations when I was in my early 20s. The year after I graduated from college (1990) was a recession year, and the best job I could find was selling memberships in a health club, part time, in a shopping mall in Philadelphia. A graduate fellowship was an escape that landed me in another city — Miami — with at least enough money to get by. I was aware that my motives for going to graduate school came from the anxieties of transitioning out of college and my difficulty finding appealing work, but I could justify it in practical terms for the last reason I mentioned: I thought I could just leave academe if something better presented itself. I mean, someone with a doctorate must be regarded as something special, right?

Unfortunately, during the three years that I searched for positions outside of academe, I found that humanities Ph.D.’s, without relevant experience or technical skills, generally compete at a moderate disadvantage against undergraduates, and at a serious disadvantage against people with professional degrees. If you take that path, you will be starting at the bottom in your 30s, a decade behind your age cohort, with no savings (and probably a lot of debt).

What almost no prospective graduate students can understand is the extent to which doctoral education in the humanities socializes idealistic, naïve, and psychologically vulnerable people into a profession with a very clear set of values. It teaches them that life outside of academe means failure, which explains the large numbers of graduates who labor for decades as adjuncts, just so they can stay on the periphery of academe. (That’s another topic I’ve written about before; see “Is Graduate School a Cult?” The Chronicle, July 2, 2004.)

I fell for the line about faculty retirements that went around back in the early 1990s, thanks to the infamous Bowen and Sosa Report. I still hear that claim today, from people who ought to know better. Even if the long-awaited wave of retirements finally arrives, many of those tenure lines will not be retained, particularly not now, in the context of yet another recession.

Just to be clear: There is work for humanities doctorates (though perhaps not for as many as are currently being produced), but there are fewer and fewer real jobs because of conscious policy decisions by colleges and universities. As a result, the handful of real jobs that remain are being pursued by thousands of qualified people — so many that the minority of candidates who get tenure-track positions might as well be considered the winners of a lottery.

Universities (even those with enormous endowments) have historically taken advantage of recessions to bring austerity to teaching. There will be hiring freezes and early retirements. Rather than replacements, more adjuncts will be hired, and more graduate students will be recruited, eventually flooding the market with even more fully qualified teacher-scholars who will work for almost nothing. When the recession ends, the hiring freezes will become permanent, since departments will have demonstrated that they can function with fewer tenured faculty members.

Nearly every humanities field was already desperately competitive, with hundreds of applications from qualified candidates for every tenure-track position. Now the situation is becoming even worse. For example, the American Historical Association’s job listings are down 15 percent and the Modern Language Association’s listings are down 21 percent, the steepest annual decline ever recorded. Apparently, many already-launched candidate searches are being called off; some responsible observers expect that hiring may be down 40 percent this year.

What is 40 percent worse than desperate?

The majority of job seekers who emerge empty-handed this year will return next year, and for several years after that, and so the competition will snowball, with more and more people chasing fewer and fewer full-time positions.

Meanwhile, more and more students are flattered to find themselves admitted to graduate programs; many are taking on considerable debt to do so. According to the Humanities Indicators Project of the American Academy of Arts and Sciences, about 23 percent of humanities students end up owing more than $30,000, and more than 14 percent owe more than $50,000.

As things stand, I can only identify a few circumstances under which one might reasonably consider going to graduate school in the humanities:

  • You are independently wealthy, and you have no need to earn a living for yourself or provide for anyone else.
  • You come from that small class of well-connected people in academe who will be able to find a place for you somewhere.
  • You can rely on a partner to provide all of the income and benefits needed by your household.
  • You are earning a credential for a position that you already hold — you are, say, a high-school teacher — and your employer is paying for it.

Those are the only people who can safely undertake doctoral education in the humanities. Everyone else who does so is taking an enormous personal risk, the full consequences of which they cannot assess because they do not understand how the academic-labor system works and will not listen to people who try to tell them.

It’s hard to tell young people that universities recognize that their idealism and energy — and lack of information — are an exploitable resource. For universities, the impact of graduate programs on the lives of those students is an acceptable externality, like dumping toxins into a river. If you cannot find a tenure-track position, your university will no longer court you; it will pretend you do not exist and will act as if your unemployability is entirely your fault. It will make you feel ashamed, and you will probably just disappear, convinced that the system was right rather than that the game was rigged from the beginning.

Thomas H. Benton is the pen name of William Pannapacker, an associate professor of English at Hope College, in Holland, Mich. He writes about academic culture and welcomes reader mail directed to his attention at careers@chronicle.com.

Copyright © 2010 The Chronicle of Higher Education

You are what you grow

You Are What You Grow

April 22, 2007

A few years ago, an obesity researcher at the University of Washington named Adam Drewnowski ventured into the supermarket to solve a mystery. He wanted to figure out why it is that the most reliable predictor of obesity in America today is a person’s wealth. For most of history, after all, the poor have typically suffered from a shortage of calories, not a surfeit. So how is it that today the people with the least amount of money to spend on food are the ones most likely to be overweight?

Drewnowski gave himself a hypothetical dollar to spend, using it to purchase as many calories as he possibly could. He discovered that he could buy the most calories per dollar in the middle aisles of the supermarket, among the towering canyons of processed food and soft drink. (In the typical American supermarket, the fresh foods — dairy, meat, fish and produce — line the perimeter walls, while the imperishable packaged goods dominate the center.) Drewnowski found that a dollar could buy 1,200 calories of cookies or potato chips but only 250 calories of carrots. Looking for something to wash down those chips, he discovered that his dollar bought 875 calories of soda but only 170 calories of orange juice.

As a rule, processed foods are more “energy dense” than fresh foods: they contain less water and fiber but more added fat and sugar, which makes them both less filling and more fattening. These particular calories also happen to be the least healthful ones in the marketplace, which is why we call the foods that contain them “junk.” Drewnowski concluded that the rules of the food game in America are organized in such a way that if you are eating on a budget, the most rational economic strategy is to eat badly — and get fat.

This perverse state of affairs is not, as you might think, the inevitable result of the free market. Compared with a bunch of carrots, a package of Twinkies, to take one iconic processed foodlike substance as an example, is a highly complicated, high-tech piece of manufacture, involving no fewer than 39 ingredients, many themselves elaborately manufactured, as well as the packaging and a hefty marketing budget. So how can the supermarket possibly sell a pair of these synthetic cream-filled pseudocakes for less than a bunch of roots?

For the answer, you need look no farther than the farm bill. This resolutely unglamorous and head-hurtingly complicated piece of legislation, which comes around roughly every five years and is about to do so again, sets the rules for the American food system — indeed, to a considerable extent, for the world’s food system. Among other things, it determines which crops will be subsidized and which will not, and in the case of the carrot and the Twinkie, the farm bill as currently written offers a lot more support to the cake than to the root. Like most processed foods, the Twinkie is basically a clever arrangement of carbohydrates and fats teased out of corn, soybeans and wheat — three of the five commodity crops that the farm bill supports, to the tune of some $25 billion a year. (Rice and cotton are the others.) For the last several decades — indeed, for about as long as the American waistline has been ballooning — U.S. agricultural policy has been designed in such a way as to promote the overproduction of these five commodities, especially corn and soy.

That’s because the current farm bill helps commodity farmers by cutting them a check based on how many bushels they can grow, rather than, say, by supporting prices and limiting production, as farm bills once did. The result? A food system awash in added sugars (derived from corn) and added fats (derived mainly from soy), as well as dirt-cheap meat and milk (derived from both). By comparison, the farm bill does almost nothing to support farmers growing fresh produce. A result of these policy choices is on stark display in your supermarket, where the real price of fruits and vegetables between 1985 and 2000 increased by nearly 40 percent while the real price of soft drinks (a.k.a. liquid corn) declined by 23 percent. The reason the least healthful calories in the supermarket are the cheapest is that those are the ones the farm bill encourages farmers to grow.

A public-health researcher from Mars might legitimately wonder why a nation faced with what its surgeon general has called “an epidemic” of obesity would at the same time be in the business of subsidizing the production of high-fructose corn syrup. But such is the perversity of the farm bill: the nation’s agricultural policies operate at cross-purposes with its public-health objectives. And the subsidies are only part of the problem. The farm bill helps determine what sort of food your children will have for lunch in school tomorrow. The school-lunch program began at a time when the public-health problem of America’s children was undernourishment, so feeding surplus agricultural commodities to kids seemed like a win-win strategy. Today the problem is overnutrition, but a school lunch lady trying to prepare healthful fresh food is apt to get dinged by U.S.D.A. inspectors for failing to serve enough calories; if she dishes up a lunch that includes chicken nuggets and Tater Tots, however, the inspector smiles and the reimbursements flow. The farm bill essentially treats our children as a human Disposall for all the unhealthful calories that the farm bill has encouraged American farmers to overproduce.

To speak of the farm bill’s influence on the American food system does not begin to describe its full impact — on the environment, on global poverty, even on immigration. By making it possible for American farmers to sell their crops abroad for considerably less than it costs to grow them, the farm bill helps determine the price of corn in Mexico and the price of cotton in Nigeria and therefore whether farmers in those places will survive or be forced off the land, to migrate to the cities — or to the United States. The flow of immigrants north from Mexico since Nafta is inextricably linked to the flow of American corn in the opposite direction, a flood of subsidized grain that the Mexican government estimates has thrown two million Mexican farmers and other agricultural workers off the land since the mid-1990s. (More recently, the ethanol boom has led to a spike in corn prices that has left that country reeling from soaring tortilla prices; linking its corn economy to ours has been an unalloyed disaster for Mexico’s eaters as well as its farmers.) You can’t fully comprehend the pressures driving immigration without comprehending what U.S. agricultural policy is doing to rural agriculture in Mexico.

And though we don’t ordinarily think of the farm bill in these terms, few pieces of legislation have as profound an impact on the American landscape and environment. Americans may tell themselves they don’t have a national land-use policy, that the market by and large decides what happens on private property in America, but that’s not exactly true. The smorgasbord of incentives and disincentives built into the farm bill helps decide what happens on nearly half of the private land in America: whether it will be farmed or left wild, whether it will be managed to maximize productivity (and therefore doused with chemicals) or to promote environmental stewardship. The health of the American soil, the purity of its water, the biodiversity and the very look of its landscape owe in no small part to impenetrable titles, programs and formulae buried deep in the farm bill.

Given all this, you would think the farm-bill debate would engage the nation’s political passions every five years, but that hasn’t been the case. If the quinquennial antidrama of the “farm bill debate” holds true to form this year, a handful of farm-state legislators will thrash out the mind-numbing details behind closed doors, with virtually nobody else, either in Congress or in the media, paying much attention. Why? Because most of us assume that, true to its name, the farm bill is about “farming,” an increasingly quaint activity that involves no one we know and in which few of us think we have a stake. This leaves our own representatives free to ignore the farm bill, to treat it as a parochial piece of legislation affecting a handful of their Midwestern colleagues. Since we aren’t paying attention, they pay no political price for trading, or even selling, their farm-bill votes. The fact that the bill is deeply encrusted with incomprehensible jargon and prehensile programs dating back to the 1930s makes it almost impossible for the average legislator, much less the average citizen, to understand, should he or she try to. It’s doubtful this is an accident.

But there are signs this year will be different. The public-health community has come to recognize it can’t hope to address obesity and diabetes without addressing the farm bill. The environmental community recognizes that as long as we have a farm bill that promotes chemical and feedlot agriculture, clean water will remain a pipe dream. The development community has woken up to the fact that global poverty can’t be fought without confronting the ways the farm bill depresses world crop prices. It got a boost from a 2004 ruling by the World Trade Organization that U.S. cotton subsidies are illegal; most observers think that challenges to similar subsidies for corn, soy, wheat or rice would also prevail.

And then there are the eaters, people like you and me, increasingly concerned, if not restive, about the quality of the food on offer in America. A grass-roots social movement is gathering around food issues today, and while it is still somewhat inchoate, the manifestations are everywhere: in local efforts to get vending machines out of the schools and to improve school lunch; in local campaigns to fight feedlots and to force food companies to better the lives of animals in agriculture; in the spectacular growth of the market for organic food and the revival of local food systems. In great and growing numbers, people are voting with their forks for a different sort of food system. But as powerful as the food consumer is — it was that consumer, after all, who built a $15 billion organic-food industry and more than doubled the number of farmers’ markets in the last few years — voting with our forks can advance reform only so far. It can’t, for example, change the fact that the system is rigged to make the most unhealthful calories in the marketplace the only ones the poor can afford. To change that, people will have to vote with their votes as well — which is to say, they will have to wade into the muddy political waters of agricultural policy.

Doing so starts with the recognition that the “farm bill” is a misnomer; in truth, it is a food bill and so needs to be rewritten with the interests of eaters placed first. Yes, there are eaters who think it in their interest that food just be as cheap as possible, no matter how poor the quality. But there are many more who recognize the real cost of artificially cheap food–to their health, to the land, to the animals, to the public purse. At a minimum, these eaters want a bill that aligns agricultural policy with our public-health and environmental values, one with incentives to produce food cleanly, sustainably and humanely. Eaters want a bill that makes the most healthful calories in the supermarket competitive with the least healthful ones. Eaters want a bill that feeds schoolchildren fresh food from local farms rather than processed surplus commodities from far away. Enlightened eaters also recognize their dependence on farmers, which is why they would support a bill that guarantees the people who raise our food not subsidies but fair prices. Why? Because they prefer to live in a country that can still produce its own food and doesn’t hurt the world’s farmers by dumping its surplus crops on their markets.

The devil is in the details, no doubt. Simply eliminating support for farmers won’t solve these problems; overproduction has afflicted agriculture since long before modern subsidies. It will take some imaginative policy making to figure out how to encourage farmers to focus on taking care of the land rather than all-out production, on growing real food for eaters rather than industrial raw materials for food processors and on rebuilding local food economies, which the current farm bill hobbles. But the guiding principle behind an eater’s farm bill could not be more straightforward: it’s one that changes the rules of the game so as to promote the quality of our food (and farming) over and above its quantity.

Such changes are radical only by the standards of past farm bills, which have faithfully reflected the priorities of the agribusiness interests that wrote them. One of these years, the eaters of America are going to demand a place at the table, and we will have the political debate over food policy we need and deserve. This could prove to be that year: the year when the farm bill became a food bill, and the eaters at last had their say.

Copyright © Michael Pollan

So we’re not running out of oil?

Plentiful Petroleum

by George Giles

As the first decade of the twenty-first century draws to a close, it is worth reviewing topics that may impact every man, woman and child. The disastrous Bush Administration pulled the legs out from under the baby boomers’ retirement through its inability to veto any spending bill; it even crippled the future of the unborn. Lil’ Bush launched a global war on an adjective (terrorism) without any strategic objective; previous wars at least had nouns as their subjects. Social spending increased as well, not to mention the subversion of the Constitution and the Bill of Rights. Obama has picked up right where Lil’ Bush left off. Immense deficits and a monstrous national debt are threatening to consume us. Yet as awful as this scenario is, these remain abstractions to Joe Public, things that play little or no part in his daily life. What about the impending eco-doom and the claim that we are running out of oil? Those, at least, are concepts the benighted masses can reify for themselves.

The conventional wisdom is that the burning of fossil fuels is leading to a global catastrophe that must be averted. At the same time, the story goes, we are running out of oil so quickly that oil production will plummet any day now, sending the American lifestyle and economy into a tailspin from which they will never recover. This phenomenon, called Peak Oil, means that oil production has peaked and that existing fields are near depletion, with no significant new oil fields being found.

I have beaten Al Gore like a government mule on multiple occasions in these hallowed pages. Al Gore is the Teflon ecologist: none of his lies, misrepresentations, and twistings of fact stick to him. Even for a politician he sets records for mendacity; Al lies like a rug. No longer in office, this crank ecologist managed to get a Nobel Prize without actually knowing anything about climate science. Al’s mantra must be never to let the truth interfere with a good story. The man is unbounded; there is literally nothing he will not take credit for (creating the Internet, saving the whole planet, blah blah blah …), with no more credentials than being a rich man’s son and dropping out of graduate school. Not exactly the credentials most Nobel laureates possess.

I have written about peak oil before in LRC. The most recent piece can be found here. The basic thesis is quite simple: we are not running out of oil. Hubbert is as wrong as wrong can be. The fact backing up this assertion is simple: every year for the past 60 years has ended with more petroleum reserves than it began with. That is certainly an odd phenomenon for something that is supposedly dooming the entire planet, and it is pretty much the exact opposite of what Hubbert forecast. Let’s get some reification: in December of 2009, world petroleum reserves were estimated to be 1.31 trillion barrels, aka 1,310,000,000,000 of them. A barrel as a concept is only useful to a petroleum engineer; a barrel is 42 gallons. So 1.31 trillion barrels is 55,020,000,000,000 gallons, aka roughly 55 trillion gallons, which is actually a whole lot. Experts will say that reserve estimates are crude (no pun intended) at best, and that the variance can be as large as 10% or more. Ecologists always round these numbers down, presaging disaster, and marketing people will round them up. Nevertheless, entrepreneurs continue to explore, which is a powerful statement by the free market about future profit potential. Petroleum exploration is very expensive, using techniques right at the forefront of science and engineering.
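
Readers who want to check that arithmetic can do so in a few lines of Python. This is only a back-of-the-envelope sketch: the reserve figure and the 10% variance band are the estimates quoted above, not independent data.

    # Back-of-the-envelope check of the reserve arithmetic quoted above.
    barrels = 1.31e12            # estimated world reserves, December 2009
    gallons_per_barrel = 42      # one oil barrel = 42 U.S. gallons

    gallons = barrels * gallons_per_barrel
    print(f"{gallons:,.0f} gallons")    # 55,020,000,000,000 (~55 trillion)

    # The author's stated +/-10% uncertainty on reserve estimates:
    print(f"{gallons * 0.9:,.0f} to {gallons * 1.1:,.0f} gallons")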

Even if skeptics quibble about the number I have chosen, the point is still obvious: how can we be running out of something we always seem to have more of, even as we consume roughly 85 million barrels of it each and every day? The balance of justice will tip in my favor: Hubbert is wrong and I am right, at least in the short run. Clearly there is not an infinite reserve of petroleum, because everything is finite in the very long run.

The gas giant planets (Jupiter, Saturn, Uranus and Neptune) are largely composed of hydrogen and methane, which indicates that crude petroleum feedstock is abundant in our solar system. Thus there is some reason to think that some of it lies in the interior of our planet and slowly but surely percolates through the rock to the upper layers of the crust. Along the way, methane gets converted into longer-chain molecules as a byproduct of methanogen metabolism. Man sees rock as solid, but scientists know it is porous at the molecular level: a large sponge that retards and filters these migrations but does not stop them, thanks to Archimedes’ Principle, aka buoyancy. Oil explorers do not look for oil directly; they look for the rock formations that trap oil migrating through the crust. This is the abiotic theory of petroleum creation. Thomas Gold raised 25 million dollars (USD) and sank a well six miles into the Swedish bedrock; Sweden is a country without any known petroleum reserves. Gold found methane, methanogens, and longer-chain organic molecules. His excellent book The Deep Hot Biosphere should be on everyone’s reading list, except apparently Al Gore’s.

The earth’s oxygen is not a product of planet formation; it has been, and continues to be, a product of primitive-organism metabolism. Plants and other organisms convert carbon dioxide into the oxygen we breathe. Carbon dioxide (CO2) is food to this ecosystem. In some ecological sense, the burning of petroleum is a symbiotic relationship between man (no other animal consumes the petroleum found in the crust) and the plant world: mankind produces CO2 as a byproduct of petroleum consumption, and the plants respond by producing oxygen.

Global warming is the catastrophe that has been hyped by the media and thus transmogrified into a cultural bogeyman targeting humanity. The story goes like this: global warming is the direct result of increased atmospheric CO2, which is produced by mankind in its insidious combustion of petroleum; global warming will melt the north and south poles, destroying all waterfront property around the world; it will cause new biological threats to develop, as new strains of influenza and other viral and bacterial threats see the additional few parts per million of CO2 as the go signal to run wild. This is of course a pile of crap, but it is indicative of the shallow reasoning environmentalists use. Most of us learned this as a fable in grade school: "Chicken Little." Smokin’ Hot Al Gore runs around the world like a power-drunk Chicken Little searching for ignorant rubes he can save. "Runs" is not accurate: Al flies around the world on private jets like a power-drunk Chicken Little. There are no TSA cavity searches for this vanguard of mankind. Still, Al’s fable always ends the same way: he is anointed king, and the environmental ninnies open up concentration, I mean, reeducation camps. Al was so close to the levers of power that he cannot get it out of his mind. His fever dreams are replete with scenarios in which he acts Presidential and forces people to do the right thing. To date the American people and the Constitution have kept him at bay. He is not deterred; he spews his eco-madness non-stop as an attention-getting tool, lest the public forget who their savior is.

Unfortunately for these assertions, the supporting climate science is actually very thin, even if it is present in copious amounts. It consists mainly of computer simulations of questionable predictive value. These models must be carefully tuned in order to converge at all, and the same tuning can produce many different results, from a perpetual ice-ball planet to a raging inferno similar to Venus. A reputable climate scientist will admit this in professional publications, but such facts always seem to get dropped from the discussion in propaganda venues like Time, Newsweek, the Financial Times, or the Economist. The models can generate many forms of disastrous outcome in glossy four-color graphics. What they cannot do is accurately predict the future climate. No amount of spending on supercomputers (always taxpayer-funded) will change this fact: the climate is an initial-condition boundary-value problem for a system of coupled non-linear partial differential equations, and errors in the initial data can never be fully compensated for. If you read the hyperlinked data, you will see that the problem is in the smoothness (the weather can never be smoothed out).

Temperature data over the last 150 years is a worthless statistic when you consider that this planet has been around for 4.5 billion years, and that complex life emerged only in the last 600 million years. For millions of years the Earth was covered by a massive ice sheet that prevented sunlight from striking the ground. Over the last million years there have been multiple ice ages and warm periods, yet life prevails.

The mendacity of this "science" threatens all of humanity. The third world struggles to get into the first world. It does this by increasing the energy consumption of its daily life, working within some form of hampered market economy. This increase in energy consumption allows it to produce goods like running shoes, trendy fleece jackets, and iPods. If either the global-warming or the peak-petroleum cranks have their way, energy consumption across the planet will decrease. New laws will only serve to strangle the utility that the division of labor provides. This strangulation will be most severe in the developing world; it literally threatens billions.

Socialism kills millions when confined to a single country (China, Russia, and Cambodia); adopted on the world scale, it threatens billions. Democracy is the god that the West is trying to get the whole world to implement. Ecologists and socialists have failed miserably because the body politic knows to reject its eco-prophets at the polling place: Al Gore, John Kerry, the current communist bozos, and Ralph Nader have all been rejected in their manifold attempts to get their hands on the most powerful levers in the world, those of the American Imperial Presidency.

It is the developing world that will be punished the most. It should seem odd that a celebrity like Al Gore can keep a mansion that consumes twenty times the average monthly utilities of Nashville, the town where he lives, and can also fly around the world in private jets to spew his provocative mendacity. His eco-devotee worshippers never question big Al’s carbon footprint, either now or during his eight years as on-deck imperialist to Clinton. Big Al will tell us with a straight face that global cooling is really global warming that has gone into stealth mode, just to fool us. Bill Clinton is now a full-time buffoon and acts like one; he knows where he belongs. Gore takes himself seriously, and thanks to his Nobel status he is very dangerous to mankind.

The Intergovernmental Panel on Climate Change spews questionable science at the entire world through mass-media replication of its mendacious façade of impending doom. Taxpayers around the world get bilked to pay for this science, and its tentacles reach into all other domains via regulation. If the triple threat of climate-change alarmism, peak-petroleum doctrine, and intergovernmental socialism is successful, then billions will suffer and millions will certainly perish.

These clowns, who cannot predict the weather next week, want to predict it forever. And unlike TV weathermen’s forecasts, political predictions turn into laws, thanks to prevaricating politicians like Al Gore. These laws are directly targeted at capitalism, to destroy it. Politicians know that capitalism, via Human Action, is their mortal enemy, as it strips them of all their power. Politicians are just ordinary citizens who have power over others, and it is this power that they crave to wield. Capitalism and political "science" are in a death match to the end, because one cannot exist without destroying the other.

Socialism is the stealth political platform that the peak-oil and climate doomsayers promulgate. It is a thinly veiled menace to the body politic of the first, second and third worlds. Socialism via ecological regulation will punish the taxpayers and workers of the entire planet. The first world can adapt to some extent, as it has shown through all the market crashes, recessions, regulations, wars and market hampering of the twentieth century. It is the developing world that will be damaged the most, and it contains the planet’s most endangered ecosystems. Energy and market transactions are the only way out of the cruel misery of poverty and ignorance. The people of the developing world do not love their children or their cultures any less than the developed world does, and they should be accorded the same chance to prosper.

Let’s make the next decade one in which reason takes the field of political debate and rolls back the twin threats of "Peak Oil" and climate-change socialism. Ignore the Hubberts and Gores of the world. Let us welcome developing nations like China, India and others into the fraternity of peace and prosperity through capitalism.

December 31, 2009

George Giles [send him mail] is the founder of the Gonzo School of Economics, the radical branch of Austrian Economic Theory. In 1972, at age 17, he was the youngest Republican ever elected (you could be elected at 17 if the office was not assumed until after you turned 18). It took only three months of local GOP meetings to make him a virulent Libertarian ever after.

Copyright © 2009 by LewRockwell.com. Permission to reprint in whole or in part is gladly granted, provided full credit is given.


Dollar = Peso

Killing the Currency

How Barack Obama and Ben Bernanke are destroying the dollar — and perhaps ushering in the amero

By Robert P. Murphy

First under the Bush Administration and even more so under President Obama, the federal government has been seizing power and spending money as it hasn’t done since World War II. But as bold as the Executive Branch has been during this financial crisis, the innovations of Fed chairman Ben Bernanke have been literally unprecedented. Indeed, it is entirely plausible that before Obama leaves office, Americans will be using a new currency.

Bush and Obama have engaged in record peacetime deficit spending; so too did Herbert Hoover and then Franklin Roosevelt (even though in the 1932 election campaign, FDR promised Americans a balanced budget). Bush and Obama approved massive federal interventions into the financial sector, at the behest of their respective Treasury secretaries. Believe it or not, in 1932 the allegedly “do-nothing” Herbert Hoover signed off on the creation of the Reconstruction Finance Corporation (RFC), which was given billions of dollars to prop up unsound financial institutions and make loans to state and local governments. And as with so many other elements of the New Deal, FDR took over and expanded the RFC that had been started under Hoover.

In the past year, the government has seized control of more than half of the nation’s mortgages, it has taken over one of the world’s biggest insurers, it literally controls major car companies, and it is now telling financial institutions how much they can pay their top executives. On top of this, the feds are seeking vast new powers over the nation’s energy markets (through the House Waxman-Markey “Clean Energy and Security Act” and pending Kerry-Boxer companion bill in the Senate) and, of course, are trying to “reform” health care by creating expansive new government programs.

For anyone who thinks free markets are generally more effective at coordinating resources and workers, these incredible assaults on the private sector from the central government surely must translate into a sputtering economy for years. Any one of the above initiatives would have placed a drag on a healthy economy. But to impose the entire package on an economy that is mired in the worst postwar recession is a recipe for disaster.

Debt and Inflation

Conventional economic forecasts for government tax receipts are far too optimistic. The U.S. Treasury will need to issue far more debt in the coming years than most analysts now realize. Yet even the optimistic forecasts are sobering. For example, in March the Congressional Budget Office projected that the Obama administration’s budgetary plans would lead to a doubling of the federal debt as a share of the economy, from 41 percent of GDP in 2008 to 82 percent of GDP by 2019. The deficit for fiscal year 2009 (which ended Sept. 30) alone was $1.4 trillion. For reference, the entire federal budget was less than $1.4 trillion in the early years of the Clinton administration.

Clearly the U.S. government will be incurring massive new debts in the years to come. The situation looks so grim that economist Jeffrey Hummel has predicted that the Treasury will default on its obligations, just as Russia defaulted on its bonds in 1998. But another scenario involves the Federal Reserve wiping out the real burden of the debt by writing checks out of thin air to buy up whatever notes the Treasury wants to issue.

Many analysts are worried about Fed chairman Ben Bernanke’s actions during the financial crisis; Marc Faber is openly warning of “hyperinflation.” To understand what the fuss is about, consider some facts about our monetary and banking system.

The United States has a fractional reserve banking system. When someone deposits $100 in a checking account, most of that money is lent out again to other bank customers. Only a fraction—typically around 10 percent—needs to be held “on reserve” to back up the $100 balance of the original depositor. A bank’s reserves can consist of either cash in the vault or deposits with the Federal Reserve itself. For example, suppose a given bank has customer checking accounts with a combined balance of $1 billion. Assuming a 10 percent reserve requirement, the bank needs $100 million in reserves. It can satisfy this legal requirement by keeping, say, $30 million in actual cash on hand in its vaults and putting $70 million on deposit in the bank’s account with the Fed.
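
The arithmetic of that example is simple enough to state in a few lines of Python. This is a sketch of the paragraph’s own figures; the flat 10 percent ratio is the article’s simplifying assumption, not the actual regulatory schedule.

    # Sketch of the reserve-requirement example above (10% is the article's
    # simplifying assumption; actual requirements varied by account type).
    reserve_ratio = 0.10
    deposits = 1e9                       # $1 billion in customer checking balances

    required_reserves = deposits * reserve_ratio   # $100 million
    vault_cash = 30e6                    # cash held in the bank's vaults
    deposits_at_fed = 70e6               # the bank's account at the Federal Reserve

    # Both forms count toward satisfying the legal requirement:
    assert vault_cash + deposits_at_fed >= required_reserves
    print(f"required: ${required_reserves:,.0f}")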

Normally, the Fed expands the money supply by engaging in “open market operations.” For example, the Fed might buy $1 billion worth of government bonds from a dealer in the private sector. The Fed adds the $1 billion in bonds to the asset side of its balance sheet, while its liabilities also increase by $1 billion. But Bernanke faces no real constraints on his purchasing decisions. When the Fed buys $1 billion in new bonds, it simply writes a $1 billion check on itself. There is no stockpile of money that gets drained because of the check; the recipient simply deposits the check in his own bank, and the bank in turn sees its reserves on deposit with the Fed go up by $1 billion. In principle, the Fed could write checks to buy every asset in America.
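
A toy double-entry sketch of that open-market purchase might look like the following; the account names are hypothetical stand-ins (the Fed’s actual ledger is far more detailed), and the point is only that both sides of the balance sheet rise together, with no pre-existing stockpile of money being drained.

    # Toy sketch of an open-market purchase. Account names are hypothetical.
    fed = {"bonds": 0.0, "bank_reserves": 0.0}    # assets, liabilities
    dealers_bank = {"reserves_at_fed": 0.0}

    def open_market_purchase(amount):
        fed["bonds"] += amount               # Fed gains the bonds (asset)
        fed["bank_reserves"] += amount       # and owes new reserves (liability)
        dealers_bank["reserves_at_fed"] += amount   # dealer's bank is credited

    open_market_purchase(1e9)                # the article's $1 billion example
    assert fed["bonds"] == dealers_bank["reserves_at_fed"] == 1e9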

Monetary Catastrophe

Since the start of the present financial crisis, the Federal Reserve has implemented extraordinary programs to rescue large institutions from the horrible investments they made during the bubble years. Because of these programs, the Fed’s balance sheet more than doubled from September 2008 to the end of the year, as Bernanke acquired more than a trillion dollars in new holdings in just a few months.

If Bernanke has been so aggressive in creating new money, why haven’t prices skyrocketed at the grocery store? The answer is that banks have chosen to let their reserves with the Fed grow well above the legal minimum. In other words, banks have the legal ability to make new loans to customers, but for various reasons they are choosing not to do so. This chart from the Federal Reserve shows these “excess reserves” in their historical context.

U.S. depository institutions have typically lent out their excess reserves in order to earn interest from their customers. Yet currently the banks are sitting on some $850 billion in excess reserves, because (a) the Fed began paying interest on reserves in October 2008, and (b) the economic outlook is so uncertain that financial institutions wish to remain extremely liquid.

The chart explains why Faber and others are warning about massive price inflation. If and when the banks begin lending out their excess reserves, they will have the legal ability to create up to $8.5 trillion in new money. To understand how significant that number is, consider that right now the monetary aggregate M1—which includes physical currency, traveler’s checks, checking accounts, and other very liquid assets—is a mere $1.7 trillion.

What does all this mean? Quite simply, it means that if Bernanke sits back and does nothing more, he has already injected enough reserves into the financial system to quintuple the money supply held by the public. Even if Bernanke does the politically difficult thing, jacking up interest rates and sucking out half of the excess reserves, there would still be enough slack in the system to triple the money supply.
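
The multiplier arithmetic behind "quintuple" and "triple" works out as follows; this sketch uses the article’s round numbers and its simplified 10 percent reserve assumption, and is an illustration rather than a forecast.

    # Money-multiplier arithmetic behind "quintuple" and "triple", using the
    # article's round numbers and its simplified 10% reserve assumption.
    reserve_ratio = 0.10
    excess_reserves = 850e9              # the article's excess-reserves figure
    m1 = 1.7e12                          # the article's M1 figure

    potential = excess_reserves / reserve_ratio        # $8.5 trillion
    print(potential / m1)                # 5.0 -> "quintuple"

    # If the Fed drained half of the excess reserves:
    print((excess_reserves / 2) / reserve_ratio / m1)  # 2.5 -> roughly "triple"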

The End of the Dollar?

Aware of the above considerations, central banks around the world have been quietly distancing themselves from the U.S. dollar. Over the summer, officials in India, China, and Russia opined publicly on the desirability of a new global financial system, anchored on a basket of currencies or even gold.

We thus have in motion two huge trains of supply and demand, and the result will be an inevitable crash in the value of the dollar. Just as the Federal Reserve is embarking on a massive printing spree, the rest of the world is looking to dump its dollar holdings. It’s impossible to predict the exact timing, but sooner or later the dollar will fall very sharply against commodities and other currencies.

A crashing dollar will translate immediately into huge spikes in the price of gasoline and other basic items tied to the world market. After a lag, prices at Wal-Mart and other stores will also skyrocket, as their reliance on “cheap imports from Asia” will no longer be possible when the price of the dollar against the Chinese yuan falls by half.

The consequences will be so dramatic that what now may sound like a “conspiracy theory” could become possible. Fed officials might use such an opportunity to wean Americans from the U.S. dollar. Influential groups such as the Council on Foreign Relations have discussed the desirability of coordination among the North American governments. For example, CFR president Richard N. Haass wrote in the foreword to a 2005 Task Force report titled “Building a North American Community”:

The Task Force offers a detailed and ambitious set of proposals that build on the recommendations adopted by the three governments [Canada, the U.S., and Mexico] at the Texas summit of March 2005. The Task Force’s central recommendation is establishment by 2010 of a North American economic and security community, the boundaries of which would be defined by a common external tariff and an outer security perimeter.

The “Texas summit of March 2005” refers to the “Security and Prosperity Partnership (SPP) of North America,” which came out of a meeting in Waco, Texas between President George W. Bush, Canadian Prime Minister Paul Martin, and Mexican President Vicente Fox. For the record, the federal government’s website has a special section devoted to refuting the (alleged) myths of the SPP, including the claim that the SPP is a prelude to a North American Union, comparable to the European Union. Yet despite the official protestations to the contrary, the global trend toward ever larger political and monetary institutions is undeniable. And there is a definite logic behind the process: with governments in control of standing armies, the only real check on their power is the ability of their subjects to change jurisdictions. By “harmonizing” tax and regulatory regimes, various countries can extract more from their most productive businesses. And by foisting a fiat currency into the pockets of more and more people, a government obtains steadily greater control over national—or international—wealth.

But if indeed key players had wanted to create a North American Union with a common currency, up till now they would have faced an insurmountable barrier: the American public would never have agreed to turn in their dollars in exchange for a new currency issued by a supranational organization. The situation will be different when the U.S. public endures double-digit price inflation, even as the economy still suffers from the worst unemployment since the Great Depression. Especially if Obama officials frame the problem as an attack on the dollar by foreign speculators, and point to the strength of the euro, many Americans will be led to believe that only a change in currency can save the economy.

For those who consider such a possibility farfetched, remember that one of FDR’s first acts as president was to confiscate monetary gold held by U.S. citizens, under threat of imprisonment and a huge fine. Yet nowadays, that massive crime is described as “taking us off the gold standard” which “untied the Fed’s hands and allowed it to fight the Depression.” The same will be said in future history books, when they explain matter-of-factly the economic crisis that gave birth to the amero.

What Can One Man Do?

If events play out as described, what should average investors do right now to protect themselves? First and most obvious, they should rid themselves of dollar-denominated assets. For example, government and corporate bonds promising to make a fixed stream of dollar payments will all become virtually worthless if huge price inflation occurs. (In contrast, holding U.S. stocks is not a bad idea from the point of view of inflation; a stock entitles the owner to a portion of the revenue stream from a company’s sales, which themselves will rise along with prices in general.)

Second, investors should acquire an emergency stockpile of gold and silver. If and when dollar prices begin shooting through the roof, there will be a lag for most workers: They will see the prices of milk, eggs, and gasoline increasing by the week, yet their paychecks will remain the same for months or longer. And if the dollar crashes in the foreign exchange markets, gold and silver will see their prices (quoted in U.S. dollars) move in the opposite direction: sharply upward.

We can’t know the timing of the impending monetary catastrophe, but it is coming. Smart investors will minimize their dependence on the dollar before it crashes. At this late date, no one should trust the government and media “experts” who assure us that the worst is over.

Robert P. Murphy has a Ph.D. in economics from New York University. He is an economist with the Institute for Energy Research and author of The Politically Incorrect Guide to the Great Depression and the New Deal.