Hard Science of Global Warming

German Physicists Trash Global Warming “Theory”

December 26, 2009

guest article by John O’Sullivan

For any non-scientist interested in the climate debate, there is nothing better than a ready primer to guide you through the complexities of atmospheric physics – the “hardest” science of climatology. Here we outline the essential points made by Dr. Gerhard Gerlich, a respected German physicist, that counter the bogus theory of Anthropogenic Global Warming (AGW).

Before going further, it’s worth bearing in mind that no climatologist ever completed any university course in climatology – that’s how new this branch of science really is. As with any new science, the fall-back position of a cornered AGW proponent is the dreaded “appeal to authority,” where the flustered debater, out of his or her depth, will say, “Well, professor so-and-so says it’s true – so it must be true.” Don’t fall for that proxy tree-ring counter’s gambit any longer. Here is the finest shredding of junk science you will ever read.

In a recently revised and re-published paper, Dr. Gerlich debunks AGW and shows that the IPCC “consensus” atmospheric physics model tying CO2 to global warming is not only unverifiable, but actually violates basic laws of physics, i.e., the First and Second Laws of Thermodynamics. The latest version of this momentous scientific paper appears in the March 2009 edition of the International Journal of Modern Physics B.

The central claims of Dr. Gerlich and his colleague, Dr. Ralf Tscheuschner, include, but are not limited to:

  1. The mechanism of warming in an actual greenhouse is different from the mechanism of warming in the atmosphere; therefore it is not a “greenhouse” effect and should be called something else.
  2. The climate models that predict catastrophic global warming also result in a net heat flow from atmospheric greenhouse gases to the warmer ground, which is in violation of the second law of thermodynamics.

Essentially, any machine which transfers heat from a low-temperature reservoir to a high-temperature reservoir without external work applied cannot exist. If it did, it would be a “perpetual motion machine” – the realm of pure sci-fi.
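To make the second-law bookkeeping explicit, here is the standard textbook entropy argument in our own notation (a generic illustration, not an excerpt from Gerlich and Tscheuschner’s paper): consider a device that moves heat Q from a cold reservoir at temperature T_c to a hot one at T_h with no work input.

```latex
% Total entropy change when heat Q flows from a cold reservoir
% at temperature T_c to a hot reservoir at T_h with no external
% work input:
\Delta S_{\mathrm{total}} = -\frac{Q}{T_c} + \frac{Q}{T_h}
  = Q\,\frac{T_c - T_h}{T_c\,T_h} < 0 \qquad (T_h > T_c)
% The second law requires \Delta S_{\mathrm{total}} \ge 0 for any
% spontaneous process, so such a transfer needs external work:
% that is, a heat pump.
```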

Gerlich and Tscheuschner’s independent theoretical study is detailed in a lengthy (115 pages), mathematically complex (144 equations, 13 data tables, and 32 figures or graphs), and well-sourced (205 references) paper. The German physicists prove that even if CO2 concentrations doubled (a prospect even global warming advocates admit is decades away), the thermal conductivity of air would not change by more than 0.03%. They show that the classic concept of the glass greenhouse wholly fails to replicate the physics of Earth’s climate. They also prove that a greenhouse operates as a “closed” system while the planet works as an “open” system, and that the term “atmospheric greenhouse effect” does not occur in any fundamental work on thermodynamics, physical kinetics, or radiation theory. Throughout their paper the German scientists show how the greenhouse gas theory relies on guesstimates of the physical properties involved to “calculate” a chaotic interplay of factors so numerous and so poorly quantified that it lies beyond the abilities of even the most powerful modern supercomputers.
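For a sense of the scale of that thermal-conductivity figure, here is a back-of-envelope sketch. The linear mole-fraction mixing rule and the room-temperature handbook values are our illustrative assumptions, not numbers taken from the paper:

```python
# Back-of-envelope: how much can doubling CO2 change the thermal
# conductivity of air? Assumes a simple linear mole-fraction
# mixing rule and approximate handbook values near 300 K (both
# are illustrative assumptions, not figures from the paper).

K_AIR = 0.026   # W/(m*K), dry air at roughly room temperature
K_CO2 = 0.0166  # W/(m*K), carbon dioxide

def mixture_conductivity(x_co2: float) -> float:
    """Linear mole-fraction blend of background air and CO2."""
    return (1.0 - x_co2) * K_AIR + x_co2 * K_CO2

k_now     = mixture_conductivity(385e-6)  # ~385 ppm, c. 2009
k_doubled = mixture_conductivity(770e-6)  # doubled concentration

change = (k_doubled - k_now) / k_now
print(f"Relative change: {change:.4%}")   # about -0.014%
```

On these assumptions the change comes out at roughly 0.014%, comfortably inside the 0.03% bound quoted above.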

The paper’s introduction states it neatly:

(a) there are no common physical laws between the warming phenomenon in glass houses and the fictitious atmospheric greenhouse effects, (b) there are no calculations to determine an average surface temperature of a planet, (c) the frequently mentioned difference of 33 degrees Celsius is a meaningless number calculated wrongly, (d) the formulas of cavity radiation are used inappropriately, (e) the assumption of a radiative balance is unphysical, (f) thermal conductivity and friction must not be set to zero: the atmospheric greenhouse conjecture is falsified.

This thorough debunking of the theory of man-made warming disproves the existence of any mechanism whereby carbon dioxide in the cooler upper atmosphere exerts a thermal “forcing” effect on the warmer surface below. To do so would violate both the First and Second Laws of Thermodynamics. As there is no glass roof on the earth to trap the excess heat, it escapes upward into space. Thus we may conclude that the common-sense axioms are preserved: the deeper the ocean, the colder the water; and heat rises, it does not fall. QED.

John O’Sullivan is a legal advocate and writer who for several years has litigated in government corruption and conspiracy cases in both the US and Britain. Visit his website.

Conservatism isn’t really about small government

The Intellectual Incoherence of Conservatism

Hans-Hermann Hoppe

Modern conservatism, in the United States and Europe, is confused and distorted. Under the influence of representative democracy and with the transformation of the U.S. and Europe into mass democracies from World War I onward, conservatism was transformed from an anti-egalitarian, aristocratic, anti-statist ideological force into a movement of culturally conservative statists: the right wing of the socialists and social democrats.

Most self-proclaimed contemporary conservatives are concerned, as they should be, about the decay of families, divorce, illegitimacy, loss of authority, multiculturalism, social disintegration, sexual libertinism, and crime. All of these phenomena they regard as anomalies and deviations from the natural order, or what we might call normalcy.

However, most contemporary conservatives (at least most of the spokesmen of the conservative establishment) either do not recognize that their goal of restoring normalcy requires the most drastic, even revolutionary, antistatist social changes, or (if they know about this) they are engaged in betraying conservatism’s cultural agenda from inside in order to promote an entirely different agenda.

That this is largely true for the so-called neoconservatives does not require further explanation here. Indeed, as far as their leaders are concerned, one suspects that most of them are of the latter kind. They are not truly concerned about cultural matters but recognize that they must play the cultural-conservatism card so as not to lose power as they promote their entirely different goal of global social democracy.1 The fundamentally statist character of American neoconservatism is best summarized by a statement by one of its leading intellectual champions, Irving Kristol:

“[T]he basic principle behind a conservative welfare state ought to be a simple one: wherever possible, people should be allowed to keep their own money—rather than having it transferred (via taxes) to the state—on the condition that they put it to certain defined uses.” [Two Cheers for Capitalism, New York: Basic Books, 1978, p. 119].

This view is essentially identical to that held by modern, post-Marxist European Social-Democrats. Thus, Germany’s Social Democratic Party (SPD), for instance, in its Godesberg Program of 1959, adopted as its core motto the slogan “as much market as possible, as much state as necessary.”

A second, somewhat older but nowadays almost indistinguishable branch of contemporary American conservatism is represented by the new (post-World War II) conservatism launched and promoted, with the assistance of the CIA, by William Buckley and his National Review. Whereas the old (pre-World War II) American conservatism had been characterized by decidedly anti-interventionist foreign policy views, the trademark of Buckley’s new conservatism has been its rabid militarism and interventionist foreign policy.

In an article, “A Young Republican’s View,” published in Commonweal on January 25, 1952, three years before the launching of his National Review, Buckley thus summarized what would become the new conservative credo: In light of the threat posed by the Soviet Union, “we [new conservatives] have to accept Big Government for the duration—for neither an offensive nor a defensive war can be waged . . . except through the instrument of a totalitarian bureaucracy within our shores.”

Conservatives, Buckley wrote, were duty-bound to promote “the extensive and productive tax laws that are needed to support a vigorous anti-Communist foreign policy,” as well as the “large armies and air forces, atomic energy, central intelligence, war production boards and the attendant centralization of power in Washington.”

Not surprisingly, since the collapse of the Soviet Union in the early 1990s, essentially nothing in this philosophy has changed. Today, the continuation and preservation of the American welfare-warfare state is simply excused and promoted by new and neo-conservatives alike with reference to other foreign enemies and dangers: China, Islamic fundamentalism, Saddam Hussein, “rogue states,” and the threat of “global terrorism.”

However, it is also true that many conservatives are genuinely concerned about family disintegration or dysfunction and cultural decline. I am thinking here in particular of the conservatism represented by Patrick Buchanan and his movement. Buchanan’s conservatism is by no means as different from that of the conservative Republican party establishment as he and his followers imagine. In one decisive respect their brand of conservatism is in full agreement with that of the conservative establishment: both are statists. They differ over what exactly needs to be done to restore normalcy to the U.S., but they agree that it must be done by the state. There is not a trace of principled antistatism in either.

Let me illustrate by quoting Samuel Francis, who was one of the leading theoreticians and strategists of the Buchananite movement. After deploring “anti-white” and “anti-Western” propaganda, “militant secularism, acquisitive egoism, economic and political globalism, demographic inundation, and unchecked state centralism,” he expounds on a new spirit of “America First,” which “implies not only putting national interests over those of other nations and abstractions like ‘world leadership,’ ‘global harmony,’ and the ‘New World Order,’ but also giving priority to the nation over the gratification of individual and subnational interests.”

How does he propose to fix the problem of moral degeneration and cultural decline? There is no recognition that the natural order in education means that the state has nothing to do with it. Education is entirely a family matter and ought to be produced and distributed in cooperative arrangements within the framework of the market economy.

Moreover, there is no recognition that moral degeneracy and cultural decline have deeper causes and cannot simply be cured by state-imposed curriculum changes or exhortations and declamations. To the contrary, Francis proposes that the cultural turn-around—the restoration of normalcy—can be achieved without a fundamental change in the structure of the modern welfare state. Indeed, Buchanan and his ideologues explicitly defend the three core institutions of the welfare state: Social Security, Medicare, and unemployment subsidies. They even want to expand the “social” responsibilities of the state by assigning to it the task of “protecting,” by means of national import and export restrictions, American jobs, especially in industries of national concern, and of “insulat[ing] the wages of U.S. workers from foreign laborers who must work for $1 an hour or less.”

In fact, Buchananites freely admit that they are statists. They detest and ridicule capitalism, laissez-faire, free markets and trade, wealth, elites, and nobility; and they advocate a new populist—indeed proletarian—conservatism which amalgamates social and cultural conservatism and socialist economics. Thus, continues Francis,

while the left could win Middle Americans through its economic measures, it lost them through its social and cultural radicalism, and while the right could attract Middle Americans through appeals to law and order and defense of sexual normality, conventional morals and religion, traditional social institutions and invocations of nationalism and patriotism, it lost Middle Americans when it rehearsed its old bourgeois economic formulas.

Hence, it is necessary to combine the economic policies of the left and the nationalism and cultural conservatism of the right, to create “a new identity synthesizing both the economic interests and cultural-national loyalties of the proletarianized middle class in a separate and unified political movement.”2 For obvious reasons this doctrine is not so named, but there is a term for this type of conservatism: It is called social nationalism or national socialism.

(As for most of the leaders of the so-called Christian Right and the “moral majority,” they simply desire the replacement of the current, left-liberal elite in charge of national education by another one, i.e., themselves. “From Burke on,” Robert Nisbet has criticized this posture, “it has been a conservative precept and a sociological principle since Auguste Comte that the surest way of weakening the family, or any vital social group, is for the government to assume, and then monopolize, the family’s historic functions.” In contrast, much of the contemporary American Right “is less interested in Burkean immunities from government power than it is in putting a maximum of governmental power in the hands of those who can be trusted. It is control of power, not diminution of power, that ranks high.”)

I will not concern myself here with the question of whether or not Buchanan’s conservatism has mass appeal and whether or not its diagnosis of American politics is sociologically correct. I doubt that this is the case, and certainly Buchanan’s fate during the 1996 and 2000 Republican presidential primaries does not indicate otherwise. Rather, I want to address the more fundamental questions: Assuming that it does have such appeal; that is, assuming that cultural conservatism and socialist economics can be psychologically combined (that is, that people can hold both of these views simultaneously without cognitive dissonance), can they also be effectively and practically (economically and praxeologically) combined? Is it possible to maintain the current level of economic socialism (social security, etc.) and reach the goal of restoring cultural normalcy (natural families and normal rules of conduct)?

Buchanan and his theoreticians do not feel the need to raise this question, because they believe politics to be solely a matter of will and power. They do not believe in such things as economic laws. If people want something enough, and they are given the power to implement their will, everything can be achieved. The “dead Austrian economist” Ludwig von Mises, to whom Buchanan referred contemptuously during his presidential campaigns, characterized this belief as “historicism,” the intellectual posture of the German Kathedersozialisten, the academic Socialists of the Chair, who justified any and all statist measures.

But historicist contempt and ignorance of economics does not alter the fact that inexorable economic laws exist. You cannot have your cake and eat it too, for instance. Or what you consume now cannot be consumed again in the future. Or producing more of one good requires producing less of another. No wishful thinking can make such laws go away. To believe otherwise can only result in practical failure. “In fact,” noted Mises, “economic history is a long record of government policies that failed because they were designed with a bold disregard for the laws of economics.”3

In light of elementary and immutable economic laws, the Buchananite program of social nationalism is just another bold but impossible dream. No wishful thinking can alter the fact that maintaining the core institutions of the present welfare state and wanting to return to traditional families, norms, conduct, and culture are incompatible goals. You can have one—socialism (welfare)—or the other—traditional morals—but you cannot have both, for social nationalist economics, the pillar of the current welfare state system Buchanan wants to leave untouched, is the very cause of cultural and social anomalies.

In order to clarify this, it is only necessary to recall one of the most fundamental laws of economics which says that all compulsory wealth or income redistribution, regardless of the criteria on which it is based, involves taking from some—the havers of something—and giving it to others—the non-havers of something. Accordingly, the incentive to be a haver is reduced, and the incentive to be a non-haver increased. What the haver has is characteristically something considered “good,” and what the non-haver does not have is something “bad” or a deficiency. Indeed, this is the very idea underlying any redistribution: some have too much good stuff and others not enough. The result of every redistribution is that one will thereby produce less good and increasingly more bad, less perfection and more deficiencies. By subsidizing with tax funds (with funds taken from others) people who are poor, more poverty (bad) will be created. By subsidizing people because they are unemployed, more unemployment (bad) will be created. By subsidizing unwed mothers, there will be more unwed mothers and more illegitimate births (bad), etc.
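The incentive mechanics can be made concrete with a toy calculation (the tax rate, subsidy, and income figures below are our own invented numbers, not Hoppe’s):

```python
# Toy illustration of the redistribution argument (the tax rate,
# subsidy, and income figures are our own invented numbers, not
# Hoppe's): a means-tested subsidy financed by a tax shrinks the
# payoff to being a "haver" of income.

def net_income(earned: float, tax_rate: float,
               subsidy: float, cutoff: float) -> float:
    """After-tax income, plus a subsidy paid only below a cutoff."""
    income = earned * (1.0 - tax_rate)
    if earned < cutoff:
        income += subsidy
    return income

# A "haver" earning 25,000 vs. a "non-haver" earning 9,000,
# under a 30% tax and a 10,000 subsidy paid below 10,000 earned:
print(net_income(25_000, 0.30, 10_000, 10_000))  # 17500.0
print(net_income( 9_000, 0.30, 10_000, 10_000))  # 16300.0
```

On these invented numbers, the person who earns 16,000 more ends up only 1,200 ahead: the scheme has nearly erased the payoff to being a haver, which is exactly the incentive effect described above.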

Obviously, this basic insight applies to the entire system of so-called social security that has been implemented in Western Europe (from the 1880s onward) and the U.S. (since the 1930s): of compulsory government “insurance” against old age, illness, occupational injury, unemployment, indigence, etc. In conjunction with the even older compulsory system of public education, these institutions and practices amount to a massive attack on the institution of the family and personal responsibility.

By relieving individuals of the obligation to provide for their own income, health, safety, old age, and children’s education, the range and temporal horizon of private provision is reduced, and the value of marriage, family, children, and kinship relations is lowered. Irresponsibility, shortsightedness, negligence, illness and even destructionism (bads) are promoted, and responsibility, farsightedness, diligence, health and conservatism (goods) are punished.

The compulsory old age insurance system in particular, by which retirees (the old) are subsidized from taxes imposed on current income earners (the young), has systematically weakened the natural intergenerational bond between parents, grandparents, and children. The old need no longer rely on the assistance of their children if they have made no provision for their own old age; and the young (with typically less accumulated wealth) must support the old (with typically more accumulated wealth) rather than the other way around, as is typical within families.

Consequently, not only do people want to have fewer children—and indeed, birthrates have fallen by half since the onset of modern social security (welfare) policies—but also the respect which the young traditionally accorded to their elders is diminished, and all indicators of family disintegration and malfunctioning, such as rates of divorce, illegitimacy, child abuse, parent abuse, spouse abuse, single parenting, singledom, alternative lifestyles, and abortion, have increased.

Moreover, with the socialization of the health care system through institutions such as Medicaid and Medicare and the regulation of the insurance industry (by restricting an insurer’s right of refusal, that is, the right to exclude any individual risk as uninsurable and to discriminate freely, according to actuarial methods, between different group risks), a monstrous machinery of wealth and income redistribution at the expense of responsible individuals and low-risk groups in favor of irresponsible actors and high-risk groups has been put in motion. Subsidies for the ill, unhealthy and disabled breed illness, disease, and disability and weaken the desire to work for a living and to lead healthy lives. One can do no better than quote the “dead Austrian economist” Ludwig von Mises once more:

being ill is not a phenomenon independent of conscious will. . . . A man’s efficiency is not merely a result of his physical condition; it depends largely on his mind and will. . . . The destructionist aspect of accident and health insurance lies above all in the fact that such institutions promote accident and illness, hinder recovery, and very often create, or at any rate intensify and lengthen, the functional disorders which follow illness or accident. . . . To feel healthy is quite different from being healthy in the medical sense. . . . By weakening or completely destroying the will to be well and able to work, social insurance creates illness and inability to work; it produces the habit of complaining—which is in itself a neurosis—and neuroses of other kinds. . . . As a social institution it makes a people sick bodily and mentally or at least helps to multiply, lengthen, and intensify disease. . . . Social insurance has thus made the neurosis of the insured a dangerous public disease. Should the institution be extended and developed the disease will spread. No reform can be of any assistance. We cannot weaken or destroy the will to health without producing illness.4

I do not wish to explain here the economic nonsense of Buchanan’s and his theoreticians’ even further-reaching idea of protectionist policies (of protecting American wages). If they were right, their argument in favor of economic protection would amount to an indictment of all trade and a defense of the thesis that each family would be better off if it never traded with anyone else. Certainly, in this case no one could ever lose his job, and unemployment due to “unfair” competition would be reduced to zero.

Yet such a full-employment society would not be prosperous and strong; it would be composed of people (families) who, despite working from dawn to dusk, would be condemned to poverty and starvation. Buchanan’s international protectionism, while less destructive than a policy of interpersonal or interregional protectionism, would result in precisely the same effect. This is not conservatism (conservatives want families to be prosperous and strong). This is economic destructionism.

In any case, what should be clear by now is that most if not all of the moral degeneration and cultural decline—the signs of decivilization—all around us are the inescapable and unavoidable results of the welfare state and its core institutions. Classical, old-style conservatives knew this, and they vigorously opposed public education and social security. They knew that states everywhere were intent upon breaking down and ultimately destroying families and the institutions and layers and hierarchies of authority that are the natural outgrowth of family based communities in order to increase and strengthen their own power. They knew that in order to do so states would have to take advantage of the natural rebellion of the adolescent (juvenile) against parental authority. And they knew that socialized education and socialized responsibility were the means of bringing about this goal.

Social education and social security provide an opening for the rebellious youth to escape parental authority (to get away with continuous misbehavior). Old conservatives knew that these policies would emancipate the individual from the discipline imposed by family and community life only to subject him instead to the direct and immediate control of the state.

Furthermore, they knew, or at least had a hunch, that this would lead to a systematic infantilization of society—a regression, emotionally and mentally, from adulthood to adolescence or childhood.

In contrast, Buchanan’s populist-proletarian conservatism—social nationalism—shows complete ignorance of all of this. Combining cultural conservatism and welfare-statism is impossible, and hence, economic nonsense. Welfare-statism—social security in any way, shape or form—breeds moral and cultural decline and degeneration. Thus, if one is indeed concerned about America’s moral decay and wants to restore normalcy to society and culture, one must oppose all aspects of the modern social-welfare state. A return to normalcy requires no less than the complete elimination of the present social security system: of unemployment insurance, social security, Medicare, Medicaid, public education, etc.—and thus the near complete dissolution and deconstruction of the current state apparatus and government power. If one is ever to restore normalcy, government funds and power must dwindle to or even fall below their nineteenth century levels. Hence, true conservatives must be hard-line libertarians (antistatists). Buchanan’s conservatism is false: it wants a return to traditional morality but at the same time advocates keeping the very institutions in place that are responsible for the destruction of traditional morals.

Most contemporary conservatives, then, especially among the media darlings, are not conservatives but socialists—either of the internationalist sort (the new and neoconservative welfare-warfare statists and global social democrats) or of the nationalist variety (the Buchananite populists). Genuine conservatives must be opposed to both. In order to restore social and cultural norms, true conservatives can only be radical libertarians, and they must demand the demolition—as a moral and economic distortion—of the entire structure of the interventionist state.

Hans-Hermann Hoppe is professor of economics at the University of Nevada, Las Vegas. Read and sign the Hoppe Victory Blog. This essay is based on a chapter from Democracy, The God that Failed (2001) that was given as a speech in 1996.

1 On contemporary American conservatism see in particular Paul Gottfried, The Conservative Movement, rev. ed. (New York: Twayne Publishers, 1993); George H. Nash, The Conservative Intellectual Movement in America (New York: Basic Books, 1976); Justin Raimondo, Reclaiming the American Right: The Lost Legacy of the Conservative Movement (Burlingame, Calif.: Center for Libertarian Studies, 1993); see further also chap. 11.

2 Samuel T. Francis, “From Household to Nation: The Middle American Populism of Pat Buchanan,” Chronicles (March 1996): 12-16; see also idem, Beautiful Losers: Essays on the Failure of American Conservatism (Columbia: University of Missouri Press, 1993); idem, Revolution from the Middle (Raleigh, N.C.: Middle American Press, 1997).

3 Ludwig von Mises, Human Action: A Treatise on Economics, Scholar’s Edition (Auburn, Ala.: Ludwig von Mises Institute, 1998), p. 67. “Princes and democratic majorities,” writes Mises, “are drunk with power. They must reluctantly admit that they are subject to the laws of nature. But they reject the very notion of economic law. Are they not the supreme legislators? Don’t they have the power to crush every opponent? No war lord is prone to acknowledge any limits other than those imposed on him by a superior armed force. Servile scribblers are always ready to foster such complacency by expounding the appropriate doctrines. They call their garbled presumptions ‘historical economics.’”

4 Ludwig von Mises, Socialism: An Economic and Sociological Analysis (Indianapolis, Ind.: Liberty Fund, 1981), pp. 431-32.

Science Eats Its Own

Suppression of Science Within Science

by Henry Bauer

I wasn’t as surprised as many others were, when it was revealed that climate-change "researchers" had discussed in private e-mails how to keep important data from public view lest it shake public belief in the dogma that human activities are contributing significantly to global warming.

I wasn’t particularly surprised because just a few weeks earlier I had spoken at the Oakland Rethinking AIDS Conference about the dogmatism and strong-arm tactics that are rampant in a seemingly increasing range of fields of medicine and science. PowerPoint presentations of most of the talks at the Conference are available at the Conference website. Here’s a slightly modified, more readable, text version of my own talk. The theme in a nutshell:

For several centuries, modern science was pretty much a free intellectual market populated by independent entrepreneurs who shared the goal of understanding how the world works. Nowadays it’s a corporate enterprise where patents, pay-offs, prestige, and power take priority over getting at the scientific truth, and the powers-that-be have established knowledge monopolies.

I first met Peter Duesberg in person at the Conference, but I had been quite familiar with him from many videos. What had always stuck in my mind was his expression of surprise, astonishment, sheer disbelief, as he told what happened to him after he questioned whether HIV could be the cause of AIDS:

I had all the students I wanted . . . lab space . . . grants . . . . elected to the National Academy. . . . became California Scientist of the Year. All my papers were published. I could do no wrong . . . professionally . . . until I started questioning . . . that HIV is the cause of AIDS. Then everything changed.

What happened then was that he got no more grants; his manuscripts were rejected without substantive critiques, just that "everyone knows that HIV causes AIDS"; Robert Gallo, who earlier had talked of Duesberg’s distinction as a leading retrovirologist, now publicly called him dishonest on scientific matters. Defenders of the mainstream view have even held Duesberg responsible for the deaths of hundreds of thousands of South Africans and have described him as the moral equivalent of a Holocaust denier.

What had Duesberg done to bring about that radical change?

Absolutely nothing. He was doing science just as before: gathering data, documenting his sources, making his analyses, presenting his conclusions for comment by others. Of course Duesberg was surprised that suddenly he had gone from lauded leading scientist to discredited crackpot.

Of course Duesberg was surprised, because his experience of suddenly being sent beyond the pale was obviously an aberration. Science isn’t like this. Science is done by the objective self-correcting scientific method. Peer review is impersonal and impartial. Arguments are substantive, not ad hominem. This experience must be unprecedented, unique.

Or, perhaps, shared just by other AIDS Rethinkers, because questioning that HIV causes AIDS is just too outrageous, and quite justifiably it puts AIDS "denialists" outside the norms of scientific behavior and discourse. You wouldn’t find anything like this in other, more normal fields of medicine or science.

Well, actually, you would. You do. Duesberg and AIDS Rethinkers are not alone in this. Duesberg’s experience is not unique, it’s even far from unique.

For example, there’s The Skeptical Environmentalist (Cambridge University Press, 2001), in which Bjørn Lomborg discussed global warming and pointed out, documented by more than 500 mainstream source references, that Kyoto-type policies would not reduce warming enough to avoid such major consequences as sea-level rises. Therefore it makes sense to devise adaptations that will be needed in any case, a much better investment than trying to reduce global CO2 emissions.

A rather unremarkable economic argument based solidly on calculations from mainstream data.

So Lomborg was surely just as surprised, astonished, disbelieving, as Duesberg had been, to find that his scholarly discussion placed him beyond the pale of civilized scientific discourse. The Chair of the Intergovernmental Panel on Climate Change asked, "Where is the difference between Lomborg’s view on humans and Hitler’s?" An Australian columnist agreed: "Perhaps there is a case for making climate change denial an offence – it is a crime against humanity, after all." An American environmentalist seconded the notion, writing that there should be "war crimes trials for these bastards – some sort of climate Nuremberg."

Of course those comments were not made in the scientific literature, which doesn’t countenance that sort of character assassination. Or so one might hope. Hope in vain, it turns out, because a book review in Nature (414: 149-50) held that Lomborg’s text employs the strategy of those who . . . argue that gay men aren’t dying of AIDS, that Jews weren’t singled out by the Nazis for extermination. . . .

So global-warming denialism is as much beyond the pale as AIDS denialism. Except that – and perhaps you’ve noticed – Duesberg has never denied that AIDS exists; he just has a different explanation for what caused it. And Lomborg doesn’t deny that global warming is occurring; he doesn’t even question that human activities are contributing significantly to it; he is just making a cost-benefit argument.

Of course, both HIV/AIDS and global warming are matters that involve not just science but public policy and large public expenditures. You wouldn’t find anything like this in a pure science like astronomy or cosmology, would you?

Yes, you would. Yes, you do.

Take cosmology and the Big-Bang theory of the origin of the universe. Halton Arp was a respected, senior American observational astronomer. He noticed that some pairs of quasars that are physically close together nevertheless have very different redshifts. How exciting! Evidently some redshifts are not Doppler effects, in other words, not owing to rapid relative motion away from us. That means the universe-expansion calculations have to be revised. It may not have started as a Big Bang!
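For reference, the conventional Doppler reading of redshift that Arp’s observations call into question is the standard one (notation ours, not Arp’s):

```latex
% Redshift z, defined from observed vs. emitted wavelength; the
% Doppler interpretation reads it as a recession velocity v:
z = \frac{\lambda_{\mathrm{obs}} - \lambda_{\mathrm{emit}}}{\lambda_{\mathrm{emit}}},
\qquad v \approx c\,z \quad (z \ll 1)
% On this reading, two physically associated objects should show
% nearly equal z; very different z in a close pair is exactly the
% anomaly Arp reported.
```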

That’s just the sort of major potential discovery that scientists are always hoping for, isn’t it?

Certainly not in this case. Arp was granted no more telescope time to continue his observations. At age 56, Halton Arp emigrated to Germany to continue his work at the Max Planck Institute for Astrophysics.

But Arp was not alone in his views. Thirty-four senior astronomers from 10 countries, including such stellar figures as Hermann Bondi, Thomas Gold, Amitabha Ghosh, and Jayant Narlikar, sent a letter to Nature pointing out that Big Bang theory

  • relies on a growing number of hypothetical . . . things . . . never observed;
  • that alternative theories can also explain all the basic phenomena of the cosmos;
  • and that, nevertheless, virtually all financial and experimental resources in cosmology go to Big-Bang studies.

Just the sort of discussion that goes on in science all the time, arguing pros and cons of competing ideas.

Except that Nature refused to publish the letter.

It was posted on the Internet, and by now hundreds of additional signatures have been added – just like what happened with the letter the Group for Rethinking AIDS had sent to Nature, Science, the Lancet, and the New England Journal of Medicine, all of which had refused to publish it.

At a mainstream conference on "Outstanding questions for the standard cosmological model," there was not even a mention of the stunningly outstanding question of those anomalous redshifts. So the non-Big-Bang cosmologists organized their own separate meeting – again, like AIDS Rethinkers, or like those who question the mainstream dogma about how to cope with global warming.

For some reason, non-Big-Bang cosmology is as much beyond the pale as AIDS "denial," which isn’t denial, or global warming "denial," which isn’t denial.

Then there’s that most abstract of fundamental sciences, theoretical physics. The problem has long been, How to unify relativity and quantum mechanics? Quantum mechanics regards the world as made up of discrete bits whereas relativity regards the world as governed by continuous, not discrete, fields. Since the mid-1970s, there has been no real progress. Everyone has been working on so-called "string theory," which has delivered no testable conclusions and remains a hope, a speculation, not a real theory. Nevertheless, theoretical physicists who want to look at other approaches can’t find jobs, can’t get grants, can’t get published. (Read Lee Smolin, The Trouble with Physics.)

You begin to wonder, don’t you, how many other cases there could be in science, where a single theory has somehow captured all the resources? And where competent scientists who want to try something different are not only blocked but personally insulted?

Well, there’s the matter of what killed off the dinosaurs. Everyone knows that the dinosaurs were killed off 65 million years ago when an asteroid hit the Earth. Everyone knows that, that is, except the paleontologists, whose specialty this sort of question is supposed to be.

The asteroid theory had been developed by Luis Alvarez, Nobel Laureate in physics, and his son Walter, a geologist. Paleontologist Dewey McLean had earlier developed a detailed theory based on volcanism – it had long been known that tremendous volcanic activity, the "Deccan Traps," had occurred at the relevant time.

Do you think Alvarez engaged McLean in civilized, substantive discussion?

Or would you be surprised to hear that at a conference, Alvarez said to McLean in private: "I’ll wreck your career if you persist." And Alvarez did indeed contact McLean’s university and tried to block McLean’s promotion–I know that for sure because I was Dean of Dewey McLean’s College at the time.

Of course, there’s always been resistance to change in science, as in other human activities. But this degree of suppression of minority views and the use of gutter language and character assassination makes it seem like a new phenomenon. At least it has seemed so to the people who have found themselves suddenly ejected from mainstream discourse and resources.

Arp, Duesberg, Lomborg, McLean and other "denialists" of various mainstream theories are surprised because it isn’t supposed to be like that in science. Lomborg doesn’t know that "AIDS denialists" are treated rather like "global warming denialists." Arp doesn’t know that AIDS and global warming "denialists" have it even worse than those who question the Big Bang. McLean doesn’t know that "denialists" about AIDS, Big-Bang, and global warming also have their careers threatened. Everyone who experiences personally this sort of thing imagines it’s a unique experience, because science isn’t supposed to be like this.

But science nowadays IS like this: Disagree with the conventional contemporary scientific wisdom and you won’t get grants, won’t get published, will be compared to Holocaust deniers.

And it really wasn’t always this way. Nowadays "science," "pure research," has become cutthroat in the extreme, and there’s much corner-cutting and sheer dishonesty in science. For example, NIH newsletters routinely name specific individuals who are being barred from seeking grants for some specified period because of some act of dishonesty.

There was no need, in the good not-so-old days, for a federal Office of Research Integrity–a designation that George Orwell would have relished. But now we do have such an Office, and at colleges there are Centers for Research Ethics, and publishers put out journals like Accountability in Research–there’s a burgeoning young academic industry devoted to telling scientists how to behave properly.

That’s what science has come to. Genuine science, the search for better understanding, has been hijacked by self-interest and vested interests and is now captive to knowledge monopolies and research cartels: A single theory exerts dogmatic control over grants, publications, jobs, promotions.

WHY?? How did this happen?

In a follow-up piece, I’ll describe how we arrived at this New World Order in Science.

December 17, 2009

Henry H. Bauer [send him mail] is Dean Emeritus of Arts & Sciences and Professor Emeritus of Chemistry & Science Studies at Virginia Tech. His books about science and scientific unorthodoxies include Scientific Literacy and the Myth of the Scientific Method (1992), Science or Pseudoscience (2001), and The Origin, Persistence and Failings of HIV/AIDS Theory (2007). He currently writes an HIV Skepticism blog.


End of Science as We Know It

The New World Order in Science

by Henry Bauer

I’m going to sketch a chronology and analysis that draw on the history of several centuries of science and on many volumes written about that. In being concise, I’ll make some very sweeping generalizations without acknowledging necessary exceptions or nuances. But the basic story is solidly in the mainstream of history of science, philosophy of science, sociology of science, and the like, what’s nowadays called "science & technology studies" (STS).

It never was really true, of course, as the conventional wisdom tends even now to imagine, that "the scientific method" guarantees objectivity, that scientists work impersonally to discover truth, that scientists are notably smarter, more trustworthy, more honest, so tied up in their work that they neglect everything else, don’t care about making money . . . But it is true that for centuries scientists weren’t subject to multiple and powerful conflicts of interest. There is no "scientific method." Science is done by people; people aren’t objective. Scientists are just like other professionals – to use a telling contemporary parallel, scientists are professionals just like the wheelers and dealers on Wall Street: not exactly dishonest, but looking out first and foremost for Number One.

"Modern" science dates roughly from the 17th century. It was driven by the sheer curiosity of lay amateurs and the God-worshipping curiosity of churchmen; there was little or no conflict of interest with plain truth-seeking. The truth-seekers formed voluntary associations: academies like the Royal Society of London. Those began to publish what happened at their meetings, and some of those Proceedings and Transactions have continued publication to the present day. These meetings and publications were the first informal steps to contemporary "peer review."

During the 19th century, "scientist" became a profession: one could make a living at it. Research universities were founded, and with that came the inevitable conflict of interest between truth-seeking and career-making, especially since science gained a very high status and one could become famous through success in science. (An excellent account is by David Knight in The Age of Science.)

Still it was pretty much an intellectual free market, in which the entrepreneurs could be highly independent because almost all science was quite inexpensive and there were a multitude of potential patrons and sponsors, circumstances that made for genuine intellectual competition.

The portentous change to "Big Science" really got going in mid-20th century. Iconic of the new circumstances remains the Manhattan Project to produce atomic bombs. Its dramatic success strengthened the popular faith that "science" can do anything, and very quickly, given enough resources. More than half a century later, people still talk about having a "Manhattan Project" to stop global warming, eradicate cancer, whatever.

So shortly after World War II, the National Science Foundation (NSF) was established, and researchers could get grants for almost anything they wanted to do, not only from NSF but also from the Atomic Energy Commission, the Army, the Navy, the Air Force, the Defense Advanced Research Projects Agency (DARPA), the Department of the Interior, the Agriculture Department . . . as well as from a number of private foundations. I experienced the tail end of this bonanza after I came to the United States in the mid-1960s. Everyone was getting grants. Teachers colleges were climbing the prestige ladder to become research universities, funded by grant-getting faculty "stars": colleges just had to appoint some researchers, who would bring in the moolah, which would pay for graduate students to do the actual work; and the "overhead" or "indirect costs" associated with the grants – often on the order of 25%, with private universities sometimes even double that – allowed the institutions to establish all sorts of infrastructure and administrative structures. In the 1940s, there had been 107 PhD-granting universities in the United States; by 1978 there were more than 300.

Institutions competed with one another for faculty stars and to be ranked high among "research universities," to get their graduate programs into the 20 or so "Top Graduate Departments" – rankings that were being published at intervals for quite a range of disciplines.

Everything was being quantified, and the rankings pretty much reflected quantity, because of course that’s what you can measure "objectively": How many grants? How much money? How many papers published? How many citations to those papers? How many students? How many graduates placed where?

This quantitative explosion quickly reached the limits of possible growth. That had been predicted early on by Derek de Solla Price, historian of science and pioneer of "scientometrics" and "Science Indicators," quantitative measures of scientific and technological activity. Price had recognized that science had been growing exponentially with remarkable regularity since roughly the 17th century: the numbers of scientific journals being published, of papers being published in them, of abstracts journals established to digest the flood of research, of researchers . . . all had been doubling about every 15 years.

Soon after WWII, Price noted, expenditures on research and development (R&D) had reached about 2.5% of GDP in industrialized countries, which meant quite obviously that continued exponential growth had become literally impossible. And indeed the growth slowed, and quite dramatically by the early 1970s. I saw recently that the Obama administration expressed the ambition to bring R&D to 3% of GDP, so there’s indeed been little relative growth in the last half century.
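Price’s point is arithmetically forced, as a small sketch shows. The 15-year doubling time is the figure cited above; the 3% GDP growth rate and the 2.5% starting share are our illustrative assumptions:

```python
# Price's argument in miniature: R&D that doubles every 15 years
# inside a GDP growing ~3%/year must soon swallow the entire
# economy, so the growth has to stop. (The GDP growth rate and
# the starting share are illustrative assumptions.)
import math

RD_GROWTH  = math.log(2) / 15   # ~4.6%/yr: doubling every 15 years
GDP_GROWTH = 0.03               # assumed long-run GDP growth
share = 0.025                   # R&D share of GDP, ~post-WWII level

years = 0
while share < 1.0:              # the share obviously cannot hit 100%
    share *= math.exp(RD_GROWTH - GDP_GROWTH)
    years += 1

print(f"R&D would be 100% of GDP after ~{years} years")  # ~228
```

On these assumed rates the R&D share would pass 10% of GDP within about 85 years and reach 100% in roughly two centuries; exponential growth had to slow, just as it did.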

Now, modern science had developed a culture based on limitless growth. Huge numbers of graduates were being turned out, many with the ambition to do what their mentors had done: become entrepreneurial researchers bringing in grants wholesale and commanding a stable of students and post-docs who could churn out the research and generate a flood of publications. By the late 1960s or early 1970s, for example, to my personal knowledge, one of the leading electrochemists in the United States in one of the better universities was controlling annual expenditures of many hundreds of thousands of dollars (1970s dollars!), with several postdocs each supervising a horde of graduate students and pouring out the paper.

The change from unlimited possibilities to a culture of steady state, to science as zero-sum game, represents a genuine crisis: If one person gets a grant, some number of others don’t. The "success rate" in applications to NSF or the National Institutes of Health (NIH) is no more than 25% on average nowadays, and lower still at the not-yet-established institutions. So it would make sense for researchers to change their aims, their beliefs about what is possible, to stop counting success in terms of quantities: but they can’t do that because the institutions that employ them still count success in terms of quantity, primarily the quantity of dollars brought in. To draw again on a contemporary analogy, scientific research and the production or training of researchers expanded in bubble-like fashion following World War II; that bubble was pricked in the early 1970s and has been deflating with increasingly obvious consequences ever since.

One consequence of the bubble’s burst is that there are far too many would-be researchers and would-be research institutions chasing grants. Increasing desperation leads to corner-cutting and frank cheating. Senior researchers established in comfortable positions guard their own privileged circumstances jealously, and that means in some part not allowing their favored theories and approaches to be challenged by the Young Turks. Hence knowledge monopolies and research cartels.

A consequence of Big Science is that very few if any researchers can work as independent entrepreneurs. They belong to teams or institutions with inevitably hierarchical structures. Where independent scientists owed loyalty first and foremost to scientific truth, now employee researchers owe loyalty first to employers, grant-givers, sponsors. (For this change in ideals and mores, see John Ziman, Prometheus Bound, 1994.) Science used to be compared to religion, and scientists to monks – in the late 19th century, T. H. Huxley claimed quite seriously to be giving Lay Sermons on behalf of the Church Scientific; but today’s scientists, as already said, are more like Wall Street professionals than like monks.

Since those who pay the piper call the tune, research projects are chosen increasingly for non-scientific reasons; perhaps political ones, as when President Nixon declared war on cancer at a time when the scientific background knowledge made such a declaration substantively ludicrous and doomed to failure for the foreseeable future. With administrators in control because the enterprises are so large, bureaucrats set the rules and make the decisions. For advice, they naturally listen to the senior well-established figures, so grants go only to "mainstream" projects.

Nowadays there are conflicts of interest everywhere. Researchers benefit from individual consultancies. University faculty establish personal businesses to exploit their specialized knowledge which was gained largely at public expense. Institutional conflicts of interest are everywhere: There are university-industry collaborations; some universities have toyed with establishing their own for-profit enterprises to exploit directly the patents generated by their faculty; research universities have whole bureaucracies devoted to finding ways to make money from the university’s knowledge stock, just as the same or parallel university bureaucracies sell rights to use the university’s athletics logos. It is not at all an exaggeration to talk of an academic-government-industry complex whose prime objective is not the search for abstract scientific truth.

Widely known is that President Eisenhower had warned of the dangers of a military-industrial complex. Much less well known is that Eisenhower was just as insightful and prescient about the dangers from Big Science:

in holding scientific research and discovery in respect . . . we must also be alert to the . . . danger that public policy could itself become the captive of a scientific-technological elite

That describes in a nutshell today’s knowledge monopolies. A single theory acts as dogma once the senior, established researchers have managed to capture the cooperation of the political powers. The media take their cues also from the powers that be and from the established scientific authorities, so "no one" even knows that alternatives exist to HIV/AIDS theory and to the theory that human activities are contributing to climate change, or that the Big Bang might not have happened, or that it wasn’t an asteroid that killed the dinosaurs, and so on.

The bitter lesson is that the traditionally normal process of science, open argument and unfettered competition, can no longer be relied upon to deliver empirically arrived at, relatively objective understanding of the world’s workings. Political and social activism and public-relations efforts are needed, as public policies are increasingly determined by the actions of lobbyists backed by tremendous resources and pushing a single dogmatic approach. No collection of scientifically impeccable writings can compete against an Intergovernmental Panel on Climate Change and a Nobel Peace Prize awarded for Albert Gore’s activism and "documentary" film – and that is no prophecy, for the evidence is here already, in the thousands of well-qualified environmental scientists who have for years petitioned for an unbiased analysis of the data. No collection of scientifically impeccable writings can compete against the National Institutes of Health, the World Health Organization, UNAIDS, innumerable eminent charities like the Bill and Melinda Gates Foundation, when it comes to questions of HIV and AIDS – and again that is no prophecy, because the data have been clear for a couple of decades that HIV is not, cannot be the cause of AIDS.

As to HIV and AIDS, maybe the impetus to truth will come from politicians who insist on finding out exactly what the benefits are of the roughly $20 billion we – the United States – are spending annually under the mistaken HIV/AIDS theory. Or maybe the impetus to truth will come from African Americans, who may finally rebel against the calumny that it is their reprehensible behavior that makes them 7 to 20 times more likely to test "HIV-positive" than their white American compatriots; or perhaps from South African blacks who are alleged to be "infected" at rates as high as 30%, supposedly because they are continually engaged in "concurrent multiple sexual relationships," having multiple sexual partners at any given time but changing them every few weeks or months. Or from a court case or series of them, because of ill health caused by toxic antiretroviral drugs administered on the basis of misleading "HIV" tests; or perhaps because one or more of the "AIDS denialists" wins a libel judgment against one or more of those who call them Holocaust deniers. Maybe the impetus to truth will come from the media finally seizing on any of the above as something "news-worthy."

At any rate, the science has long been clear, and the need is for action at a political, social, public-relations, level. In this age of knowledge monopolies and research cartels, scientific truth is suppressed by the most powerful forces in society. It used to be that this sort of thing would be experienced only in Nazi Germany or the Soviet Union, but nowadays it happens in democratic societies as a result of what President Eisenhower warned against: "public policy . . . become the captive of a scientific-technological elite."

December 19, 2009


Santa ain’t got nothin on ole St. Nick

Nothing captures the commercialisation of Christmas quite as effectively as the history of Santa Claus. To illustrate the point, look at a wonderful 16th-century painting that hangs in Room 7 of the National Gallery in London. Most people who pass by it are unaware of its significance. But this panel, circa 1555-60, preserves a particularly pure element of the Christmas spirit.

The painting, attributed to the Tuscan mannerist Girolamo Macchietti, depicts the most important legend of St Nicholas of Myra.

To the right, a nobleman slumbers, surrounded by his three daughters, who are also asleep. According to tradition, the family was so poor that the father was on the brink of selling his girls into prostitution. To the left, their saviour appears at the window, dressed in a sumptuous orange tunic adorned with a red robe. Under the cover of darkness, he prepares to lob through the aperture the second of three balls of gold (each represents a purse stuffed with money) that will provide dowries for all of the daughters, so that they won’t have to sell their bodies to survive.

For Macchietti’s contemporaries, this youth would have been instantly recognisable as St Nicholas. But, today, he goes by a much more familiar name: Father Christmas.

This may come as a surprise. How can Macchietti’s Mr Goldenballs, with his gilt sandals and curly, glowing hair, be related to roly-poly, red-faced Santa Claus? For starters, he’s too thin. And beardless. He is dressed like an inhabitant of the Mediterranean, not Lapland. He is standing by the window, not peering down the chimney. And, anyway, where’s his retinue of reindeer?

But, then, this is one of the sad truths at the heart of Christmas present. These days, it isn’t only the birth of Christ that is threatened with oblivion. The charitable St Nicholas, associated with Christmas since time immemorial, is rapidly sliding towards anonymity, too. Meanwhile, the stock of his more recent incarnation as Santa Claus, the darling of department-store managers, filmmakers and advertising copy-writers the world over, continues to rise.

Canon James Rosenthal, who has earned a reputation as one of the Church’s leading authorities on St Nicholas, believes it’s time that the “real” Father Christmas is remembered.

“I always think it’s sad that people are ignorant of the origins of our customs,” he says. “Santa Claus is fine, but St Nicholas is so much better. Like us, he is real.

“I believe there is a bit of St Nicholas in all of us. For Christians, he is a model to push chubby Santa back into fairyland.”

St Nicholas’s standing is currently so low that Canon Rosenthal was recently banned from visiting children held at an immigration centre in Bedfordshire. When he arrived at Yarl’s Wood, dressed as St Nicholas, wearing a magnificent fake white beard and a bishop’s mitre, he had hoped to deliver presents donated by the congregations of several London churches. But he was turned away by security guards, who eventually called the police. “I felt like a criminal for trying to spread cheer and a few gifts,” Canon Rosenthal told me this week.

St Nick must have been a pretty impressive figure to inspire such devotion, but, in truth, few facts about him are known. He was probably born around AD 260 in the port of Patara on the southern coast of what was then Asia Minor, now Turkey. He grew up in the eastern reaches of the Roman Empire, which was still hostile to Christianity, but found himself drawn to the new religion, and rose to become the bishop of Myra, now the Lycian town of Demre. He died in Myra in 343, possibly on December 6 (the date on which he is usually venerated today).

“The first Life of St Nicholas is from the 9th century,” says Robin Cormack, professor emeritus at the Courtauld Institute of Art, and an expert on Byzantium. “Maybe St Nicholas was a bishop in the 4th century AD; all else seems fiction.”

But what fiction! Scintillating legends quickly gathered around the memory of Myra’s bishop, who acquired a reputation for generosity. Aside from rescuing the daughters of the impoverished nobleman in Macchietti’s picture, he is said to have resurrected three boys who had been killed by a psychotic butcher, who’d chopped them up, salted their remains in a barrel, and planned to sell their cured body parts as ham during a period of famine.

Over the centuries, St Nicholas evolved into the patron saint of sailors and fishermen, pawnbrokers, children, scholars, druggists – and even people being mugged.

By the 10th century, a basilica containing his relics had been built at Myra. In those days, the remains of holy figures were big business, since thousands of pilgrims flocked to shrines all over Europe. In 1087, a bunch of brigands from the Italian port of Bari on the Adriatic Sea set sail for Myra, where they looted the basilica, before returning home with the exhumed remains of St Nick. A shrine was quickly established back in Italy, and people flocked to Bari to peddle the holy “manna” which was said to drip from the saint’s bones.

The arrival of St Nicholas in Italy accounts for his popularity among Italian artists of the Renaissance. Fra Angelico and Masaccio both depicted the saint in altarpieces. Veronese painted the saint, with a white beard, in a grand canvas from 1562 that can be seen in the National Gallery.

St Nicholas was soon venerated across Europe. He was especially beloved in Holland, where, to this day, children receive gifts on the Feast of St Nicholas rather than Christmas Day. The tangerines traditionally left as gifts in the stockings of children who have been good allude to St Nicholas’s emblem – three balls of gold.

His transformation into Father Christmas occurred only after the Dutch had emigrated to North America in the 17th century. In the New World, they continued to observe the feast day of Sinterklaas, as they called St Nicholas. This dialectal quirk became “Santa Claus”.

Most of Santa Claus’s current iconography – the flowing beard, red-and-white livery, reindeer – dates from 19th-century America, where the traditions of the early Dutch settlers were fondly recalled. Clement C Moore’s poem The Night Before Christmas, published anonymously in 1823, cemented the image of Father Christmas in the popular imagination as a jolly old soul with a white beard who arrives through chimneys to deliver gifts into stockings, before riding off into the night on a sleigh laden with toys and powered by prancing reindeer.

The New York caricaturist Thomas Nast later refined our image of Father Christmas, fattening him up in a series of cartoons that appeared in Harper’s Weekly from 1863 onwards. Nast was also responsible for changing the colour of Santa’s cloak from tan or green to red, decades before the Coca-Cola advertising campaigns of the mid-20th century, featuring the Swedish-American artist Haddon Sundblom’s vision of Father Christmas swigging from a bottle of Coke. The first of these ads appeared in 1931, marking a watershed in St Nicholas’s transformation from icon of Christian self-sacrifice to the plump, friendly face of yuletide capitalism.

The question remains why it was specifically St Nicholas, rather than any other saint, who became indelibly linked with Christmas. For Canon Rosenthal, the answer is simple. “St Nicholas hit all the right chords in the hearts, minds and imaginations of the people,” he says. “He went to prison for his faith, he smashed pagan altars, he gave away his wealth, and he even restored three boys to life. Not bad for one person.”

But Prof Cormack is not so sure. “No one understands the reason for the popularity of St Nicholas,” he says. “All the stories [associated with him] are conventional saints’ stuff. He just got lucky.”