by Pat Heyman | Jan 1, 2010 | Uncategorized
German Physicists Trash Global Warming “Theory”
December 26, 2009
guest article by John O’Sullivan
For any non-scientist interested in the climate debate, there is nothing better than a ready primer to guide you through the complexities of atmospheric physics – the “hardest” science of climatology. Here we outline the essential points made by Dr. Gerhard Gerlich, a respected German physicist, that counter the bogus theory of Anthropogenic Global Warming (AGW).
Before going further, it’s worth bearing in mind that no climatologist ever completed a university course in climatology – that’s how new this branch of science really is. Like any new science, the fall-back position of a cornered AGW proponent is the dreaded “appeal to authority,” where the flustered debater, out of his or her depth, will say, “Well, professor so-and-so says it’s true – so it must be true.” Don’t fall for that proxy tree-ring counter’s gambit any longer. Here is the finest shredding of junk science you will ever read.
In a recently revised and re-published paper, Dr. Gerlich debunks AGW and shows that the IPCC “consensus” atmospheric physics model tying CO2 to global warming is not only unverifiable but actually violates basic laws of physics, namely the First and Second Laws of Thermodynamics. The latest version of this momentous scientific paper appears in the March 2009 edition of the International Journal of Modern Physics.
The central claims of Dr. Gerlich and his colleague, Dr. Ralf Tscheuschner, include, but are not limited to:
- The mechanism of warming in an actual greenhouse is different from the mechanism of warming in the atmosphere; therefore it is not a “greenhouse” effect and should be called something else.
- The climate models that predict catastrophic global warming also result in a net heat flow from atmospheric greenhouse gases to the warmer ground, which is a violation of the second law of thermodynamics.
Essentially, any machine which transfers heat from a low temperature reservoir to a high temperature reservoir without external work applied cannot exist. If it did it would be a “perpetual motion machine” – the realm of pure sci-fi.
Gerlich and Tscheuschner’s independent theoretical study is detailed in a lengthy (115 pages), mathematically complex (144 equations, 13 data tables, and 32 figures or graphs), and well-sourced (205 references) paper. The German physicists prove that even if CO2 concentrations were to double (a prospect even global warming advocates admit is decades away), the thermal conductivity of air would change by no more than 0.03%. They show that the classic concept of the glass greenhouse wholly fails to replicate the physics of Earth’s climate. They also prove that a greenhouse operates as a “closed” system while the planet works as an “open” system, and that the term “atmospheric greenhouse effect” does not occur in any fundamental work on thermodynamics, physical kinetics, or radiation theory. Throughout their paper the German scientists show how the greenhouse gas theory relies on guesstimates of the relevant physical properties to “calculate” the chaotic interplay of a myriad of unquantifiable factors – a task beyond even the most powerful modern supercomputers.
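As a rough sanity check on that thermal-conductivity figure, here is a back-of-envelope sketch using standard handbook values rather than anything taken from the paper itself: doubling the CO2 mole fraction from roughly 0.04% to 0.08% of the atmosphere (Δx ≈ 0.0004), with handbook conductivities of about 0.026 W/(m·K) for air and 0.017 W/(m·K) for CO2, a simple mole-fraction-weighted mixing estimate gives

\[
\frac{\Delta k}{k_{\text{air}}} \;\approx\; \Delta x_{\mathrm{CO_2}}\,\frac{k_{\mathrm{CO_2}} - k_{\text{air}}}{k_{\text{air}}} \;\approx\; 0.0004 \times \frac{0.017 - 0.026}{0.026} \;\approx\; -1.4\times10^{-4},
\]

that is, a change of roughly 0.014% – the same order of magnitude as the 0.03% bound quoted above. Linear mixing is only a first approximation, but it shows why the effect on conductivity is so small.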
The paper’s introduction states it neatly:
(a) there are no common physical laws between the warming phenomenon in glass houses and the fictitious atmospheric greenhouse effects, (b) there are no calculations to determine an average surface temperature of a planet, (c) the frequently mentioned difference of 33 degrees Celsius is a meaningless number calculated wrongly, (d) the formulas of cavity radiation are used inappropriately, (e) the assumption of a radiative balance is unphysical, (f) thermal conductivity and friction must not be set to zero, the atmospheric greenhouse conjecture is falsified.
This thorough debunking of the theory of man-made warming disproves the claim that carbon dioxide in the cooler upper atmosphere exerts any thermal “forcing” effect on the warmer surface below; to do so would violate both the First and Second Laws of Thermodynamics. As there is no glass roof on the earth to trap the excess heat, it escapes upward into space. Thus we may conclude that the common-sense axioms are preserved: the deeper the ocean, the colder the water; and heat rises – it does not fall. QED.
John O’Sullivan is a legal advocate and writer who for several years has litigated in government corruption and conspiracy cases in both the US and Britain. Visit his website.
by Pat Heyman | Dec 27, 2009 | Uncategorized
The Intellectual Incoherence of Conservatism
Hans-Hermann Hoppe
Modern conservatism, in the United States and Europe, is confused and distorted. Under the influence of representative democracy, and with the transformation of the U.S. and Europe into mass democracies from World War I onward, conservatism was transformed from an anti-egalitarian, aristocratic, anti-statist ideological force into a movement of culturally conservative statists: the right wing of the socialists and social democrats.
Most self-proclaimed contemporary conservatives are concerned, as they should be, about the decay of families, divorce, illegitimacy, loss of authority, multiculturalism, social disintegration, sexual libertinism, and crime. All of these phenomena they regard as anomalies and deviations from the natural order, or what we might call normalcy.
However, most contemporary conservatives (at least most of the spokesmen of the conservative establishment) either do not recognize that their goal of restoring normalcy requires the most drastic, even revolutionary, antistatist social changes, or (if they know about this) they are engaged in betraying conservatism’s cultural agenda from inside in order to promote an entirely different agenda.
That this is largely true for the so-called neoconservatives does not require further explanation here. Indeed, as far as their leaders are concerned, one suspects that most of them are of the latter kind. They are not truly concerned about cultural matters but recognize that they must play the cultural-conservatism card so as not to lose power and to promote their entirely different goal of global social democracy. The fundamentally statist character of American neoconservatism is best summarized by a statement of one of its leading intellectual champions, Irving Kristol:
“[T]he basic principle behind a conservative welfare state ought to be a simple one: wherever possible, people should be allowed to keep their own money—rather than having it transferred (via taxes to the state)—on the condition that they put it to certain defined uses.” [Two Cheers for Capitalism, New York: Basic Books, 1978, p. 119].
This view is essentially identical to that held by modern, post-Marxist European Social-Democrats. Thus, Germany’s Social Democratic Party (SPD), for instance, in its Godesberg Program of 1959, adopted as its core motto the slogan “as much market as possible, as much state as necessary.”
A second, somewhat older but nowadays almost indistinguishable branch of contemporary American conservatism is represented by the new (post World War II) conservatism launched and promoted, with the assistance of the CIA, by William Buckley and his National Review. Whereas the old (pre-World War II) American conservatism had been characterized by decidedly anti-interventionist foreign policy views, the trademark of Buckley’s new conservatism has been its rabid militarism and interventionist foreign policy.
In an article, “A Young Republican’s View,” published in Commonweal on January 25, 1952, three years before the launching of his National Review, Buckley thus summarized what would become the new conservative credo: In light of the threat posed by the Soviet Union, “we [new conservatives] have to accept Big Government for the duration—for neither an offensive nor a defensive war can be waged . . . except through the instrument of a totalitarian bureaucracy within our shores.”
Conservatives, Buckley wrote, were duty-bound to promote “the extensive and productive tax laws that are needed to support a vigorous anti-Communist foreign policy,” as well as the “large armies and air forces, atomic energy, central intelligence, war production boards and the attendant centralization of power in Washington.”
Not surprisingly, since the collapse of the Soviet Union in the late 1980s, essentially nothing in this philosophy has changed. Today, the continuation and preservation of the American welfare-warfare state is simply excused and promoted by new and neo-conservatives alike with reference to other foreign enemies and dangers: China, Islamic fundamentalism, Saddam Hussein, “rogue states,” and the threat of “global terrorism.”
However, it is also true that many conservatives are genuinely concerned about family disintegration or dysfunction and cultural decline. I am thinking here in particular of the conservatism represented by Patrick Buchanan and his movement. Buchanan’s conservatism is by no means as different from that of the conservative Republican party establishment as he and his followers fancy it to be. In one decisive respect their brand of conservatism is in full agreement with that of the conservative establishment: both are statists. They differ over what exactly needs to be done to restore normalcy to the U.S., but they agree that it must be done by the state. There is not a trace of principled antistatism in either.
Let me illustrate by quoting Samuel Francis, who was one of the leading theoreticians and strategists of the Buchananite movement. After deploring “anti-white” and “anti-Western” propaganda, “militant secularism, acquisitive egoism, economic and political globalism, demographic inundation, and unchecked state centralism,” he expounds on a new spirit of “America First,” which “implies not only putting national interests over those of other nations and abstractions like ‘world leadership,’ ‘global harmony,’ and the ‘New World Order,’ but also giving priority to the nation over the gratification of individual and subnational interests.”
How does he propose to fix the problem of moral degeneration and cultural decline? There is no recognition that the natural order in education means that the state has nothing to do with it. Education is entirely a family matter and ought to be produced and distributed in cooperative arrangements within the framework of the market economy.
Moreover, there is no recognition that moral degeneracy and cultural decline have deeper causes and cannot simply be cured by state-imposed curriculum changes or exhortations and declamations. To the contrary, Francis proposes that the cultural turn-around—the restoration of normalcy—can be achieved without a fundamental change in the structure of the modern welfare state. Indeed, Buchanan and his ideologues explicitly defend the three core institutions of the welfare state: social security, medicare, and unemployment subsidies. They even want to expand the “social” responsibilities of the state by assigning to it the task of “protecting,” by means of national import and export restrictions, American jobs, especially in industries of national concern, and “insulate the wages of U.S. workers from foreign laborers who must work for $1 an hour or less.”
In fact, Buchananites freely admit that they are statists. They detest and ridicule capitalism, laissez-faire, free markets and trade, wealth, elites, and nobility; and they advocate a new populist—indeed proletarian—conservatism which amalgamates social and cultural conservatism and socialist economics. Thus, continues Francis,
while the left could win Middle Americans through its economic measures, it lost them through its social and cultural radicalism, and while the right could attract Middle Americans through appeals to law and order and defense of sexual normality, conventional morals and religion, traditional social institutions and invocations of nationalism and patriotism, it lost Middle Americans when it rehearsed its old bourgeois economic formulas.
Hence, it is necessary to combine the economic policies of the left and the nationalism and cultural conservatism of the right, to create “a new identity synthesizing both the economic interests and cultural-national loyalties of the proletarianized middle class in a separate and unified political movement.” For obvious reasons this doctrine is not so named, but there is a term for this type of conservatism: It is called social nationalism or national socialism.
(As for most of the leaders of the so-called Christian Right and the “moral majority,” they simply desire the replacement of the current, left-liberal elite in charge of national education by another one, i.e., themselves. “From Burke on,” Robert Nisbet has criticized this posture, “it has been a conservative precept and a sociological principle since Auguste Comte that the surest way of weakening the family, or any vital social group, is for the government to assume, and then monopolize, the family’s historic functions.” In contrast, much of the contemporary American Right “is less interested in Burkean immunities from government power than it is in putting a maximum of governmental power in the hands of those who can be trusted. It is control of power, not diminution of power, that ranks high.”)
I will not concern myself here with the question of whether or not Buchanan’s conservatism has mass appeal and whether or not its diagnosis of American politics is sociologically correct. I doubt that this is the case, and certainly Buchanan’s fate during the 1996 and 2000 Republican presidential primaries does not indicate otherwise. Rather, I want to address the more fundamental questions: Assuming that it does have such appeal – that is, assuming that cultural conservatism and socialist economics can be psychologically combined, that people can hold both of these views simultaneously without cognitive dissonance – can they also be effectively and practically (economically and praxeologically) combined? Is it possible to maintain the current level of economic socialism (social security, etc.) and reach the goal of restoring cultural normalcy (natural families and normal rules of conduct)?
Buchanan and his theoreticians do not feel the need to raise this question, because they believe politics to be solely a matter of will and power. They do not believe in such things as economic laws. If people want something enough, and they are given the power to implement their will, everything can be achieved. The “dead Austrian economist” Ludwig von Mises, to whom Buchanan referred contemptuously during his presidential campaigns, characterized this belief as “historicism,” the intellectual posture of the German Kathedersozialisten, the academic Socialists of the Chair, who justified any and all statist measures.
But historicist contempt for and ignorance of economics do not alter the fact that inexorable economic laws exist. You cannot have your cake and eat it too, for instance. Or what you consume now cannot be consumed again in the future. Or producing more of one good requires producing less of another. No wishful thinking can make such laws go away. To believe otherwise can only result in practical failure. “In fact,” noted Mises, “economic history is a long record of government policies that failed because they were designed with a bold disregard for the laws of economics.”
In light of elementary and immutable economic laws, the Buchananite program of social nationalism is just another bold but impossible dream. No wishful thinking can alter the fact that maintaining the core institutions of the present welfare state and wanting to return to traditional families, norms, conduct, and culture are incompatible goals. You can have one—socialism (welfare)—or the other—traditional morals—but you cannot have both, for social nationalist economics, the pillar of the current welfare state system Buchanan wants to leave untouched, is the very cause of cultural and social anomalies.
In order to clarify this, it is only necessary to recall one of the most fundamental laws of economics which says that all compulsory wealth or income redistribution, regardless of the criteria on which it is based, involves taking from some—the havers of something—and giving it to others—the non-havers of something. Accordingly, the incentive to be a haver is reduced, and the incentive to be a non-haver increased. What the haver has is characteristically something considered “good,” and what the non-haver does not have is something “bad” or a deficiency. Indeed, this is the very idea underlying any redistribution: some have too much good stuff and others not enough. The result of every redistribution is that one will thereby produce less good and increasingly more bad, less perfection and more deficiencies. By subsidizing with tax funds (with funds taken from others) people who are poor, more poverty (bad) will be created. By subsidizing people because they are unemployed, more unemployment (bad) will be created. By subsidizing unwed mothers, there will be more unwed mothers and more illegitimate births (bad), etc.
Obviously, this basic insight applies to the entire system of so-called social security that has been implemented in Western Europe (from the 1880s onward) and the U.S. (since the 1930s): of compulsory government “insurance” against old age, illness, occupational injury, unemployment, indigence, etc. In conjunction with the even older compulsory system of public education, these institutions and practices amount to a massive attack on the institution of the family and personal responsibility.
By relieving individuals of the obligation to provide for their own income, health, safety, old age, and children’s education, the range and temporal horizon of private provision is reduced, and the value of marriage, family, children, and kinship relations is lowered. Irresponsibility, shortsightedness, negligence, illness and even destructionism (bads) are promoted, and responsibility, farsightedness, diligence, health and conservatism (goods) are punished.
The compulsory old age insurance system in particular, by which retirees (the old) are subsidized from taxes imposed on current income earners (the young), has systematically weakened the natural intergenerational bond between parents, grandparents, and children. The old need no longer rely on the assistance of their children if they have made no provision for their own old age; and the young (with typically less accumulated wealth) must support the old (with typically more accumulated wealth) rather than the other way around, as is typical within families.
Consequently, not only do people want to have fewer children – indeed, birthrates have fallen by half since the onset of modern social security (welfare) policies – but the respect which the young traditionally accorded to their elders has diminished, and all indicators of family disintegration and malfunctioning, such as rates of divorce, illegitimacy, child abuse, parent abuse, spouse abuse, single parenting, singledom, alternative lifestyles, and abortion, have increased.
Moreover, with the socialization of the health care system through institutions such as Medicaid and Medicare, and with the regulation of the insurance industry (which restricts an insurer’s right of refusal: the right to exclude any individual risk as uninsurable and to discriminate freely, according to actuarial methods, between different group risks), a monstrous machinery of wealth and income redistribution has been set in motion, at the expense of responsible individuals and low-risk groups and in favor of irresponsible actors and high-risk groups. Subsidies for the ill, unhealthy, and disabled breed illness, disease, and disability and weaken the desire to work for a living and to lead healthy lives. One can do no better than quote the “dead Austrian economist” Ludwig von Mises once more:
being ill is not a phenomenon independent of conscious will. . . . A man’s efficiency is not merely a result of his physical condition; it depends largely on his mind and will. . . . The destructionist aspect of accident and health insurance lies above all in the fact that such institutions promote accident and illness, hinder recovery, and very often create, or at any rate intensify and lengthen, the functional disorders which follow illness or accident. . . . To feel healthy is quite different from being healthy in the medical sense. . . . By weakening or completely destroying the will to be well and able to work, social insurance creates illness and inability to work; it produces the habit of complaining—which is in itself a neurosis—and neuroses of other kinds. . . . As a social institution it makes a people sick bodily and mentally or at least helps to multiply, lengthen, and intensify disease. . . . Social insurance has thus made the neurosis of the insured a dangerous public disease. Should the institution be extended and developed the disease will spread. No reform can be of any assistance. We cannot weaken or destroy the will to health without producing illness.
I do not wish to explain here the economic nonsense of Buchanan’s and his theoreticians’ even further-reaching idea of protectionist policies (of protecting American wages). If they were right, their argument in favor of economic protection would amount to an indictment of all trade and a defense of the thesis that each family would be better off if it never traded with anyone else. Certainly, in this case no one could ever lose his job, and unemployment due to “unfair” competition would be reduced to zero.
Yet such a full-employment society would not be prosperous and strong; it would be composed of people (families) who, despite working from dawn to dusk, would be condemned to poverty and starvation. Buchanan’s international protectionism, while less destructive than a policy of interpersonal or interregional protectionism, would result in precisely the same effect. This is not conservatism (conservatives want families to be prosperous and strong). This is economic destructionism.
In any case, what should be clear by now is that most if not all of the moral degeneration and cultural decline—the signs of decivilization—all around us are the inescapable and unavoidable results of the welfare state and its core institutions. Classical, old-style conservatives knew this, and they vigorously opposed public education and social security. They knew that states everywhere were intent upon breaking down and ultimately destroying families and the institutions and layers and hierarchies of authority that are the natural outgrowth of family based communities in order to increase and strengthen their own power. They knew that in order to do so states would have to take advantage of the natural rebellion of the adolescent (juvenile) against parental authority. And they knew that socialized education and socialized responsibility were the means of bringing about this goal.
Social education and social security provide an opening for the rebellious youth to escape parental authority (to get away with continuous misbehavior). Old conservatives knew that these policies would emancipate the individual from the discipline imposed by family and community life only to subject him instead to the direct and immediate control of the state.
Furthermore, they knew, or at least had a hunch, that this would lead to a systematic infantilization of society—a regression, emotionally and mentally, from adulthood to adolescence or childhood.
In contrast, Buchanan’s populist-proletarian conservatism—social nationalism—shows complete ignorance of all of this. Combining cultural conservatism and welfare-statism is impossible, and hence, economic nonsense. Welfare-statism—social security in any way, shape or form—breeds moral and cultural decline and degeneration. Thus, if one is indeed concerned about America’s moral decay and wants to restore normalcy to society and culture, one must oppose all aspects of the modern social-welfare state. A return to normalcy requires no less than the complete elimination of the present social security system: of unemployment insurance, social security, Medicare, Medicaid, public education, etc.—and thus the near complete dissolution and deconstruction of the current state apparatus and government power. If one is ever to restore normalcy, government funds and power must dwindle to or even fall below their nineteenth century levels. Hence, true conservatives must be hard-line libertarians (antistatists). Buchanan’s conservatism is false: it wants a return to traditional morality but at the same time advocates keeping the very institutions in place that are responsible for the destruction of traditional morals.
Most contemporary conservatives, then, especially among the media darlings, are not conservatives but socialists—either of the internationalist sort (the new and neoconservative welfare-warfare statists and global social democrats) or of the nationalist variety (the Buchananite populists). Genuine conservatives must be opposed to both. In order to restore social and cultural norms, true conservatives can only be radical libertarians, and they must demand the demolition—as a moral and economic distortion—of the entire structure of the interventionist state.
Hans-Hermann Hoppe is professor of economics at the University of Nevada, Las Vegas. Read and sign the Hoppe Victory Blog. This essay is based on a chapter from Democracy, The God that Failed (2001) that was given as a speech in 1996.
On contemporary American conservatism see in particular Paul Gottfried, The Conservative Movement, rev. ed. (New York: Twayne Publishers, 1993); George H. Nash, The Conservative Intellectual Movement in America (New York: Basic Books, 1976); Justin Raimondo, Reclaiming the American Right: The Lost Legacy of the Conservative Movement (Burlingame, Calif.: Center for Libertarian Studies, 1993); see further also chap. 11.
Samuel T. Francis, “From Household to Nation: The Middle American Populism of Pat Buchanan,” Chronicles (March 1996): 12–16; see also idem, Beautiful Losers: Essays on the Failure of American Conservatism (Columbia: University of Missouri Press, 1993); idem, Revolution from the Middle (Raleigh, N.C.: Middle American Press, 1997).
Ludwig von Mises, Human Action: A Treatise on Economics, Scholar’s Edition (Auburn, Ala.: Ludwig von Mises Institute, 1998), p. 67. “Princes and democratic majorities,” writes Mises, “are drunk with power. They must reluctantly admit that they are subject to the laws of nature. But they reject the very notion of economic law. Are they not the supreme legislators? Don’t they have the power to crush every opponent? No war lord is prone to acknowledge any limits other than those imposed on him by a superior armed force. Servile scribblers are always ready to foster such complacency by expounding the appropriate doctrines. They call their garbled presumptions ‘historical economics.’”
Ludwig von Mises, Socialism: An Economic and Sociological Analysis (Indianapolis, Ind.: Liberty Fund, 1981), pp. 431–32.
by Pat Heyman | Dec 25, 2009 | Uncategorized
Suppression of Science Within Science
by Henry Bauer
I wasn’t as surprised as many others were, when it was revealed that climate-change "researchers" had discussed in private e-mails how to keep important data from public view lest it shake public belief in the dogma that human activities are contributing significantly to global warming.
I wasn’t particularly surprised because just a few weeks earlier I had spoken at the Oakland Rethinking AIDS Conference about the dogmatism and strong-arm tactics that are rampant in a seemingly increasing range of fields of medicine and science. PowerPoint presentations of most of the talks at the Conference are available at the Conference website. Here’s a slightly modified, more readable, text version of my own talk. The theme in a nutshell:
For several centuries, modern science was pretty much a free intellectual market populated by independent entrepreneurs who shared the goal of understanding how the world works. Nowadays it’s a corporate enterprise where patents, pay-offs, prestige, and power take priority over getting at the scientific truth, and the powers-that-be have established knowledge monopolies.
I had met Peter Duesberg in person only at the Conference, but I had been quite familiar with him from many videos. What had always stuck in my mind was his expression of surprise, astonishment, sheer disbelief, as he told what happened to him after he questioned whether HIV could be the cause of AIDS:
I had all the students I wanted . . . lab space . . . grants . . . . elected to the National Academy. . . . became California Scientist of the Year. All my papers were published. I could do no wrong . . . professionally . . . until I started questioning . . . that HIV is the cause of AIDS. Then everything changed.
What happened then was that he got no more grants; his manuscripts were rejected without substantive critiques, just that "everyone knows that HIV causes AIDS"; Robert Gallo, who earlier had talked of Duesberg’s distinction as a leading retrovirologist, now publicly called him dishonest on scientific matters. Defenders of the mainstream view have even held Duesberg responsible for the deaths of hundreds of thousands of South Africans and have described him as the moral equivalent of a Holocaust denier.
What had Duesberg done to bring about that radical change?
Absolutely nothing. He was doing science just as before: gathering data, documenting his sources, making his analyses, presenting his conclusions for comment by others. Of course Duesberg was surprised that suddenly he had gone from lauded leading scientist to discredited crackpot.
Of course Duesberg was surprised, because his experience of suddenly being sent beyond the pale was obviously an aberration. Science isn’t like this. Science is done by the objective self-correcting scientific method. Peer review is impersonal and impartial. Arguments are substantive, not ad hominem. This experience must be unprecedented, unique.
Or, perhaps, shared just by other AIDS Rethinkers, because questioning that HIV causes AIDS is just too outrageous, and quite justifiably it puts AIDS "denialists" outside the norms of scientific behavior and discourse. You wouldn’t find anything like this in other, more normal fields of medicine or science.
Well, actually, you would. You do. Duesberg and AIDS Rethinkers are not alone in this. Duesberg’s experience is not unique, it’s even far from unique.
For example, there’s The Skeptical Environmentalist (Cambridge University Press, 2001), in which Bjørn Lomborg discussed global warming and pointed out, with documentation from more than 500 mainstream source references, that Kyoto-type policies would not reduce warming enough to avoid such major consequences as sea-level rises. Therefore, he argued, it makes more sense to devise the adaptations that will be needed in any case – a much better investment than trying to reduce global CO2 emissions.
A rather unremarkable economic argument based solidly on calculations from mainstream data.
So Lomborg was surely just as surprised, astonished, and disbelieving as Duesberg had been to find that his scholarly discussion placed him beyond the pale of civilized scientific discourse. The Chair of the Intergovernmental Panel on Climate Change asked, Where is the difference between Lomborg’s view on humans and Hitler’s? An Australian columnist agreed: Perhaps there is a case for making climate change denial an offence – it is a crime against humanity, after all. An American environmentalist seconded the notion, writing that there should be "war crimes trials for these bastards – some sort of climate Nuremberg."
Of course those comments were not made in the scientific literature, which doesn’t countenance that sort of character assassination. Or so one might hope. Hope in vain, it turns out, because a book review in Nature (414: 149-50) held that Lomborg’s text employs the strategy of those who . . . argue that gay men aren’t dying of AIDS, that Jews weren’t singled out by the Nazis for extermination. . . .
So global-warming denialism is as much beyond the pale as AIDS denialism. Except that – and perhaps you’ve noticed – Duesberg has never denied that AIDS exists; he just has a different explanation for what caused it. And Lomborg doesn’t deny that global warming is occurring; he doesn’t even question that human activities are contributing significantly to it; he is just making a cost-benefit argument.
Of course, both HIV/AIDS and global warming are matters that involve not just science but public policy and large public expenditures. You wouldn’t find anything like this in a pure science like astronomy or cosmology, would you?
Yes, you would. Yes, you do.
Take cosmology and the Big-Bang theory of the origin of the universe. Halton Arp was a respected, senior American observational astronomer. He noticed that some pairs of quasars that are physically close together nevertheless have very different redshifts. How exciting! Evidently some redshifts are not Doppler effects, in other words, not owing to rapid relative motion away from us. That means the universe-expansion calculations have to be revised. It may not have started as a Big Bang!
That’s just the sort of major potential discovery that scientists are always hoping for, isn’t it?
Certainly not in this case. Arp was granted no more telescope time to continue his observations. At age 56, Halton Arp emigrated to Germany to continue his work at the Max Planck Institute for Astrophysics.
But Arp was not alone in his views. Thirty-four senior astronomers from 10 countries, including such stellar figures as Hermann Bondi, Thomas Gold, Amitabha Ghosh, and Jayant Narlikar, sent a letter to Nature pointing out that Big Bang theory
- relies on a growing number of hypothetical . . . things . . . never observed;
- that alternative theories can also explain all the basic phenomena of the cosmos
- and yet virtually all financial and experimental resources in cosmology go to Big-Bang studies.
Just the sort of discussion that goes on in science all the time, arguing pros and cons of competing ideas.
Except that Nature refused to publish the letter.
It was posted on the Internet, and by now hundreds of additional signatures have been added – just like what happened with the letter the Group for Rethinking AIDS had sent to Nature, Science, the Lancet, and the New England Journal of Medicine, all of which had refused to publish it.
At a mainstream conference on "Outstanding questions for the standard cosmological model," there was not even a mention of the stunningly outstanding question of those anomalous redshifts. So the non-Big-Bang cosmologists organized their own separate meeting – again, like AIDS Rethinkers, or like those who question the mainstream dogma about how to cope with global warming.
For some reason, non-Big-Bang cosmology is as much beyond the pale as AIDS "denial" – which isn’t denial – or global warming "denial" – which isn’t denial.
Then there’s that most abstract of fundamental sciences, theoretical physics. The problem has long been, How to unify relativity and quantum mechanics? Quantum mechanics regards the world as made up of discrete bits whereas relativity regards the world as governed by continuous, not discrete, fields. Since the mid-1970s, there has been no real progress. Everyone has been working on so-called "string theory," which has delivered no testable conclusions and remains a hope, a speculation, not a real theory. Nevertheless, theoretical physicists who want to look at other approaches can’t find jobs, can’t get grants, can’t get published. (Read Lee Smolin, The Trouble with Physics.)
You begin to wonder, don’t you, how many other cases there could be in science, where a single theory has somehow captured all the resources? And where competent scientists who want to try something different are not only blocked but personally insulted?
Well, there’s the matter of what killed off the dinosaurs. Everyone knows that the dinosaurs were killed off 65 million years ago when an asteroid hit the Earth. Everyone knows that, that is, except the paleontologists, whose specialty this sort of question is supposed to be.
The asteroid theory had been developed by Luis Alvarez, Nobel Laureate in physics, and his son Walter, a geologist. Paleontologist Dewey McLean had earlier developed a detailed theory based on volcanism – it had long been known that tremendous volcanic activity, the "Deccan Traps," had occurred at the relevant time.
Do you think Alvarez engaged McLean in civilized, substantive discussion?
Or would you be surprised to hear that at a conference, Alvarez said to McLean in private: "I’ll wreck your career if you persist." And Alvarez did indeed contact McLean’s university and tried to block McLean’s promotion – I know that for sure because I was Dean of Dewey McLean’s College at the time.
Of course, there’s always been resistance to change in science, as in other human activities. But this degree of suppression of minority views and the use of gutter language and character assassination makes it seem like a new phenomenon. At least it has seemed so to the people who have found themselves suddenly ejected from mainstream discourse and resources.
Arp, Duesberg, Lomborg, McLean and other "denialists" of various mainstream theories are surprised because it isn’t supposed to be like that in science. Lomborg doesn’t know that "AIDS denialists" are treated rather like "global warming denialists." Arp doesn’t know that AIDS and global warming "denialists" have it even worse than those who question the Big Bang. McLean doesn’t know that "denialists" about AIDS, Big-Bang, and global warming also have their careers threatened. Everyone who experiences personally this sort of thing imagines it’s a unique experience, because science isn’t supposed to be like this.
But science nowadays IS like this: Disagree with the conventional contemporary scientific wisdom and you won’t get grants, won’t get published, will be compared to Holocaust deniers.
And it really wasn’t always this way. Nowadays "science," "pure research," has become cutthroat in the extreme, and there’s much corner-cutting and sheer dishonesty in science. For example, NIH newsletters routinely name specific individuals who are being barred from seeking grants for some specified period because of some act of dishonesty.
There was no need, in the good not-so-old days, for a federal Office of Research Integrity – a designation that George Orwell would have relished. But now we do have such an Office, and at colleges there are Centers for Research Ethics, and publishers put out journals like Accountability in Research – there’s a burgeoning young academic industry devoted to telling scientists how to behave properly.
That’s what science has come to. Genuine science, the search for better understanding, has been hijacked by self-interest and vested interests and is now captive to knowledge monopolies and research cartels: A single theory exerts dogmatic control over grants, publications, jobs, promotions.
WHY?? How did this happen?
In a follow-up piece, I’ll describe how we arrived at this New World Order in Science.
December 17, 2009
Henry H. Bauer [send him mail] is Dean Emeritus of Arts & Sciences and Professor Emeritus of Chemistry & Science Studies at Virginia Tech. His books about science and scientific unorthodoxies include Scientific Literacy and the Myth of the Scientific Method (1992), Science or Pseudoscience (2001), and The Origin, Persistence and Failings of HIV/AIDS Theory (2007). He currently writes an HIV Skepticism blog.
Copyright © 2009 by LewRockwell.com. Permission to reprint in whole or in part is gladly granted, provided full credit is given.
by Pat Heyman | Dec 25, 2009 | Uncategorized
The New World Order in Science
by Henry Bauer
I’m going to sketch a chronology and analysis that draw on the history of several centuries of science and on many volumes written about that. In being concise, I’ll make some very sweeping generalizations without acknowledging necessary exceptions or nuances. But the basic story is solidly in the mainstream of history of science, philosophy of science, sociology of science, and the like, what’s nowadays called "science & technology studies" (STS).
It never was really true, of course, as the conventional wisdom tends even now to imagine, that "the scientific method" guarantees objectivity, that scientists work impersonally to discover truth, that scientists are notably smarter, more trustworthy, more honest, so tied up in their work that they neglect everything else, don’t care about making money . . . But it is true that for centuries scientists weren’t subject to multiple and powerful conflicts of interest. There is no "scientific method." Science is done by people; people aren’t objective. Scientists are just like other professionals – to use a telling contemporary parallel, scientists are professionals just like the wheelers and dealers on Wall Street: not exactly dishonest, but looking out first and foremost for Number One.
"Modern" science dates roughly from the 17th century. It was driven by the sheer curiosity of lay amateurs and the God-worshipping curiosity of churchmen; there was little or no conflict of interest with plain truth-seeking. The truth-seekers formed voluntary associations: academies like the Royal Society of London. Those began to publish what happened at their meetings, and some of those Proceedings and Transactions have continued publication to the present day. These meetings and publications were the first informal steps to contemporary "peer review."
During the 19th century, “scientist” became a profession; one could make a living at it. Research universities were founded, and with that came the inevitable conflict of interest between truth-seeking and career-making, especially since science gained a very high status and one could become famous through success in science. (An excellent account is by David Knight in The Age of Science.)
Still it was pretty much an intellectual free market, in which the entrepreneurs could be highly independent because almost all science was quite inexpensive and there were a multitude of potential patrons and sponsors, circumstances that made for genuine intellectual competition.
The portentous change to "Big Science" really got going in mid-20th century. Iconic of the new circumstances remains the Manhattan Project to produce atomic bombs. Its dramatic success strengthened the popular faith that "science" can do anything, and very quickly, given enough resources. More than half a century later, people still talk about having a "Manhattan Project" to stop global warming, eradicate cancer, whatever.
So shortly after World War II, the National Science Foundation (NSF) was established, and researchers could get grants for almost anything they wanted to do, not only from NSF but also from the Atomic Energy Commission, the Army, the Navy, the Air Force, the Defense Advanced Research Projects Agency (DARPA), the Department of the Interior, the Agriculture Department . . . as well as from a number of private foundations. I experienced the tail end of this bonanza after I came to the United States in the mid-1960s. Everyone was getting grants. Teachers colleges were climbing the prestige ladder to become research universities, funded by grant-getting faculty "stars": colleges just had to appoint some researchers, those would bring in the moolah, that would pay for graduate students to do the actual work, and the "overhead" or "indirect costs" associated with the grants – often on the order of 25%, with private universities sometimes even double that – allowed the institutions to establish all sorts of infrastructure and administrative structures. In the 1940s, there had been 107 PhD-granting universities in the United States; by 1978 there were more than 300.
Institutions competed with one another for faculty stars and to be ranked high among "research universities," to get their graduate programs into the 20 or so "Top Graduate Departments" – rankings that were being published at intervals for quite a range of disciplines.
Everything was being quantified, and the rankings pretty much reflected quantity, because of course that’s what you can measure "objectively": How many grants? How much money? How many papers published? How many citations to those papers? How many students? How many graduates placed where?
This quantitative explosion quickly reached the limits of possible growth. That had been predicted early on by Derek de Solla Price, historian of science and pioneer of "scientometrics" and "Science Indicators," quantitative measures of scientific and technological activity. Price had recognized that science had been growing exponentially with remarkable regularity since roughly the 17th century: doubling about every 15 years had been the numbers of scientific journals being published, the numbers of papers being published in them, the numbers of abstracts journals established to digest the flood of research, the numbers of researchers . . . .
Soon after WWII, Price noted, expenditures on research and development (R&D) had reached about 2.5% of GDP in industrialized countries, which meant quite obviously that continued exponential growth had become literally impossible. And indeed the growth slowed, and quite dramatically by the early 1970s. I saw recently that the Obama administration expressed the ambition to bring R&D to 3% of GDP, so there’s indeed been little relative growth in the last half century.
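To see why that 2.5% figure marks a ceiling, here is a simple arithmetic sketch based on the doubling period quoted above (not Price’s own calculation): if R&D spending kept doubling every 15 years from 2.5% of GDP, then

\[
0.025 \times 2^{t/15} = 1 \quad\Longrightarrow\quad t = 15\,\log_2(40) \approx 80 \text{ years},
\]

that is, the trend would swallow the entire GDP within roughly eighty years, so exponential growth had to flatten out well before then – which is exactly what happened from the early 1970s.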
Now, modern science had developed a culture based on limitless growth. Huge numbers of graduates were being turned out, many with the ambition to do what their mentors had done: become entrepreneurial researchers bringing in grants wholesale and commanding a stable of students and post-docs who could churn out the research and generate a flood of publications. By the late 1960s or early 1970s, for example, to my personal knowledge, one of the leading electrochemists in the United States in one of the better universities was controlling annual expenditures of many hundreds of thousands of dollars (1970s dollars!), with several postdocs each supervising a horde of graduate students and pouring out the paper.
The change from unlimited possibilities to a culture of steady state, to science as zero-sum game, represents a genuine crisis: If one person gets a grant, some number of others don’t. The "success rate" in applications to NSF or the National Institutes of Health (NIH) is no more than 25% on average nowadays, less so among the not-yet-established institutions. So it would make sense for researchers to change their aims, their beliefs about what is possible, to stop counting success in terms of quantities: but they can’t do that because the institutions that employ them still count success in terms of quantity, primarily the quantity of dollars brought in. To draw again on a contemporary analogy, scientific research and the production or training of researchers expanded in bubble-like fashion following World War II; that bubble was pricked in the early 1970s and has been deflating with increasingly obvious consequences ever since.
One consequence of the bubble’s burst is that there are far too many would-be researchers and would-be research institutions chasing grants. Increasing desperation leads to corner-cutting and frank cheating. Senior researchers established in comfortable positions guard their own privileged circumstances jealously, and that means in some part not allowing their favored theories and approaches to be challenged by the Young Turks. Hence knowledge monopolies and research cartels.
A consequence of Big Science is that very few if any researchers can work as independent entrepreneurs. They belong to teams or institutions with inevitably hierarchical structures. Where independent scientists owed loyalty first and foremost to scientific truth, employee researchers now owe loyalty first to employers, grant-givers, and sponsors. (For this change in ideals and mores, see John Ziman, Prometheus Bound, 1994.) Science used to be compared to religion, and scientists to monks – in the late 19th century, T. H. Huxley claimed quite seriously to be giving Lay Sermons on behalf of the Church Scientific; but today’s scientists, as already said, are more like Wall Street professionals than like monks.
Since those who pay the piper call the tune, research projects are chosen increasingly for non-scientific reasons; perhaps political ones, as when President Nixon declared war on cancer at a time when the scientific background knowledge made such a declaration substantively ludicrous and doomed to failure for the foreseeable future. With administrators in control because the enterprises are so large, bureaucrats set the rules and make the decisions. For advice, they naturally listen to the senior well-established figures, so grants go only to "mainstream" projects.
Nowadays there are conflicts of interest everywhere. Researchers benefit from individual consultancies. University faculty establish personal businesses to exploit their specialized knowledge which was gained largely at public expense. Institutional conflicts of interest are everywhere: There are university-industry collaborations; some universities have toyed with establishing their own for-profit enterprises to exploit directly the patents generated by their faculty; research universities have whole bureaucracies devoted to finding ways to make money from the university’s knowledge stock, just as the same or parallel university bureaucracies sell rights to use the university’s athletics logos. It is not at all an exaggeration to talk of an academic-government-industry complex whose prime objective is not the search for abstract scientific truth.
Widely known is that President Eisenhower had warned of the dangers of a military-industrial complex. Much less well known is that Eisenhower was just as insightful and prescient about the dangers from Big Science:
in holding scientific research and discovery in respect . . . we must also be alert to the . . . danger that public policy could itself become the captive of a scientific-technological elite
That describes in a nutshell today’s knowledge monopolies. A single theory acts as dogma once the senior, established researchers have managed to capture the cooperation of the political powers. The media likewise take their cues from the powers that be and from the established scientific authorities, so "no one" even knows that alternatives exist to HIV/AIDS theory or to the theory that human activities are contributing to climate change, that the Big Bang might not have happened, that it might not have been an asteroid that killed the dinosaurs, and so on.
The bitter lesson is that the traditionally normal process of science, open argument and unfettered competition, can no longer be relied upon to deliver empirically arrived at, relatively objective understanding of the world’s workings. Political and social activism and public-relations efforts are needed, as public policies are increasingly determined by the actions of lobbyists backed by tremendous resources and pushing a single dogmatic approach. No collection of scientifically impeccable writings can compete against an Intergovernmental Panel on Climate Change and a Nobel Peace Prize awarded for Albert Gore’s activism and "documentary" film – and that is no prophecy, for the evidence is here already, in the thousands of well-qualified environmental scientists who have for years petitioned for an unbiased analysis of the data. No collection of scientifically impeccable writings can compete against the National Institutes of Health, the World Health Organization, UNAIDS, or innumerable eminent charities like the Bill and Melinda Gates Foundation when it comes to questions of HIV and AIDS – and again that is no prophecy, because the data have been clear for a couple of decades that HIV is not, and cannot be, the cause of AIDS.
As to HIV and AIDS, maybe the impetus to truth will come from politicians who insist on finding out exactly what the benefits are of the roughly $20 billion we – the United States – are spending annually under the mistaken HIV/AIDS theory. Or maybe it will come from African Americans, who may finally rebel against the calumny that it is their reprehensible behavior that makes them 7 to 20 times more likely to test "HIV-positive" than their white American compatriots; or perhaps from South African blacks, who are alleged to be "infected" at rates as high as 30%, supposedly because they are continually engaged in "concurrent multiple sexual relationships," having multiple sexual partners at any given time but changing them every few weeks or months. Or it may come from a court case, or a series of them, over ill health caused by toxic antiretroviral drugs administered on the basis of misleading "HIV" tests; or because one or more of the "AIDS denialists" wins a libel judgment against one or more of those who call them Holocaust deniers. Or maybe the impetus to truth will come from the media finally seizing on any of the above as something "news-worthy."
At any rate, the science has long been clear, and the need is for action at a political, social, public-relations, level. In this age of knowledge monopolies and research cartels, scientific truth is suppressed by the most powerful forces in society. It used to be that this sort of thing would be experienced only in Nazi Germany or the Soviet Union, but nowadays it happens in democratic societies as a result of what President Eisenhower warned against: "public policy . . . become the captive of a scientific-technological elite."
December 19, 2009
by Pat Heyman | Dec 24, 2009 | Uncategorized
Nothing captures the commercialisation of Christmas quite as effectively as the history of Santa Claus. To illustrate the point, look at a wonderful 16th-century painting that hangs in Room 7 of the National Gallery in London. Most people who pass by it are unaware of its significance. But this panel, circa 1555-60, preserves a particularly pure element of the Christmas spirit.
The painting, attributed to the Tuscan mannerist Girolamo Macchietti, depicts the most important legend of St Nicholas of Myra.
To the right, a nobleman slumbers, surrounded by his three daughters, who are also asleep. According to tradition, the family was so poor that the father was on the brink of selling his girls into prostitution. To the left, their saviour appears at the window, dressed in a sumptuous orange tunic adorned with a red robe. Under the cover of darkness, he prepares to lob through the aperture the second of three balls of gold (each represents a purse stuffed with money) that will provide dowries for all of the daughters, so that they won’t have to sell their bodies to survive.
For Macchietti’s contemporaries, this youth would have been instantly recognisable as St Nicholas. But, today, he goes by a much more familiar name: Father Christmas.
This may come as a surprise. How can Macchietti’s Mr Goldenballs, with his gilt sandals and curly, glowing hair, be related to roly-poly, red-faced Santa Claus? For starters, he’s too thin. And beardless. He is dressed like an inhabitant of the Mediterranean, not Lapland. He is standing by the window, not peering down the chimney. And, anyway, where’s his retinue of reindeer?
But, then, this is one of the sad truths at the heart of Christmas present. These days, it isn’t only the birth of Christ that is threatened with oblivion. The charitable St Nicholas, associated with Christmas since time immemorial, is rapidly sliding towards anonymity, too. Meanwhile, the stock of his more recent incarnation as Santa Claus, the darling of department-store managers, filmmakers and advertising copy-writers the world over, continues to rise.
Canon James Rosenthal, who has earned a reputation as one of the Church’s leading authorities on St Nicholas, believes it’s time that the “real” Father Christmas is remembered.
“I always think it’s sad that people are ignorant of the origins of our customs,” he says. “Santa Claus is fine, but St Nicholas is so much better. Like us, he is real.
“I believe there is a bit of St Nicholas in all of us. For Christians, he is a model to push chubby Santa back into fairyland.”
St Nicholas’s standing is currently so low that Canon Rosenthal was recently banned from visiting children held at an immigration centre in Bedfordshire. When he arrived at Yarl’s Wood, dressed as St Nicholas, wearing a magnificent fake white beard and a bishop’s mitre, he had hoped to deliver presents donated by the congregations of several London churches. But he was turned away by security guards, who eventually called the police. “I felt like a criminal for trying to spread cheer and a few gifts,” Canon Rosenthal told me this week.
St Nick must have been a pretty impressive figure to inspire such devotion, but, in truth, few facts about him are known. He was probably born around AD 260 in the port of Patara on the southern coast of what was then Asia Minor, now Turkey. He grew up in the eastern reaches of the Roman Empire, which was still hostile to Christianity, but found himself drawn to the new religion, and rose to become the bishop of Myra, now the Lycian town of Demre. He died in Myra in 343, possibly on December 6 (the date on which he is usually venerated today).
“The first Life of St Nicholas is from the 9th century,” says Robin Cormack, professor emeritus of the Courtauld Institute of Art, and an expert on Byzantium. “Maybe St Nicholas was a bishop in the 4th century AD; all else seems fiction.”
But what fiction! Scintillating legends quickly gathered around the memory of Myra’s bishop, who acquired a reputation for generosity. Aside from rescuing the daughters of the impoverished nobleman in Macchietti’s picture, he is said to have resurrected three boys who had been killed by a psychotic butcher, who’d chopped them up, salted their remains in a barrel, and planned to sell their cured body parts as ham during a period of famine.
Over the centuries, St Nicholas evolved into the patron saint of sailors and fishermen, pawnbrokers, children, scholars, druggists – and even people being mugged.
By the 10th century, a basilica containing his relics had been built at Myra. In those days, the remains of holy figures were big business, since thousands of pilgrims flocked to shrines all over Europe. In 1087, a bunch of brigands from the Italian port of Bari on the Adriatic Sea set sail for Myra, where they looted the basilica, before returning home with the exhumed remains of St Nick. A shrine was quickly established back in Italy, and pilgrims flocked to Bari for the holy “manna” that was said to drip from the saint’s bones.
The arrival of St Nicholas in Italy accounts for his popularity among Italian artists of the Renaissance. Fra Angelico and Masaccio both depicted the saint in altarpieces. Veronese painted the saint, with a white beard, in a grand canvas from 1562 that can be seen in the National Gallery.
St Nicholas was soon venerated across Europe. He was especially beloved in Holland, where, to this day, children receive gifts on the Feast of St Nicholas rather than Christmas Day. The tangerines traditionally left as gifts in the stockings of children who have been good allude to St Nicholas’s emblem – three balls of gold.
His transformation into Father Christmas only occurred after the Dutch had emigrated to North America in the 17th century. In the New World, they continued to observe the feast day of Sinterklaas, as they called St Nicholas. This dialectal quirk became “Santa Claus”.
Most of Santa Claus’s current iconography – the flowing beard, red-and-white livery, reindeer – dates from 19th-century America, where the traditions of the early Dutch settlers were fondly recalled. Clement C Moore’s poem The Night Before Christmas, published anonymously in 1823, cemented the image of Father Christmas in the popular imagination as a jolly old soul with a white beard who arrives through chimneys to deliver gifts into stockings, before riding off into the night on a sleigh laden with toys and powered by prancing reindeer.
The New York caricaturist Thomas Nast later refined our image of Father Christmas, fattening him up in a series of cartoons that appeared in Harper’s Weekly from 1863 onwards. Nast was also responsible for changing the colour of Santa’s cloak from tan or green to red, decades before the Coca-Cola advertising campaigns of the mid-20th century, featuring Swedish-American artist Haddon Sundblom’s vision of Father Christmas swigging from a bottle of Coke. The first of these ads appeared in 1931, marking a watershed in St Nicholas’s transformation from icon of Christian self-sacrifice to the plump, friendly face of yuletide capitalism.
The question remains why it was specifically St Nicholas, rather than any other saint, who became indelibly linked with Christmas. For Canon Rosenthal, the answer is simple. “St Nicholas hit all the right chords in the hearts, minds and imaginations of the people,” he says. “He went to prison for his faith, he smashed pagan altars, he gave away his wealth, and he even restored three boys to life. Not bad for one person.”
But Prof Cormack is not so sure. “No one understands the reason for the popularity of St Nicholas,” he says. “All the stories [associated with him] are conventional saints’ stuff. He just got lucky.”
by Pat Heyman | Dec 22, 2009 | Movie Reviews, Pat
I saw Avatar yesterday. It is one of the most hyped and promoted movies of the year, yet many of the ads neglect to let you know that the movie is in 3-D. Here are some thoughts and opinions on the movie.
To 3D or Not To 3D
Avatar certainly stands as a breakthrough in 3D technology and movie making, as the movie was shot and designed as a 3D movie from the ground up. The 3D takes a meh movie and turns it into something spectacular. There are moments in the movie where I thought, “If I were seeing this on the regular screen, I would be bored out of my mind, but because it’s 3D, I don’t mind so much.” Think of how the 12-minute podracing scene in The Phantom Menace bogged down the whole middle section of that movie. Well, if George Lucas had made it in 3D, it wouldn’t have.
For the most part, James Cameron doesn’t engage in the cheap trick of hurling 3D objects at you in an attempt to get you to duck. Very few of the 3D effects pop out of the screen. Often the most effective use of 3D in the movie is simple ambiance. For example, as the scientists move through the forest, the bees buzz around them and out of the screen, making it more immersive. It also highlights one of the current weaknesses of 3D technology—fast-moving objects lose focus and clarity. Certainly, the coolest 3D visuals are the virtual and holographic displays in the helicopters and in the science labs. The 3D image of the brain scan is simply amazing.
It will be interesting to see how the technology and movie making develop. For example, I felt distracted from the movie in several instances because the background of some shots is blurred, while the foreground is in sharp focus. Yes, this is similar to how the eye operates, but in real life, you could choose to focus on the background if you wanted to. In the movie, the choice has been made for you.
As many have pointed out, the 3D isn’t perfect, but it is a breakthrough, and in the same way that the first Star Wars trilogy’s effects are painful to watch now, at the time, there was nothing like them. I highly recommend watching it in 3D.
Of course then you have the question of which kind of 3D, RealD or Imax 3D? The best description of the different technologies I have been able to find is this article. Basically, the Imax screen is bigger and closer to you, so even without 3D, it seems more like you are in the movie. You have to turn your head to see different parts of the screen, because the screen is too big for your field of vision. Meanwhile, RealD is shown on a traditional (smaller) movie screen, so the 3D effects seem more “in the screen” than popping out at you.
We saw it on RealD. I don’t think I’m willing to spend the additional $12 to see it again on Imax 3D to see which one I would prefer. I’ll wait for a better movie to do that.
Update: I saw it again on a Liemax screen at Muvico. It was a bit underwhelming. This time, I wasn’t able to sit in the center of the theater, and that definitely made a difference when the effects were “out of the screen”. I really couldn’t tell any difference between the two technologies except that the Imax glasses are much less comfortable to wear for two and a half hours. On the other hand, another friend saw it both on RealD and at a real Imax theater, and he said there was no comparison, the real Imax blew away the RealD version.
Is this Dances With Wolves?
Avatar’s story is clearly derivative. It’s your basic tale of a person who goes to another culture, eventually identifies more with it, and turns on his former “friends”. You’ve seen it done as boringly as can be in Dances With Wolves. You’ve seen it done with visual flair in The Last Samurai. And now you’ve seen it in 3D in Avatar. As always, the best aspects of these movies are the joy of discovery and the gradual acceptance of the outsider into the society. Avatar excels in the discovery department, mostly because of the 3D visuals. The story and the gradual acceptance take a back seat.
The movie is quite preachy on many levels, and this is its major flaw. No one wants to go to an escape movie to be preached at—especially an escape movie of this scale. The themes of terrorism, U.S. army occupation, ignorant Americans, and environmental destruction are all present, even though some of them seem very strained. The cartoonish, over-the-top, Rambo-style head of security says, “We’ll fight terrorism with terrorism,” even though the guys in blue hadn’t done anything aggressive, let alone committed terrorism.
I also don’t understand why movie makers have to confuse science and religion, nature and the supernatural, so thoroughly. George Lucas completely ruined the Force by making it the result of midi-chlorians. We were fine accepting that in the Star Wars world the Force existed as a supernatural phenomenon, but once it is revealed that the Force is nothing more than the action of mitochondria midi-chlorians, the whole thing just becomes hokey. In Avatar, the aliens and animals have some kind of exposed neural interface that they can use to communicate. Some of the plants also have it, and when one of the aliens dies, they bury the body with a seed, and the consciousness of the person is preserved in the resulting tree, making the forest a giant planetary neural network of ancestors—a very cool idea. Then James Cameron has to ruin it by reducing it to nothing more than tribal animism. For me, this is the weakest aspect of the movie.
Despite the preachiness, much of it is simply James Cameron being James Cameron. The Terminator shows his distrust of growing technology—ironic for someone who pushes it so. Watch Aliens, and you’ll see the same basic ideas and themes:
- Tough talking Hispanic soldier? Check
- Evil corporation exploiting aliens? Check
- Weaselly corporate suit who doesn’t understand what he’s up against? Check
- Cool military hardware? Check
- Epic battles ending up in a one on one mech duel? Check
- Sigourney Weaver? Check
Other thoughts and observations
Apparently you can show naked breasts on the big screen and still get a PG-13 rating, as long as they belong to blue aliens. (Must be the 2010 equivalent of National Geographic documentaries.)
Who would have thought that Zoe Saldana would be sexier as an eight foot, blue, tiger striped alien than as a Federation officer?
Is it just me or do the Na’vi look remarkably like Vincent (played by the ineffable Ron Perlman) from Beauty and the Beast which happened to also star Linda Hamilton who starred in both Terminator movies and married James Cameron?
So should I see it?
If you like movies, by all means yes. (In fact, you should see it in both Imax and RealD and then let me know which is better and why.) It’s a very entertaining movie with spectacular visual effects. So watch it, ignore the silly and preachy aspect, and be excited for the coming 3D developments in movie making…now if Peter Jackson would just go back and remake the Lord of the Rings in 3D…
by Pat Heyman | Dec 22, 2009 | Pat
I just saw this story on MadOgre.com.
11-25-09: Rule One: Handle all firearms as if they were loaded. This Just Happened. We had a lady bring in an old 12 gauge Winchester 1300 shotgun for trade. Travis and I both check it, cycled the action. When I looked down into the action, I didn’t see any shells in there… the action was cycled probably 20 times. It was filthy, gritty, and foul… it felt like it was full of sand and on top of that it felt too tight. Travis hit it with some gun oil and cycled it a couple more times. Then Marcus cycled it. And then all the sudden – Ker-Chunk! A Live Shell popped onto the shell lifter and there it was. A live round in the gun. What happened evidently was that because the gun was so old and so completely filthy, the feed mechanism was bound up. After a shot of some spray in oil and some working, it became unbound and was then able to feed that unseen shell. This was pretty scary, because there was the hidden potential for an accident. However because everyone followed the 4 Rules, that accident didn’t happen. But it could have had we let our guard down. Because we had all thought the gun was unloaded… here we are looking forward to getting off work early… looking forward to the holiday, getting a little lax… but because we practiced the 4 Rules we avoided what could have been a disaster. A gun shop in Colorado not too long ago had an employee working on a gun… shot and killed another employee… it can happen. Firearms are like poisonous snakes… you can handle them safely, but the moment you disrespect them – they can bite you.
Follow the Rules. Always.
For those not in the know, the four rules—or laws—of gun safety were devised by Colonel Jeff Cooper, who developed and popularized (if not invented) the modern technique of the pistol. They are:
- All guns are always loaded
- Never let the muzzle cover anything you don’t want to destroy.
- Keep your finger off the trigger until your sights are on the target [and you are ready to shoot].
- Always be sure of your target.
As Randy Cain says, you can break one rule and be okay, but if you break two rules it’s going to end in pain. Although it’s rule 3, Jeff Cooper is reported to have acknowledged that keeping one’s finger off the trigger would prevent most gun accidents. If you watch TV you’ll see rule 3 violations all over the place—and Jack Bauer is one of the worst offenders.
Update: Repeatedly running the slide (as described in the story) is not the correct way to check whether a pump-action shotgun is unloaded. The correct way is to check the chamber (visually and by touch) and to check the magazine tube for the presence of the follower. So apparently, despite working at a gun shop, they didn’t know how (or didn’t care) to check correctly whether the gun was unloaded. And as several of my friends have pointed out, the first rule is "All guns are always loaded!" not "Treat all guns as if they were loaded," as maintained in the quoted excerpt.
by Pat Heyman | Dec 17, 2009 | Uncategorized
Inconvenient truth for Al Gore as his North Pole sums don’t add up
There are many kinds of truth. Al Gore was poleaxed by an inconvenient one yesterday.
The former US Vice-President, who became an unlikely figurehead for the green movement after narrating the Oscar-winning documentary An Inconvenient Truth, became entangled in a new climate change “spin” row.
Mr Gore, speaking at the Copenhagen climate change summit, stated the latest research showed that the Arctic could be completely ice-free in five years.
In his speech, Mr Gore told the conference: “These figures are fresh. Some of the models suggest to Dr [Wieslaw] Maslowski that there is a 75 per cent chance that the entire north polar ice cap, during the summer months, could be completely ice-free within five to seven years.”
However, the climatologist whose work Mr Gore was relying upon dropped the former Vice-President in the water with an icy blast.
“It’s unclear to me how this figure was arrived at,” Dr Maslowski said. “I would never try to estimate likelihood at anything as exact as this.”
Mr Gore’s office later admitted that the 75 per cent figure was one used by Dr Maslowski as a “ballpark figure” several years ago in a conversation with Mr Gore.
The embarrassing error cast another shadow over the conference after the controversy over the hacked e-mails from the University of East Anglia’s Climate Research Unit, which appeared to suggest that scientists had manipulated data to strengthen their argument that human activities were causing global warming.
Mr Gore is not the only titan of the world stage finding Copenhagen to be a tricky deal.
World leaders — with Gordon Brown arriving tonight in the vanguard — are facing the humiliating prospect of having little of substance to sign on Friday, when they are supposed to be clinching an historic deal.
Meanwhile, five hours of negotiating time were lost yesterday when developing countries walked out in protest over the lack of progress on their demand for legally binding emissions targets from rich nations. The move underlined the distrust between rich and poor countries over the proposed legal framework for the deal.
Last night key elements of the proposed deal were unravelling. British officials said they were no longer confident that it would contain specific commitments from individual countries on payments to a global fund to help poor nations to adapt to climate change while the draft text on protecting rainforests has also been weakened.
Even the long-term target of ending net deforestation by 2030 has been placed in square brackets, meaning that the date could be deferred. An international monitoring system to identify illegal logging is now described in the text as optional, where before it was compulsory. Negotiators are also unable to agree on a date for a global peak in greenhouse emissions.
Perhaps Mr Gore had felt the need to gild the lily to buttress resolve. But his speech was roundly criticised by members of the climate science community. “This is an exaggeration that opens the science up to criticism from sceptics,” Professor Jim Overland, a leading oceanographer at the US National Oceanic and Atmospheric Administration said.
“You really don’t need to exaggerate the changes in the Arctic.”
Others said that, even if quoted correctly, Dr Maslowski’s six-year projection for near-ice-free conditions is at the extreme end of the scale. Most climate scientists agree that a 20 to 30-year timescale is more likely for the near-disappearance of sea ice.
“Maslowski’s work is very well respected, but he’s a bit out on a limb,” said Professor Peter Wadhams, a specialist in ocean physics at the University of Cambridge.
Dr Maslowski, who works at the US Naval Postgraduate School in California, said that his latest results give a six-year projection for the melting of 80 per cent of the ice, but he said he expects some ice to remain beyond 2020.
He added: “I was very explicit that we were talking about near-ice-free conditions and not completely ice-free conditions in the northern ocean. I would never try to estimate likelihood at anything as exact as this,” he said. “It’s unclear to me how this figure was arrived at, based on the information I provided to Al Gore’s office.”
Richard Lindzen, a climate scientist at the Massachusetts Institute of Technology who does not believe that global warming is largely caused by man, said: “He’s just extrapolated from 2007, when there was a big retreat, and got zero.”
by Pat Heyman | Dec 15, 2009 | Uncategorized
How Barack Obama and Ben Bernanke are destroying the dollar — and perhaps ushering in the amero
By Robert P. Murphy
First under the Bush Administration and even more so under President Obama, the federal government has been seizing power and spending money as it hasn’t done since World War II. But as bold as the Executive Branch has been during this financial crisis, the innovations of Fed chairman Ben Bernanke have been literally unprecedented. Indeed, it is entirely plausible that before Obama leaves office, Americans will be using a new currency.
Bush and Obama have engaged in record peacetime deficit spending; so too did Herbert Hoover and then Franklin Roosevelt (even though in the 1932 election campaign, FDR promised Americans a balanced budget). Bush and Obama approved massive federal interventions into the financial sector, at the behest of their respective Treasury secretaries. Believe it or not, in 1932 the allegedly “do-nothing” Herbert Hoover signed off on the creation of the Reconstruction Finance Corporation (RFC), which was given billions of dollars to prop up unsound financial institutions and make loans to state and local governments. And as with so many other elements of the New Deal, FDR took over and expanded the RFC that had been started under Hoover.
In the past year, the government has seized control of more than half of the nation’s mortgages, it has taken over one of the world’s biggest insurers, it literally controls major car companies, and it is now telling financial institutions how much they can pay their top executives. On top of this, the feds are seeking vast new powers over the nation’s energy markets (through the House Waxman-Markey “Clean Energy and Security Act” and pending Kerry-Boxer companion bill in the Senate) and, of course, are trying to “reform” health care by creating expansive new government programs.
For anyone who thinks free markets are generally more effective at coordinating resources and workers, these incredible assaults on the private sector from the central government surely must translate into a sputtering economy for years. Any one of the above initiatives would have placed a drag on a healthy economy. But to impose the entire package on an economy that is mired in the worst postwar recession is a recipe for disaster.
Debt and Inflation
Conventional economic forecasts for government tax receipts are far too optimistic. The U.S. Treasury will need to issue far more debt in the coming years than most analysts now realize. Yet even the optimistic forecasts are sobering. For example, in March the Congressional Budget Office projected that the Obama administration’s budgetary plans would lead to a doubling of the federal debt as a share of the economy, from 41 percent of GDP in 2008 to 82 percent of GDP by 2019. The deficit for fiscal year 2009 (which ended Sept. 30) alone was $1.4 trillion. For reference, the entire federal budget was less than $1.4 trillion in the early years of the Clinton administration.
Clearly the U.S. government will be incurring massive new debts in the years to come. The situation looks so grim that economist Jeffrey Hummel has predicted that the Treasury will default on its obligations, just as Russia defaulted on its bonds in 1998. But another scenario involves the Federal Reserve wiping out the real burden of the debt by writing checks out of thin air to buy up whatever notes the Treasury wants to issue.
Many analysts are worried about Fed chairman Ben Bernanke’s actions during the financial crisis; Marc Faber is openly warning of “hyperinflation.” To understand what the fuss is about, consider some facts about our monetary and banking system.
The United States has a fractional reserve banking system. When someone deposits $100 in a checking account, most of that money is lent out again to other bank customers. Only a fraction—typically around 10 percent—needs to be held “on reserve” to back up the $100 balance of the original depositor. A bank’s reserves can consist of either cash in the vault or deposits with the Federal Reserve itself. For example, suppose a given bank has customer checking accounts with a combined balance of $1 billion. Assuming a 10 percent reserve requirement, the bank needs $100 million in reserves. It can satisfy this legal requirement by keeping, say, $30 million in actual cash on hand in its vaults and putting $70 million on deposit in the bank’s account with the Fed.
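To make the reserve arithmetic concrete, here is a minimal sketch in Python using the illustrative figures from the paragraph above (the 10 percent ratio comes from the text; the function and variable names are mine, chosen only for illustration):

```python
# Minimal sketch of the reserve-requirement arithmetic described above.
# The 10% ratio and the dollar figures are the illustrative values from the text.

def required_reserves(deposits: float, reserve_ratio: float = 0.10) -> float:
    """Reserves a bank must hold against its checking-account balances."""
    return deposits * reserve_ratio

deposits = 1_000_000_000       # $1 billion in customer checking accounts
vault_cash = 30_000_000        # $30 million of cash held in the vaults
fed_deposits = 70_000_000      # $70 million on deposit at the Federal Reserve

total_reserves = vault_cash + fed_deposits
print(required_reserves(deposits))                    # 100000000.0 -> $100 million required
print(total_reserves >= required_reserves(deposits))  # True: the requirement is satisfied
```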
Normally, the Fed expands the money supply by engaging in “open market operations.” For example, the Fed might buy $1 billion worth of government bonds from a dealer in the private sector. The Fed adds the $1 billion in bonds to the asset side of its balance sheet, while its liabilities also increase by $1 billion. But Bernanke faces no real constraints on his purchasing decisions. When the Fed buys $1 billion in new bonds, it simply writes a $1 billion check on itself. There is no stockpile of money that gets drained because of the check; the recipient simply deposits the check in his own bank, and the bank in turn sees its reserves on deposit with the Fed go up by $1 billion. In principle, the Fed could write checks to buy every asset in America.
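As a rough illustration of the bookkeeping just described (a toy sketch, not a model of actual Fed accounting; the dictionary fields are my own labels), an open market purchase expands both sides of the Fed's balance sheet and raises the banking system's reserves by the same amount:

```python
# Toy double-entry sketch of a $1 billion open market purchase, as described above.
# Real central-bank accounting has many more line items; this only shows the direction of the entries.

fed = {"assets_bonds": 0, "liabilities_bank_reserves": 0}
banking_system = {"reserves_at_fed": 0}

def open_market_purchase(amount: int) -> None:
    """The Fed buys bonds with a check written on itself."""
    fed["assets_bonds"] += amount               # bonds land on the Fed's asset side
    fed["liabilities_bank_reserves"] += amount  # the seller's bank gains reserves, a Fed liability
    banking_system["reserves_at_fed"] += amount # system-wide reserves rise by the same amount

open_market_purchase(1_000_000_000)
print(fed)
print(banking_system)
```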
Monetary Catastrophe
Since the start of the present financial crisis, the Federal Reserve has implemented extraordinary programs to rescue large institutions from the horrible investments they made during the bubble years. Because of these programs, the Fed’s balance sheet more than doubled from September 2008 to the end of the year, as Bernanke acquired more than a trillion dollars in new holdings in just a few months.
If Bernanke has been so aggressive in creating new money, why haven’t prices skyrocketed at the grocery store? The answer is that banks have chosen to let their reserves with the Fed grow well above the legal minimum. In other words, banks have the legal ability to make new loans to customers, but for various reasons they are choosing not to do so. This chart from the Federal Reserve shows these “excess reserves” in their historical context.
U.S. depository institutions have typically lent out their excess reserves in order to earn interest from their customers. Yet currently the banks are sitting on some $850 billion in excess reserves, because (a) the Fed began paying interest on reserves in October 2008, and (b) the economic outlook is so uncertain that financial institutions wish to remain extremely liquid.
The chart explains why Faber and others are warning about massive price inflation. If and when the banks begin lending out their excess reserves, they will have the legal ability to create up to $8.5 trillion in new money. To understand how significant that number is, consider that right now the monetary aggregate M1—which includes physical currency, traveler’s checks, checking accounts, and other very liquid assets—is a mere $1.7 trillion.
What does all this mean? Quite simply, it means that if Bernanke sits back and does nothing more, he has already injected enough reserves into the financial system to quintuple the money supply held by the public. Even if Bernanke does the politically difficult thing, jacking up interest rates and sucking out half of the excess reserves, there would still be enough slack in the system to triple the money supply.
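A back-of-the-envelope calculation shows where these multiples come from. This is only a sketch: it assumes the simple 10 percent reserve ratio and the rounded dollar figures quoted above, and the textbook money multiplier it uses ignores cash drains and other real-world leakages.

```python
# Money-multiplier arithmetic using the rounded figures quoted in the text.
excess_reserves = 850_000_000_000     # ~$850 billion of excess reserves
reserve_ratio = 0.10                  # assumed 10% reserve requirement
m1 = 1_700_000_000_000                # M1 of roughly $1.7 trillion

potential_new_money = excess_reserves / reserve_ratio
print(potential_new_money)              # 8500000000000.0 -> up to $8.5 trillion of new money
print(potential_new_money / m1)         # 5.0 -> five times the current M1 (one reading of the "quintuple" claim)
print((potential_new_money / 2) / m1)   # 2.5 -> even half the slack is 2.5x M1 (roughly the "triple" claim)
```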
The End of the Dollar?
Aware of the above considerations, central banks around the world have been quietly distancing themselves from the U.S. dollar. Over the summer, officials in India, China, and Russia opined publicly on the desirability of a new global financial system, anchored on a basket of currencies or even gold.
We thus have in motion two huge trains of supply and demand, and the result will be an inevitable crash in the value of the dollar. Just as the Federal Reserve is embarking on a massive printing spree, the rest of the world is looking to dump its dollar holdings. It’s impossible to predict the exact timing, but sooner or later the dollar will fall very sharply against commodities and other currencies.
A crashing dollar will translate immediately into huge spikes in the price of gasoline and other basic items tied to the world market. After a lag, prices at Wal-Mart and other stores will also skyrocket, as their reliance on “cheap imports from Asia” will no longer be possible when the price of the dollar against the Chinese yuan falls by half.
The consequences will be so dramatic that what now may sound like a “conspiracy theory” could become possible. Fed officials might use such an opportunity to wean Americans from the U.S. dollar. Influential groups such as the Council on Foreign Relations have discussed the desirability of coordination among the North American governments. For example, CFR president Richard N. Haass wrote in the foreword to a 2005 Task Force report titled “Building a North American Community”:
The Task Force offers a detailed and ambitious set of proposals that build on the recommendations adopted by the three governments [Canada, the U.S., and Mexico] at the Texas summit of March 2005. The Task Force’s central recommendation is establishment by 2010 of a North American economic and security community, the boundaries of which would be defined by a common external tariff and an outer security perimeter.
The “Texas summit of March 2005” refers to the “Security and Prosperity Partnership (SPP) of North America,” which came out of a meeting in Waco, Texas between President George W. Bush, Canadian Prime Minister Paul Martin, and Mexican President Vicente Fox. For the record, the federal government’s website has a special section devoted to refuting the (alleged) myths of the SPP, including the claim that the SPP is a prelude to a North American Union, comparable to the European Union. Yet despite the official protestations to the contrary, the global trend toward ever larger political and monetary institutions is undeniable. And there is a definite logic behind the process: with governments in control of standing armies, the only real check on their power is the ability of their subjects to change jurisdictions. By “harmonizing” tax and regulatory regimes, various countries can extract more from their most productive businesses. And by foisting a fiat currency into the pockets of more and more people, a government obtains steadily greater control over national—or international—wealth.
But if indeed key players had wanted to create a North American Union with a common currency, up till now they would have faced an insurmountable barrier: the American public would never have agreed to turn in their dollars in exchange for a new currency issued by a supranational organization. The situation will be different when the U.S. public endures double-digit price inflation, even as the economy still suffers from the worst unemployment since the Great Depression. Especially if Obama officials frame the problem as an attack on the dollar by foreign speculators, and point to the strength of the euro, many Americans will be led to believe that only a change in currency can save the economy.
For those who consider such a possibility farfetched, remember that one of FDR’s first acts as president was to confiscate monetary gold held by U.S. citizens, under threat of imprisonment and a huge fine. Yet nowadays, that massive crime is described as “taking us off the gold standard” which “untied the Fed’s hands and allowed it to fight the Depression.” The same will be said in future history books, when they explain matter-of-factly the economic crisis that gave birth to the amero.
What Can One Man Do?
If events play out as described, what should average investors do right now to protect themselves? First and most obvious, they should rid themselves of dollar-denominated assets. For example, government and corporate bonds promising to make a fixed stream of dollar payments will all become virtually worthless if huge price inflation occurs. (In contrast, holding U.S. stocks is not a bad idea from the point of view of inflation; a stock entitles the owner to a portion of the revenue stream from a company’s sales, which themselves will rise along with prices in general.)
Second, investors should acquire an emergency stockpile of gold and silver. If and when dollar prices begin shooting through the roof, there will be a lag for most workers: They will see the prices of milk, eggs, and gasoline increasing by the week, yet their paychecks will remain the same for months or longer. If the dollar crashes in the foreign exchange markets, gold and silver would see their prices (quoted in U.S. dollars) rise correspondingly.
We can’t know the timing of the impending monetary catastrophe, but it is coming. Smart investors will minimize their dependence on the dollar before it crashes. At this late date, no one should trust the government and media “experts” who assure us that the worst is over.
Robert P. Murphy has a Ph.D. in economics from New York University. He is an economist with the Institute for Energy Research and author of The Politically Incorrect Guide to the Great Depression and the New Deal.