Search Results

You searched for: Journal: The Objective Standard

  • Author: Paul Hsieh
  • Publication Date: 10-2009
  • Content Type: Journal Article
  • Journal: The Objective Standard
  • Institution: The Objective Standard
  • Abstract: Shows that, contrary to proposals being put forth by Republicans, a genuinely free market in health insurance is not only moral, in that it respects the rights of producers and consumers, but also practical, in that it enables businessmen to solve problems for profit-which leads to more and better products and services at lower prices for consumers.
  • Topic: Food
  • Author: Craig Biddle
  • Publication Date: 10-2009
  • Content Type: Journal Article
  • Journal: The Objective Standard
  • Institution: The Objective Standard
  • Abstract: Author's note: This is chapter 3 of my book Loving Life: The Morality of Self-Interest and the Facts that Support It (Richmond: Glen Allen Press, 2002), which is an introduction to Ayn Rand's morality of rational egoism. Chapters 1 and 2 were reprinted in the prior two issues of TOS. In the book, this chapter is titled "To Be Or Not To Be: The Basic Choice." In chapter 2, we encountered the problem known as the "is-ought" dichotomy, the notion that moral principles (principles regarding what people "ought" to do) cannot be derived from the facts of reality (from what "is"). We also saw that this problem persists for lack of an observation-based, objective standard of value. Here we turn to the solution to that problem. First, we will discover just such a standard; then, we will discover a number of objective moral principles-principles in accordance with that standard. To begin, note that the basic fact that makes morality such a difficult subject is the very fact that makes it a subject in the first place: free will. As human beings we have the faculty of volition, the power of choice; we choose our actions. This fact gives rise to our need of morality. Indeed, the realm of morality is the realm of choice. What makes the issue complicated is the fact that our choices are guided by our values-which are also chosen. This is why it is so difficult to get to the bottom of morality: Human values are chosen-every last one of them. Consequently, people's values seem to differ in every imaginable way. Some people choose to play soccer; they value footwork, teamwork, and winning. Some choose to dance ballet; they value grace, poise, and flight. And some choose to attend church; they value sermons, faith, and prayer. A person who goes hiking values the scenery and exercise. One who goes fishing values the nibble and catch. And one who takes heroin values the so-called "high." A person who steals jewelry values "free stuff."
One who makes jewelry values craftsmanship. A sculptor values the process of creating art. A software developer values that creative process. A student who cheats on a test values "getting away" with it. One who studies for the test values the knowledge he gains thereby. A doctor specializing in internal medicine values the process of curing disease. A terrorist specializing in biological warfare values the process of spreading disease. A man who treats his wife with respect values certain qualities in her. One who abuses his wife values having power over her. A general who fights for mandatory "volunteerism" values involuntary servitude. One who fights to defend individual rights values freedom. And so on. Different people act in different ways; they value different things. So the question is: How do we know if our choice of values is good or bad, right or wrong? What is our standard of value? As we have already seen, if we do not consciously hold something as our standard of value, then we have nothing by reference to which we can determine what goals we should or should not pursue-how we should or should not act. And if we do not hold something rationally provable as our standard of value, then we default to some form of subjectivism-personal, social, or "supernatural"-which can lead only to human sacrifice, suffering, and death. If we want to live and achieve happiness, we need a non-sacrificial standard of value that is grounded in perceptual evidence-facts we can see. In search of such a standard, the proper approach is to turn not to personal opinion or social convention or "super-nature," but to actual nature and ask, as the American philosopher Ayn Rand did: "What are values? Why does man need them?" . . .
  • Political Geography: America
  • Author: Dina Schein Federman
  • Publication Date: 10-2009
  • Content Type: Journal Article
  • Journal: The Objective Standard
  • Institution: The Objective Standard
  • Abstract: People today sense that something is wrong with the world and are searching for answers. What they generally find is disappointing. Skeptics tell us that there is no clear-cut right and wrong in any issue, that all issues are "complex," that wisdom consists of dropping the notion that there are absolute truths. The most prevalent alternative to the skeptical, relativist position comes from religionists, who accept the existence of absolute truths but insist that they may be found only within a religious framework-a belief in a supernatural being who is the source of truth and morality. Both camps agree that absolutes cannot be discovered by a rational process. Both camps agree that morality consists of selfless service to others. Both camps support the welfare state. Ayn Rand rejects all these claims and sweeps aside both skepticism and mysticism. Her philosophy, Objectivism, holds that reality is an objective absolute, independent of anyone's beliefs or feelings; that reason, based on the evidence of the senses, is our only means of knowing reality and, consequently, our only proper guide to action; that each man is an end in himself, not the means to the ends of others, and, therefore, that the pursuit of his own rational self-interest and happiness is the highest moral purpose of his life; that the proper political system is that of laissez-faire capitalism, in which men deal with one another as "traders, by free, voluntary exchange to mutual benefit."1 The reader may find the elucidation of her philosophical principles and their application in her novels, essays, and cultural commentary. Objectively Speaking: Ayn Rand Interviewed is a recent addition to this body of work. It is a collection of radio and television interviews conducted with Ayn Rand from 1932 to 1981, in which she applies Objectivism to current events. 
Starting with her earliest known interview at age twenty-seven, it goes on to include a series of interviews conducted with her at Columbia University from 1962 to 1966, in which students and professors asked her questions on the principles of Objectivism and their application. It also includes a series of interviews in various media, ranging from the 1959 interview with Mike Wallace to her final public appearance, a 1981 interview with Louis Rukeyser. The epilogue is an interview with Dr. Leonard Peikoff, Rand's best student, heir, and the leading exponent of her philosophy, in which he recounts his thirty-year professional and personal association with her. Among the topics Rand discusses in her interviews are the political structure of a free society, the American constitution, objective law, the nature of capitalism and various myths about it, why political conservatives are worse enemies of capitalism than the leftists, the crucial need for a free press, proper foreign policy, the moral nature of businessmen, education, the arts, the nature of humor, the foundations of morality, individual rights, and many others. For example, in one interview from the 1960s, during a discussion of the origin of individual rights, Rand is asked to elucidate her rejection of various alleged "rights," such as rights to a minimum wage, free education and medical care, and the like. She explains that because jobs, education, medical care, and other goods and services do not grow on trees but are produced and provided by individuals and businesses, a "right" to these things means that the providers are to be forced to serve those who allegedly have a right to the largesse, which is slavery. "Nobody can have a right to the unearned. . . . [These things] can only come from other men-and nobody may claim the right to enslave others" (pp. 154-55). 
She explains that the only political-economic system in which force is banished from human relations is the system of laissez-faire capitalism, in which men deal with one another as traders, voluntarily exchanging value for value to mutual benefit. Discussing the nature of capitalism and debunking the myths that surround it, Rand answers the allegation that government must regulate the economy in order to prevent financial crises: "Depressions and panics are the result of government intervention in the economy-specifically, government manipulation of credit and money. That was the cause of the Depression of 1929. Once more, it is capitalism that is taking the blame for the evils created by its opposite: statist intervention" (p. 42). In order to prevent financial crises, she counsels, the government must stay out of the economy. . . .
  • Topic: Foreign Policy
  • Author: Daniel Wahl
  • Publication Date: 10-2009
  • Content Type: Journal Article
  • Journal: The Objective Standard
  • Institution: The Objective Standard
  • Abstract: Nine-year-old Warren Buffett is in his yard playing in the snow. Warren is catching snowflakes. One at a time at first. Then he is scooping them up by handfuls. He starts to pack them into a ball. As the snowball grows bigger, he places it on the ground. Slowly it begins to roll. He gives it a push, and it picks up more snow. Soon he reaches the edge of the yard. After a moment of hesitation, he heads off, rolling the snowball through the neighborhood. And from there, Warren continues onward, casting his eye on a whole world full of snow (prologue). Many decades later, Alice Schroeder, a former insurance analyst at Morgan Stanley and the author of The Snowball, is sitting in front of Warren Buffett, one of the world's richest men. "Where did it come from," she asks, "caring so much about making money?" Buffett leans forward, "more like a teenager bragging about his first romance than a seventy-two-year-old financier," and begins to tell his story: "Balzac said that behind every great fortune lies a crime. That's not true at Berkshire [Hathaway]" (p. 4). Thus begins The Snowball, one of the most highly anticipated biographies of the past few years and the first to be written about Buffett with his full cooperation. As its full title indicates, The Snowball: Warren Buffett and the Business of Life sets out to present Buffett's thinking, not only about business but about life in general. Among the many topics this hefty volume explores are those individuals who influenced his thinking. A major figure in this respect is Buffett's father, whom he idolizes and from whom he learned a crucial point when it comes to judging both oneself and others: The big question about how people behave is whether they've got an Inner Scorecard or an Outer Scorecard. It helps if you can be satisfied with an Inner Scorecard. I always pose it this way, I say: "Lookit. Would you rather be the world's greatest lover, but have everyone think you're the world's worst lover? 
Or would you rather be the world's worst lover but have everyone think you're the world's greatest lover?" . . . Now my dad: He was a hundred percent Inner Scorecard guy. He was really a maverick. But he wasn't a maverick for the sake of being a maverick. He just didn't care what other people thought (p. 33). In addition to the valuable lessons he learned from important figures in his life, Schroeder shows how Buffett's own interests and thinking during childhood contributed to his development. Schroeder reveals him to have been an efficacious child, intensely interested in collecting and processing facts. . . .
  • Political Geography: New York
  • Author: Scott Holleran
  • Publication Date: 10-2009
  • Content Type: Journal Article
  • Journal: The Objective Standard
  • Institution: The Objective Standard
  • Abstract: In the light and lively Fred Astaire, author and journalist Joseph Epstein offers an excellent overview of the career of the world's greatest male ballroom and tap dancer. This short biography, part of Yale University's Icons of America series, is like its subject-accessible yet elegant. Astaire began his dance training at the age of five after his mother, Johanna Austerlitz, brought him to New York City in the hopes of grooming him and his talented older sister, Adele, for careers in show business. Attending dance school with his sister, young Frederick took to the art form and was soon rehearsing with Adele in routines developed by their instructor. Changing their last names to "Astaire," the brother-sister act hit the theatrical circuit and began a professional career that lasted many years and included appearances on Broadway with Al Jolson and Fanny Brice; and work with famed showman Flo Ziegfeld, who paid the duo an impressive $5,000 per week during the Depression (pp. 12, 15). After Adele retired at the age of thirty-five, Astaire sought fortune in Hollywood. Shortly after being famously dismissed by a studio executive as "Balding. Can't sing. Dances a little." (p. 18), Astaire was noticed by Metro-Goldwyn-Mayer (MGM) and signed to a three-week contract for $1,500 per week. His first role-playing himself in Dancing Lady opposite Joan Crawford-proved that he had potential as a screen star.
  • Political Geography: New York
  • Author: Daniel Wahl
  • Publication Date: 10-2009
  • Content Type: Journal Article
  • Journal: The Objective Standard
  • Institution: The Objective Standard
  • Abstract: In 1849, millions were starving due to a then-mysterious disease that "in a matter of days, if not hours, could transform a thriving field [of potatoes] into a slimy, foul-smelling patch of rotting vegetation" (pp. 38-39). Everywhere, plants were "extremely inconspicuous, [tasted] terrible, or [went] to seed in a fast and fabulously prolific way, leaving nothing behind to harvest." And plants had evolved characteristics by which they survived just long enough to reproduce, characteristics that were unsuitable for feeding a large and fast-growing population (p. 40). But all this was about to change, and dramatically so, thanks to a man born that year: Luther Burbank (p. 6). In her new book The Garden of Invention: Luther Burbank and the Business of Breeding Plants, Jane S. Smith presents the life of this extremely influential but mostly forgotten plant breeder and businessman, emphasizing his innovations and the methods he used to develop and sell them. Burbank displayed some mechanical ingenuity as a child, but, Smith reveals, apart from this, nothing in Burbank's background suggested that he might become an inventor of new plants (p. 19). Though young men of his time were encouraged to work in an academic setting, Burbank, fond as he was of the outdoors, was unsure whether he wanted to follow suit-until, at the age of 21, he picked up a copy of Charles Darwin's The Variation of Animals and Plants Under Domestication. Smith describes this book as a "detail-crammed response to those who had criticized On the Origin of Species by Means of Natural Selection as a hypothesis unsupported by sufficient proof" (p. 
27), then concisely sums up what Burbank read: From gooseberries to gladioli, Darwin compiled his evidence: plants changed in response to outside stimulus (like the cabbages Darwin described that changed their shape or color when planted in different countries), and these changes could happen within a short time span (like the hyacinths he said growers had managed to improve from the offerings of only a few generations earlier). The causes of the changes were still largely unknown, but their occurrence was a fact beyond dispute. This was evolution measured in human time (p. 28). Burbank took from the book several big ideas-each of which Smith relates in an easy-to-read style: The first was that it was possible to force the emergence of latent differences in fruits and flowers, even to the point of generating what seemed to be entirely new varieties. Still more exciting was Darwin's tentative suggestion that selecting, grafting, hybridizing, or simply moving a plant to a new environment might spur changes that would persist over generations. According to Darwin, these alterations were often inadvertent, but as Burbank immediately realized, such happy accidents could also be deliberately pursued. The creation of new plant varieties, something far beyond the familiar efforts to breed the best of an existing stock, did not need to wait for the slow accumulation of natural advantages Darwin had described in his Origin of Species. Evolutionary change could be accelerated by human intervention (p. 28). Darwin's words "opened up a new world" for Burbank (p. 27). Not only did the book give him an intellectual framework for viewing the world and man's place in it, but an advertisement on the last page of Burbank's edition of Darwin's tract (for a book called Gardening for Profit) enabled him to see for the first time his place in it. 
He would not have to choose "between the outdoor life and the inventor's bench" after all-plant life "could be a subject for experimentation and improvement, and a commercial garden could provide a good living for an imaginative and enterprising man."
  • Political Geography: New York
  • Author: Craig Biddle
  • Publication Date: 12-2009
  • Content Type: Journal Article
  • Journal: The Objective Standard
  • Institution: The Objective Standard
  • Abstract: Merry Christmas, readers! And welcome to the Winter 2009-10 issue of The Objective Standard.
  • Topic: Government
  • Political Geography: United States
  • Author: Craig Biddle
  • Publication Date: 12-2009
  • Content Type: Journal Article
  • Journal: The Objective Standard
  • Institution: The Objective Standard
  • Abstract: Michael Dahlen's article "The Rise of American Big Government" [TOS, Fall 2009] is a clarifying survey, in essentials, of the interventionism that has eroded freedom in America for more than a century. But as to the alleged economic successes of Reagan and Clinton, weren't these funded with deficit financing and inflation? I'd like to hear Mr. Dahlen's thoughts on this.
  • Topic: Government
  • Political Geography: America
  • Author: Cassandra Clark
  • Publication Date: 12-2009
  • Content Type: Journal Article
  • Journal: The Objective Standard
  • Institution: The Objective Standard
  • Abstract: Pharmaceutical industry executives are frequently accused of greedily putting "profits before patients" (as if drug companies could profit by means other than serving patients). This accusation would be unjust if these executives were after profits. Unfortunately, however, today's pharmaceutical executives are not after profits. They are after loot. They seek to gain, through legislation, money coercively taken by the government from American citizens. But, unbeknownst to these executives, their looting is self-destructive. In fact, by aiding and abetting the government in its violation of individual rights, the pharmaceutical industry is committing suicide. To see why, let us begin by examining some of the ways in which the industry calls for the violation of rights and receives loot as a result. Then we will turn to the reasons why this practice is killing the pharmaceutical industry. Consider the industry's support for the Medicare Modernization Act of 2003 (MMA). The MMA expanded Medicare to include coverage of prescription drugs for Americans over the age of 65 and was the largest expansion of welfare in America since the creation of Medicare itself.1 When the Act took effect in 2006, it made the U.S. federal government the single largest purchaser of prescription drugs in America.2 In 1999, years before this bill had been conceived, Alan Holmer, then president of Pharmaceutical Research and Manufacturers of America (PhRMA), the industry's lobby group, made clear in a trade journal the industry's view that "the question is not whether, but how, to expand Medicare coverage of prescription drugs."3 In 2000, Holmer testified before the Senate Finance Committee that at "some point in the not-too-distant future, a Congress will pass, and a President will sign, legislation to expand drug coverage for Medicare beneficiaries. . . . Expanded drug coverage for seniors will be a positive development." 
Holmer emphasized: The pharmaceutical industry strongly supports . . . expanding Medicare coverage of prescription medicines. . . . Medicare beneficiaries need high-quality health care, and prescription medicines often offer the most effective therapy for them. We believe that the best way to expand prescription drug coverage for Medicare beneficiaries is through comprehensive Medicare reform.4 The pharmaceutical industry got its desired "reform," and when the MMA became law, the government not only began dictating the terms by which private insurers would provide prescription drug coverage to Medicare beneficiaries, it also began spending tens of billions of dollars annually to subsidize that coverage. From where does the U.S. government get this money? The government does not create wealth; it does not produce anything. Every penny the government spends on drugs (or anything else) comes from taxpayers. The government gets this money by taking it under threat of force from hard-working Americans (or by printing or borrowing it, which is deferred taxation). This is legalized theft; the money taken by force is loot. And when the government spends this loot on prescription drugs for the elderly, the loot is passed on to the pharmaceutical industry. Now, merely receiving loot from the government does not in and of itself constitute the moral crime of complicity in the government's coercion. But the pharmaceutical industry is not merely receiving money from the government as a result of the MMA. The industry advocated this socialist scheme of forced wealth redistribution from the start, supported it at every stage of development, and is now receiving the loot as planned. Although the industry exchanges drugs for the loot, the entire arrangement on the part of taxpayers whose money is taken by force to buy the drugs is involuntary. 
Taxpayers do not choose to fund the industry in this way; they are forced to do so-by a law that the pharmaceutical industry enthusiastically helped to create. . . .
  • Political Geography: United States, America
  • Author: Eric Daniels
  • Publication Date: 12-2009
  • Content Type: Journal Article
  • Journal: The Objective Standard
  • Institution: The Objective Standard
  • Abstract: Not yet a year into its term, the initially popular Obama administration has plummeted in popularity. In light of Washington's escalated meddling in the economy, many Americans are expressing deep concerns and anger about the statist direction in which this administration is steering the country. Unfortunately, however, few Americans are aware of-and the media is ignoring-one of the administration's most serious threats to our freedom: its stated intention to bolster antitrust enforcement. Since May, Christine Varney, the newly appointed assistant attorney general for the Justice Department's Antitrust Division, has conducted a speaking tour promoting the Division's new mandate under Obama and affirming the president's many campaign promises to "reinvigorate antitrust enforcement." Varney and her counterpart at the Federal Trade Commission, Jon Leibowitz, are publicly threatening "possible investigations" of businesses ranging from Google to Monsanto to IBM. In response to this new climate, antitrust advocates from Senator Charles Schumer to the American Booksellers Association have called on Varney to undertake new prosecutions. And New York Attorney General Andrew Cuomo recently joined the push by filing a suit against Intel.1 Americans should not only be aware of this ominous trend; they should be up in arms about it. Antitrust laws violate the rights of American businessmen and consumers, thwart economic development, and stifle our quality of life in myriad ways. To see why, we must first understand what antitrust law is. During the second half of the 19th century, as American companies grew and acquired assets around the country, they found themselves in a difficult position. Although companies could achieve economies of scale by acquiring smaller firms and unifying their efforts, state laws prevented them from doing so. 
Whereas some state legislatures imposed special taxes on out-of-state corporations doing business in their states, other legislatures forbade corporations in their state from holding the stock of companies based elsewhere. (Legislators established such restrictions in the hope that they would force successful companies to incorporate-and thus pay taxes-in their state.) In response to these restrictions on acquisitions, C. T. Dodd and John D. Rockefeller of Standard Oil created a new form of business using the device of a legal trust, which enabled them to hold the stock of dozens of companies and thus effectively manage vast productive assets.2 The operational and financial advantages of this novel corporate structure were immense, yet critics alleged that the newly created trusts were "odious monopolies," charging them with "making competition impossible," "raising prices," and "disregarding the interests of the American consumer."3 Critics condemned this new legal device as a "problem" and branded businessmen who employed it as "robber barons." Yet these businessmen used this legal device to create their vast fortunes by increasing competition, lowering prices, and providing American consumers with more and better products.4 The problem was not that their novel form of business had generated economic inefficiencies-it had done the opposite. Rather, the problem was a political one. Because these businesses were becoming fabulously successful and their owners enormously wealthy, egalitarian-minded and envious Americans pressured politicians to "do something," and politicians, seeking approval, got "tough" on the issue. A solution to the trust "problem" came in the form of the Sherman Antitrust Act of 1890. 
Senator John Sherman and his colleagues claimed that trusts were "combinations that affect injuriously the industrial liberty of the citizens of these States."5 Critics of the trusts claimed that their high profits were achieved-not through the entrepreneurial, managerial, and productive genius of men such as Rockefeller, Edison, and Carnegie-but by "the few extorting the many."6 Because of the "public outcry on the trust question" and the alleged need to protect the "interests of the consumer," Sherman and his colleagues advocated the creation of a broad law that outlawed "monopolization" and "restraint of trade." That law was the Sherman Antitrust Act, and since its passage in 1890 Congress has added five other antitrust laws to the books, prohibiting dozens of supposedly "anticompetitive" business practices.7 . . .
  • Topic: Economics, Oil
  • Political Geography: America
  • Author: John David Lewis
  • Publication Date: 12-2009
  • Content Type: Journal Article
  • Journal: The Objective Standard
  • Institution: The Objective Standard
  • Abstract: What does the bill recently passed by the U.S. House of Representatives, HR3962, short-titled the "Affordable Health Care for America Act," actually say about major health-care issues? I here pose a few commonsense questions, cite some relevant passages, and offer a few brief comments. (The bill is available at http://docs.house.gov/rules/health/111_ahcaa.pdf.)
  • Political Geography: United States
  • Author: Paul Beard
  • Publication Date: 12-2009
  • Content Type: Journal Article
  • Journal: The Objective Standard
  • Institution: The Objective Standard
  • Abstract: Imagine the following:
    1. A couple improves their oceanside backyard with tables and benches, a barbecue pit and kitchenette, an outdoor shower and restroom, and a beautiful flower garden. Thirty years later, government officials inspect the yard and declare that the amenities are ugly and out of character with the surrounding area, and that, because the amenities underscore the fact that the yard is private, they have a detrimental psychological effect on the public traversing the adjacent beach. The government orders the couple to clear out their yard and restore it to its "natural" condition, or face stiff penalties.
    2. A man purchases a vacant lot along the Pacific coast with the dream of building his residence there, but when he submits his building plans to the local government, he is told that the size and location of his structure would sully boaters' views of the coastline. The government demands that he reduce the size of his proposed home to one-quarter its planned size and place it in a geologically hazardous corner of the lot.
    3. A family that has been cramped for years in a mobile home on their 143-acre ranch wants to build a house large enough to accommodate them, but when they apply for a building permit, the government tells the family that, in exchange for permission to build, they must agree to a perpetual agricultural easement over almost all of their land-an "agreement" that would force the family and future generations to farm the property forever.
    Unfortunately, we need not imagine any of these scenarios, because they are true stories of real people suffering at the hands of a real tyrant. Even as you read, these and other such abuses are occurring in California. The tyrant in question is the California Coastal Commission, a state bureaucracy with near-limitless authority over people's property and, thus, their lives.
Although the Commission's power to dictate how property owners may use their property is limited to certain regions in California, similar commissions exist in other states, and the Commission's endeavors provide an ideal case study regarding how and why governmental bodies at all levels across America are increasingly violating property rights. Let us begin our study by looking at the history and nature of the Commission. . . .
  • Topic: Government
  • Political Geography: California
  • Author: Doug Altner
  • Publication Date: 12-2009
  • Content Type: Journal Article
  • Journal: The Objective Standard
  • Institution: The Objective Standard
  • Abstract: Over the past few years, Somali pirates have attacked numerous ships, hijacking more than forty in 2008, holding more than six hundred seafarers for ransom that same year,1 and extorting more than $150 million in ransom payments from December 2007 to November 2008.2 More troubling is that, as of September, reported pirate attacks for 2009 have already surpassed the total number reported in 2008-a strong indication that the problem of piracy is only worsening.3 Because of these attacks, shipping companies must choose between navigating dangerous waters and taking costly alternate routes in order to protect their crews and goods. In November 2008, Maersk, one of the world's largest container shipping companies, announced that, until there are more convoys to protect its ships from attacks, some of its fleet will avoid taking the most direct sea route to the East through the Suez Canal, which leads to pirate-infested waters.4 By taking the next best route from Europe to the East-around South Africa's Cape of Good Hope-shipping companies such as Maersk will add an average of 5.7 days and three thousand miles to each trip. The average annual cost of this route change to such a shipping company will range in millions of dollars for each of its ships that uses the alternate route,5 not to mention short- and long-term expenses from additional wear on its vessels. And, of course, given the integrated nature of the economy and the amount of goods shipped to and from the East, such route changes negatively affect all industries, directly or indirectly. Although the piracy threat has been well known to those in the shipping industry for a few years, it became manifest to most Americans in April 2009 when Somali pirates hijacked the Maersk Alabama and captured twenty U.S. sailors. Although the sailors soon regained control of the ship,6 four pirates took Captain Richard Phillips hostage on a lifeboat. 
The three-day standoff that ensued ended when a team of Navy SEAL snipers rescued the captain.7 Fortunately, neither the captain nor any sailors were seriously harmed during this attack-but it is disconcerting that a small gang of third-world pirates dared to attack an American ship and abduct its captain. Why were the pirates not afraid of a standoff with the most powerful navy on earth? To determine what is motivating these pirates and how the U.S. Navy should best combat their attacks, many policy analysts, historians, and defense experts are looking to the Barbary Wars-two wars the United States fought in the early 19th century to end North African piracy-for guidance. These experts are wise to look here, for the situation surrounding the Barbary pirates of the revolutionary era is similar in important respects to the situation surrounding the Somali pirates of today. Like the Somali pirates, the Barbary pirates attacked trade ships, stole goods, took prisoners, and demanded ransom from wealthy nations with strong militaries. And like the Somali pirates, the Barbary pirates got away with their thievery for some time. But unlike the Somali pirates, who continue their predations, after the Second Barbary War the Barbary pirates stopped assaulting U.S. ships-permanently. Toward establishing a policy that can bring about this same effect with regard to the Somali pirates, it is instructive to examine those aspects of late-18th- and early-19th-century U.S. foreign policy that were effective against Barbary piracy and those that were not. In particular, it is instructive to identify why the First Barbary War failed to end the pirate attacks but the second succeeded. Let us consider the key events surrounding these two wars. . . .
  • Topic: Foreign Policy, War
  • Political Geography: United States, Europe, South Africa, Somalia
  • Author: Craig Biddle
  • Publication Date: 12-2009
  • Content Type: Journal Article
  • Journal: The Objective Standard
  • Institution: The Objective Standard
  • Abstract: Author's note: This is chapter 4 of my book Loving Life: The Morality of Self-Interest and the Facts that Support It (Richmond: Glen Allen Press, 2002), which is an introduction to Ayn Rand's morality of rational egoism. Chapters 1-3 were reprinted in the prior three issues of TOS. In the book, this chapter is subtitled "Basic Human Needs." We have seen that human life is logically the standard of moral value-and that each individual's own life is logically his own ultimate value. Here we turn to the question of the human means of survival. What things do we need in order to live? What actions must we take in order to gain and keep those things? And, most importantly: What makes those actions possible? All living things have a means of survival. Plants survive by means of their automatic vegetative process known as photosynthesis. Animals survive by means of their automatic instinctive processes such as hunting, fleeing, and nest building. Human beings, however, do not survive by automatic means; our means of survival is not instinctual, but volitional. Since we have free will, we choose to live or not to live-and if we choose to live, we must also choose to discover the requirements of our life and to act accordingly. While the choice to live is up to us, the basic requirements of our life are determined by nature. In order to live, we must take a specific course of action; random action will not do. We cannot survive by eating rocks, drinking Drano, or wandering aimlessly in the desert; and we cannot achieve happiness through procrastination, promiscuity, or pot. If we want to live and enjoy life, we have to discover and act in accordance with the actual, objective requirements of our survival and happiness. What are they? . . .
  • Author: Robert Mayhew
  • Publication Date: 12-2009
  • Content Type: Journal Article
  • Journal: The Objective Standard
  • Institution: The Objective Standard
  • Abstract: Every great man nowadays has his disciples, and it is always Judas who writes the biography. . . . Formerly we used to canonise our heroes. The modern method is to vulgarise them. -Oscar Wilde, "The Critic as Artist" (1891)
  • Political Geography: America
  • Author: Gus Van Horn
  • Publication Date: 12-2009
  • Content Type: Journal Article
  • Journal: The Objective Standard
  • Institution: The Objective Standard
  • Abstract: Al Gore, the Weather Channel, "environmental activists," and many politicians claim there is a worldwide "consensus" among scientists that the earth's climate is about to change drastically due to human activity. On the basis of this alleged consensus, governments are considering coercive measures to head off catastrophe-rationing fuel, regulating carbon dioxide emissions, dictating the kinds of lightbulbs we can buy, and so on. Of course, the idea that man is responsible for impending climatic doom has its detractors. Some say the science is not settled; some say it is settled, and that human activity is not warming the planet; and some say that even if human activity were warming the planet, that might be good. But who among us truly understands the scientific arguments alleged to support the various claims? Wouldn't it be nice if a scientist wrote a book carefully documenting and explaining, in layman's terms, the cases for and against man-made global warming? Then, we could determine for ourselves which claims are supported by evidence and logic. Based on favorable publicity from conservative media and politicians, Ian Plimer's Heaven and Earth: Global Warming, the Missing Science would appear to be just such a book. Plimer claims to present an "integrated scientific understanding of the environment," and the book-chock-full of figures and graphs and containing more than 2,300 footnotes in its 504 pages-certainly makes a powerful first impression. Add to that EU President (and noted climate-change skeptic) Vaclav Klaus's statement that the work is "clear, understandable, and very useful," and a good grasp of the arguments for and against man-made global warming would seem to be just a few hundred pages away. Unfortunately, Heaven and Earth utterly fails to deliver on its promise. 
Rather than clearly presenting the anthropogenic global warming hypothesis and specifying the kind and scope of data necessary to evaluate it, Plimer presents the reader with a disorganized hash of poorly presented data; repeatedly mocks climate models without providing sufficient evidence or argument to warrant such mockery; dismisses the Intergovernmental Panel on Climate Change (IPCC) as a bogeyman "unrelated to science" (p. 20), without adequately explaining why this is so; and generally presents an incoherent argument against a straw-man version of the anthropogenic global warming hypothesis. . . .
  • Topic: Climate Change
  • Author: Daniel Wahl
  • Publication Date: 12-2009
  • Content Type: Journal Article
  • Journal: The Objective Standard
  • Institution: The Objective Standard
  • Abstract: I'm an environmental scientist, but I've never had time to review the "evidence" for the [man-made] causes of global warming. I operate on the principle that global warming is a reality and that it is human-made, because a lot of reliable sources told me that . . . Faith-based science it may be, but who has time to review all the evidence? (p. 318) That is just one of many astonishing statements by global warming alarmists that Christopher C. Horner catalogs and analyzes in Red Hot Lies: How Global Warming Alarmists Use Threats, Fraud, and Deception to Keep You Misinformed. Horner amply demonstrates that the acceptance of such "faith-based science" is the modus operandi of many activists, journalists, and politicians-who want you to accept it too. Horner exposes the tactics alarmists use to sell you on the faith and keep you ignorant of the facts surrounding alleged global warming. One such tactic is simple, premeditated deception: Consider the example of Gore's co-producer Laurie David, who followed [Gore's documentary, An Inconvenient Truth] with a book aimed at the little ones. . . . [In the documentary] one ought to have suspected that Al Gore was up to something when he ran two lines [for CO2 concentration and temperature change] across the screen . . . claimed a cause-and-effect relationship, and then forgot to superimpose them. . . . Gore was correct to insist there was a relationship between the temperature line and the CO2 concentration line, as measured over the past 650,000 years. . . . The relationship, however, was the precise opposite of what he suggested: historically, it warms first, and then CO2 concentrations go up. Gore's wording and visuals were cleverly deceptive-he implied, without stating . . . that CO2 increases are followed by temperature increases. The reason he didn't source his claim is that the literature doesn't support it. 
Somehow believing only the young would bother to open their [book] targeting children, David and [co-author Cambria] Gordon weren't so clever. Their book included the two lines, but dared superimpose them, and even stated the phony relationship more outlandishly than Gore did in his film. The reason that temperature appeared to follow CO2 proved to be because they reversed the labels in the legend (pp. 199-200). Horner shows that such deceptions are typical, not just of documentary producers and author-activists seeking to spread the faith, but also of so-called scientists and purportedly scientific organizations including, for instance, the UN's Inter-Governmental Panel on Climate Change (IPCC). Of the IPCC, Horner writes: The unsupportable advocacy from these supposed sages of science begins with their threshold dishonesty of putting forth a lurid and alarming "summary," drafted by a few dozen people who often are activists-and encouraging claims that these conclusions represent the consensus of thousands of "the world's leading scientists" from the world over (p. 294). These summaries, Horner shows, are written before scientists have submitted data. Even worse, it "appears that the IPCC intends to make the scientists . . . change their findings if they depart from the summary in order to bring them in line with it" (p. 304). The IPCC's "Lead Authors" edit out inconvenient findings, such as the following two, which, according to one scientist, had intentionally been "included at the request of participating scientists to keep the IPCC honest." . . .
  • Topic: Climate Change
  • Author: Andrew Lewis
  • Publication Date: 12-2009
  • Content Type: Journal Article
  • Journal: The Objective Standard
  • Institution: The Objective Standard
  • Abstract: Since 9/11, cultural critics and religious apologists have argued the question of Islam's militaristic nature: Is Islam inherently violent, or is it a peaceful religion corrupted by today's Islamic terrorists? Efraim Karsh does not argue that Islam is necessarily more inclined to violence than any other religion, or that today's terrorists have perverted Muhammad's message. Rather, he claims, Islamic culture has always been (and potentially always will be) associated with and spread by bloodshed and violence. This, he believes, is less because of the fundamental tenets of Islam than because of the fact that the religion's leaders and adherents have always been motivated by delusions of imperial grandeur achievable only by force. In trying to explain the motivation behind the attacks of 9/11 and the militancy of today's Islamists in general, Karsh documents Islam's history of political violence. He tells the story of Islam, from Muhammad's rise to power in the early 7th century, through the rule by caliphates of the medieval period, through the rise and fall of the Ottomans, up to today's "renewed quest," headed by terrorists such as Osama bin Laden, for a universal Arabic-Islamic empire. In Karsh's view, Islamic violence has always been driven more by political and imperial ambition than by religious fervor. Recounting key aspects of Islamic history, from Muhammad's many raids; through the persecutions, assassinations, and wars of conquest that followed; through the resurgent violence in the last century, Karsh leaves the reader with no doubt that Islam's past and present have been riddled with violence, and that its future likely will be too. Karsh explains that Islam developed on a foundational premise of an "inextricable link between religious authority and political power," established by Muhammad himself (p. 13). 
What "made Islam's imperial expansion inevitable" is that Muhammad's umma (community of believers) accepted a credo that combined a universal religion with the necessity of territorial conquest to establish political rule to enforce that faith (p. 18). Islamic Imperialism: A History is not a comprehensive or straightforward history of Islamic empires or culture. Rather it is a history of Islamic leaders' dream (Karsh's word) of achieving a global empire and the actions they have taken toward realizing that dream-a dream that, Karsh argues, can never become a reality. . . .
  • Topic: Islam
  • Author: Daniel Wahl
  • Publication Date: 12-2009
  • Content Type: Journal Article
  • Journal: The Objective Standard
  • Institution: The Objective Standard
  • Abstract: According to George Gilder, Israel's defenders have failed to make a compelling case for the country's right to exist-though not for lack of trying. Gilder cites, as one example, Alan Dershowitz, who has contributed two books offering "over thirty chapters of evidence against [anti-Israel] propaganda." Dershowitz cogently contests the proposition that Israel is a racist bastion of apartheid, a genocidal expansionist power, and a crypto-Nazi perpetrator of "massacres." He ably refutes the verdict of the relevant UN committee that Israel is "the world's primary violator of human rights" . . . [And he] even takes the trouble to answer charges of the ineffable Iranian president Mahmoud Ahmadinejad as if the ruler were moved by legal niceties and resourceful argument (pp. 20-21). But, although Gilder acknowledges that Dershowitz's arguments refute the typical charges made against Israel, he says that this defensive posture is an all-too-typical mistake. "The central error of Israel's defenders is to accept the framing of the debate by its enemies. . . . Locked in a debate over Israel's alleged vices, they miss the salient truth running through the long history of anti-Semitism: Israel is hated above all for its virtues" (pp. 21-22). For all its special features and extreme manifestations, anti-Semitism is a reflection of the hatred toward . . . capitalists that is visible . . . whenever an identifiable set of outsiders outperforms the rest of the population in an economy. This is true whether the offending excellence comes from the Kikuyu in Kenya, the Ibo and the Yoruba in Nigeria . . . [or] the over 30 million overseas Chinese [throughout] Southeast Asia (p. 36). In The Israel Test, Gilder zeros in on both the source of Israel's success and the source of hatred toward the nation, making a strong case for why the nation's continued existence should be both supported and celebrated. . .
  • Topic: Human Rights
  • Political Geography: Israel
  • Author: Craig Biddle
  • Publication Date: 09-2008
  • Content Type: Journal Article
  • Journal: The Objective Standard
  • Institution: The Objective Standard
  • Abstract: Welcome to the now-orange-for-better-visibility-on-the-newsstands Fall 2008 issue of TOS. Here is a preview of the seven articles at hand: My essay, "McBama vs. America," surveys the promises of John McCain and Barack Obama, shows that these intentions are at odds with the American ideal of individual rights, demonstrates that the cause of such political aims is a particular moral philosophy (shared by McCain and Obama), and calls for Americans to repudiate that morality and to embrace instead a morality that supports the American ideal.
  • Topic: Government
  • Political Geography: Japan, America
  • Author: Yaron Brook
  • Publication Date: 09-2008
  • Content Type: Journal Article
  • Journal: The Objective Standard
  • Institution: The Objective Standard
  • Abstract: Following the economic disasters of the 1960s and 1970s, brought on by the statist policies of the political left, America seemed to change course. Commentators called the shift the "swing to the right"-that is, toward capitalism. From about 1980 to 2000, a new attitude took hold: the idea that government should be smaller, that recessions are best dealt with through tax cuts and deregulation, that markets work pretty effectively, and that many existing government interventions are doing more harm than good. President Bill Clinton found it necessary to declare, "The era of big government is over." Today that attitude has virtually vanished from the public stage. We are now witnessing a swing back to the left-toward statism. As a wave of recent articles has proclaimed: The era of big government is back. The evidence is hard to miss. Consider our current housing and credit crisis. From day one, it was blamed on the market and a lack of oversight by regulators who were said to be "asleep at the wheel." In response to the crisis, the government, the policy analysts, the media, and the American people demanded action, and everyone understood this to mean more government, more regulation, more controls. We got our wish. First came the Fed's panicked slashing of interest rates. Then the bailout of Bear Stearns. Then the bailout of Freddie Mac. Then a $300 billion mortgage bill, which passed by a substantial margin and was signed into law by President Bush. No doubt more is to come. All of this intervention, of course, is supported by our presidential candidates. Both blame Wall Street for the current problems and vow to increase the power of the Fed's and the SEC's financial regulators. John McCain has announced that there are "some greedy people on Wall Street that perhaps need to be punished." 
Both he and Barack Obama envision an ever-growing role for government in the marketplace, each promises to raise taxes in some form or another, and both support more regulations, particularly on Wall Street. Few doubt they will keep these promises. What do Americans think of all this? A recent poll by the Wall Street Journal and NBC News found that 53 percent of Americans want the government to "do more to solve problems." Twelve years earlier, Americans said they opposed government interference by a 2-to-1 margin. In fact, our government has been "doing more" throughout this decade. While President Bush has paid lip service to freer markets, his administration has engineered a vast increase in the size and reach of government. He gave us Sarbanes-Oxley, the largest expansion of business regulation in decades. He gave us the Medicare prescription drug benefit, the largest new entitlement program in thirty years. He gave us the "No Child Left Behind Act," the largest expansion of the federal government in education since 1979. This is to say nothing of the orgy of spending over which he has presided: His 2009 budget stands at more than $3 trillion-an increase of more than $1 trillion since he took office. All of this led one conservative columnist to label Bush "a big government conservative." It was not meant as a criticism. Americans entered the 21st century enjoying the greatest prosperity in mankind's history. And many agreed that this prosperity was mainly the result of freeing markets from government intervention, not only in America, but also around the world. Yet today, virtually everyone agrees that markets have failed. Why? What happened? To identify the cause of today's swing to the left, we need first to understand the cause and consequences of the swing to the right.
  • Topic: Education, Government
  • Political Geography: America
  • Author: Alan Germani
  • Publication Date: 09-2008
  • Content Type: Journal Article
  • Journal: The Objective Standard
  • Institution: The Objective Standard
  • Abstract: Examines the moral ideas of Christopher Hitchens, Sam Harris, Daniel Dennett, and Richard Dawkins, exposes some curious truths about their ethics, and provides sound advice for theists and atheists alike who wish to discover and uphold a rational, secular morality.
  • Topic: Islam
  • Political Geography: Middle East
  • Author: Paul Hsieh
  • Publication Date: 09-2008
  • Content Type: Journal Article
  • Journal: The Objective Standard
  • Institution: The Objective Standard
  • Abstract: Identifies the theory behind the Massachusetts mandatory health insurance program, exposes the program as a fiasco, explains why the theory had to fail in practice, and sheds light on the only genuine, rights-respecting means to affordable, accessible health care for Americans.
  • Topic: Government, Health
  • Political Geography: America
  • Author: Eric Daniels
  • Publication Date: 09-2008
  • Content Type: Journal Article
  • Journal: The Objective Standard
  • Institution: The Objective Standard
  • Abstract: On June 23, 2005, the United States Supreme Court's acquiescence in a municipal government's use of eminent domain to advance "economic development" goals sent shockwaves across the country. When the Court announced its decision in Kelo v. City of New London, average homeowners realized that their houses could be condemned, seized, and handed over to other private parties. They wanted to know what had gone wrong, why the Constitution and Fifth Amendment had failed to protect their property rights. The crux of the decision, and the source of so much indignation, was the majority opinion of Justice John Paul Stevens, which contended that "economic development" was such a "traditional and long accepted function of government" that it fell under the rubric of "public use." If a municipality or state determined, through a "carefully considered" planning process, that taking land from one owner and giving it to another would lead to increased tax revenue, job growth, and the revitalization of depressed urban areas, the Court would allow it. If the government had to condemn private homes to meet "the diverse and always evolving needs of society," Stevens wrote, so be it. The reaction to the Kelo decision was swift and widespread. Surveys showed that 80 to 90 percent of Americans opposed the decision. Politicians from both parties spoke out against it. Such strange bedfellows as Rush Limbaugh and Ralph Nader were united in their opposition to the Court's ruling. Legislatures in more than forty states proposed and most then passed eminent domain "reforms." In the 2006 elections, nearly one dozen states considered anti-Kelo ballot initiatives, and ten such measures passed. On the one-year anniversary of the decision, President Bush issued an executive order that barred federal agencies from using eminent domain to take property for economic development purposes (even though the primary use of eminent domain is by state and local agencies). 
The "backlash" against the Court's Kelo decision continues today by way of reform efforts in California and other states. Public outcry notwithstanding, the Kelo decision did not represent a substantial worsening of the state of property rights in America. Rather, the Kelo decision reaffirmed decades of precedent-precedent unfortunately rooted in the origins of the American system. Nor is eminent domain the only threat to property rights in America. Even if the federal and state governments abolished eminent domain tomorrow, property rights would still be insecure, because the cause of the problem is more fundamental than law or politics. In order to identify the fundamental cause of the property rights crisis, we must observe how the American legal and political system has treated property rights over the course of the past two centuries and take note of the ideas offered in support of their rulings and regulations. In so doing, we will see that the assault on property rights in America is the result of a long chain of historical precedent moored in widespread acceptance of a particular moral philosophy. Property, Principle, and Precedent: In the Revolutionary era, America's Founding Fathers argued that respect for property rights formed the very foundation of good government. For instance, Arthur Lee, a Virginia delegate to the Continental Congress, wrote that "the right of property is the guardian of every other right, and to deprive a people of this, is in fact to deprive them of their liberty." In a 1792 essay on property published in the National Gazette, James Madison expressed the importance of property to the founding generation. "Government is instituted to protect property of every sort," he explained, "this being the end of government, that alone is a just government, which impartially secures to every man, whatever is his own." 
Despite this prevalent attitude-along with the strong protections for property contained in the United States Constitution's contracts clause, ex post facto clause, and the prohibition of state interference with currency-the founders accepted the idea that the power of eminent domain, the power to forcibly wrest property from private individuals, was a legitimate power of sovereignty resting in all governments. Although the founders held that the "despotic power" of eminent domain should be limited to taking property for "public use," and that the victims of such takings were due "just compensation," their acceptance of its legitimacy was the tip of a wedge. The principle that property rights are inalienable had been violated. If the government can properly take property for "public use," then property rights are not absolute, and the extent to which they can be violated depends on the meaning ascribed to "public use." From the earliest adjudication of eminent domain cases, it became clear that the term "public use" would cause problems. Although the founders intended eminent domain to be used only for public projects such as roads, 19th-century legislatures began using it to transfer property to private parties, such as mill and dam owners or canal and railroad companies, on the grounds that they were open to public use and provided wide public benefits. Add to this the fact that, during the New Deal, the Supreme Court explicitly endorsed the idea that property issues were to be determined not by reference to the principle of individual rights but by legislative majorities, and you have the foundation for all that followed. . . .
  • Topic: Development, Economics
  • Political Geography: United States, America, London
  • Author: Tara Smith
  • Publication Date: 09-2008
  • Content Type: Journal Article
  • Journal: The Objective Standard
  • Institution: The Objective Standard
  • Abstract: Author's note: This essay is based on a lecture delivered at the Objectivist Conference (OCON) held in Newport Beach, CA, July 2008, and retains some of the informal character of an oral presentation. While people commonly disagree about competing world views and substantive ideologies-arguing the merits of different religious creeds or value systems, for instance, of environmentalism or dominant business practices, of volunteerism or the specifics of political platforms-many are blind to the fact that nearly all these ideologies are fueled by a single, more basic philosophy: pragmatism. As people increasingly complain that political candidates are "all the same," in fact, many of the ideas and approaches supported by these candidates do reflect a shared method. It is important to understand this common element not simply because of the breadth of its influence, but because of its destructiveness. While pragmatism presents itself as a tool of reason and enjoys the image of mature moderation, of common sense and practical "realism," in truth, it is anything but realistic or practical. Pragmatism has become a highly corrosive force in people's thinking. And insofar as it is thinking that drives actions-the actions of individuals and correlatively, the course of history-as long as a person or a nation is infected by a warped philosophical approach, genuine progress will be impossible. In this essay, I seek to demonstrate the stealth but all too live menace that pragmatism poses. Pragmatism is not a substantive set of doctrines so much as a way of thinking, a unifying approach that helps to sustain an array of doctrines that are, in their content, irrational. Because it is a method, however, and informs the way that a practitioner tackles any issue, it proves much more difficult to unroot than an erroneous conclusion. 
Moreover, thanks to its positive image, pragmatism tends to give harmful ideas a good name, bestowing them with the misplaced aura of reason. It thereby makes people who wish to be rational all the more susceptible to those ideas. I will begin by clarifying exactly what pragmatism is and proceed to supply evidence of its prevalence. I will then consider the distinctive appeal of pragmatism, as well as the heart of its error-where it goes wrong. Next, I will explain its destructive impact, the principal means by which pragmatism is, indeed, corrosive. Finally, I will offer some thoughts concerning means of combating its influence. What Pragmatism Is: As a formal school of philosophy, pragmatism was founded by C. S. Peirce (1839-1914) in the late 19th century. Its more renowned early advocates included William James (1842-1910) and John Dewey (1859-1952). Primarily, pragmatism is a way of tackling philosophical questions. This, according to its founders, is what made pragmatism different from all previous philosophy. James wrote that pragmatism does not stand for any results or specific substantive doctrines; rather, it is distinguished by its method of "clarifying ideas" in practical terms by tracing the practical consequences of accepting one idea or another. The meaning and the truth of any claim depend entirely on its practical effects. The mind, accordingly, should not be thought of as a mirror held up to the external world, but as a tool whose role is not to discover, but to do, to act. What, then, should we make of the concept of truth?-or the concept of reality? Don't we need to respect those, in order to achieve practical consequences? Well, of course truth exists, says James, but truth is not a stagnant property. Rather, an idea becomes true-"truth happens to an idea." Truth "lives on a credit system" in his view; what a truth has going for it is that people treat it in a certain way. The true is the "expedient," "any idea upon which we can ride." 
Any idea is true so long as it is "profitable." All truths do have something in common, then, namely, "that they pay." The question to ask of any proposed idea is: What is its "cash value in experiential terms?" The traditional notion of purely objective truth, however, is "nowhere to be found." The world we live in is "malleable, waiting to receive its final touches at our hands." As Peirce memorably put it, "there is absolutely no difference between a hard thing and a soft thing so long as they are not brought to the test." In the view of a much more recent and influential pragmatist, Richard Rorty, truth is "what your contemporaries let you get away with." To call a statement true is essentially to give it a rhetorical pat on the back. In short, for the pragmatists, we find no ready-made reality. Instead, we create reality. Correlatively, there are no absolutes-no facts, no fixed laws of logic, no certainty. . . .
  • Author: Stella Daily
  • Publication Date: 09-2008
  • Content Type: Journal Article
  • Journal: The Objective Standard
  • Institution: The Objective Standard
  • Abstract: This article is dedicated to Anna Tomalis, a young girl who died of liver cancer on August 15, 2008. Anna's parents desperately sought experimental treatment that might have saved her life, but were delayed for months by FDA bureaucracy. Anna finally received approval to obtain treatment through a clinical trial in July, but died after receiving just one round of treatment. She was thirteen years old. Abigail Burroughs was not the typical cancer patient: She was just nineteen years old when she was diagnosed with squamous cell cancer that had spread to her neck and lungs. Her prognosis was poor, but a then-experimental drug, Erbitux, offered the hope of saving her life. Abigail was denied that hope by the Food and Drug Administration. Because the drug was considered experimental, she could receive it only as part of a clinical trial-and Abigail was ineligible to participate in any trials at the time. Despite the best efforts of her family, friends, and doctor, Abigail was unable to receive the treatment that might have saved her life. At twenty-one years old, Abigail died of her disease. Abigail's father, Frank Burroughs, thought other patients with life-threatening illnesses should not be denied the ability to try any treatment that might give them a chance. In his daughter's name, he formed the Abigail Alliance for Better Access to Developmental Drugs, which sued the FDA in 2003. The group argued that the FDA's restrictions on access to experimental treatments constitute a violation of the right to self-defense as well as of the Fifth Amendment right not to be deprived of life, liberty, or property without due process of law. In August 2007, the Appeals Court of the District of Columbia struck a blow against the Abigail Alliance, and against individual rights, when it ruled that patients, even the terminally ill, do not have the right to receive treatment that has not been approved by the FDA. 
Erbitux has since been approved by the FDA to treat cancer of the head and neck-too late, of course, for Abigail Burroughs. How has America come to a point where the government denies dying patients the right to try to save their own lives? To answer that question, let us begin with a brief history of the Food and Drug Administration.

A Brief History of the FDA

Prior to the 20th century, the government did not regulate pharmaceutical products in the United States. Although Congress had considered federal regulations on food and drug safety as early as 1879, it had refrained from passing any legislation in this regard. However, with the muckraking journalism of the early 1900s, and especially with the publication of Upton Sinclair's novel The Jungle, which portrayed unsavory practices in the meatpacking industry, the American public clamored for laws to ensure the safe production of food and drugs. This public outcry pushed Congress to pass federal legislation in 1906. As the resulting Food and Drugs Act applied to drugs specifically, products were required to be sold only at certain levels of purity, strength, and quality; and ingredients considered dangerous (such as morphine or alcohol) had to be listed on the product's label. Violators would be subject to seizure of goods, fines, or imprisonment. Thus, in order to enforce the Act, the Food and Drug Administration was born. In its early years, the agency focused primarily on food rather than on pharmaceuticals, but in 1937 it increased its focus on drugs after a new formulation of sulfanilamide, a drug that had previously been successfully used to treat certain bacterial infections, proved to be deadly. The drug's manufacturer, S. E. Massengill Company, had dissolved an effective drug in a toxic solvent. More than one hundred people, babies and children among them, died as a result of taking Massengill's product, known as Elixir Sulfanilamide. 
Under the 1906 Food and Drugs Act, the FDA was not authorized to prosecute Massengill for selling an unsafe drug, and the agency had the power to recall Elixir Sulfanilamide only via a technicality. Because "elixir" was defined as a drug dissolved in alcohol, and because Massengill's formulation used the nonalcoholic solvent ethylene glycol, the product was technically mislabeled, bringing it under FDA jurisdiction and enabling the agency to recall the product. The public and legislators wanted more: They wanted the FDA not only to recall mislabeled products, but to prevent the sale of unsafe drugs in the first place. Thus, popular demand gave rise to the Food, Drug, and Cosmetics Act of 1938, which greatly expanded the FDA's authority. The most important change brought about by this Act was a shift in the burden of proof. Rather than prosecuting a drugmaker after the fact for having fraudulently marketed a product, the FDA would now require proof of safety before a drug could be marketed at all. (Note that this required manufacturers to prove a negative-i.e., that a given drug would not harm consumers.) After World War II, pharmaceutical companies came under still more scrutiny. Then, as now, complaints about the cost of drugs reached Congress, and in 1961 Senator Estes Kefauver led the charge in an investigation not only of drug pricing, but of the relationship between the drug industry and the FDA. Kefauver sought to pass legislation that would increase the agency's authority over drug production, distribution, and advertising. Whereas previously proof of safety alone was required to gain FDA approval, the proposed law would require drug manufacturers also to prove the efficacy of their products. 
Kefauver's bill might have languished in congressional debate but for the emergence at that time of data showing that thalidomide, which was then sold as a sleep aid and antinausea medication for pregnant women, caused severe birth defects in the children of women who took it. Thalidomide had not yet been approved for use in the United States at that time due to concerns of an FDA reviewer over a different side effect noted in the drug's application for approval. The drug was widely used in other countries, however, and the babies of many women who used it were born with grotesquely deformed limbs. As their harrowing images flooded the media, Americans realized they had narrowly escaped inflicting these deformities on their own children. The resulting public outcry led to Kefauver's bill being made law in 1962. This law served as the cornerstone for the wide powers that the FDA acquired thereafter, from requiring companies to include warnings in drug advertisements to dictating the way companies must investigate their own experimental compounds. Thus, although the scope and power of the FDA were modest at the agency's inception, its scope widened and its power increased markedly in the decades that followed. Now, a century later, the agency's purview includes foods and drugs for humans and animals, cosmetics, medical devices (including everything from breast implants to powered wheelchairs), blood and tissues, vaccines, and any products deemed to be radiation emitters (including cell phones and lasers). And the agency's power is nothing short of enormous. . . .
  • Topic: Health
  • Political Geography: America
  • Author: Elan Journo
  • Publication Date: 09-2008
  • Content Type: Journal Article
  • Journal: The Objective Standard
  • Institution: The Objective Standard
  • Abstract: The measure of success in the Iraq war has undergone a curious progression. Early on, the Bush administration held up the vision of a peaceful, prosperous, pro-Western Iraq as its benchmark. But the torture chambers of Saddam Hussein were replaced by the horrors of a sadistic sectarian war and a fierce insurgency that consumed thousands of American lives. And the post-invasion Iraqi regime, it turns out, is led by Islamist parties allied with religious militias and intimately tied to the belligerent Iranian regime. The benchmark, if we can call it that, then shrank to the somewhat lesser vision of an Iraqi government that can stand up on its own, so that America can stand down. But that did not materialize, either. So we heard that if only the fractious Sunni and Shiite factions in the Iraqi government could have breathing space to reconcile their differences, and if only we could do more to blunt the force of the insurgency, that would be progress. To that end, in early 2007, the administration ordered a "surge" of tens of thousands more American forces to rein in the chaos in Iraq. Today, we hear John McCain and legions of conservatives braying that we are, in fact, winning (some go so far as to say we have already won). Why? Because the "surge" has reduced the number of attacks on U.S. troops to the levels seen a few years ago (when the insurgency was raging wildly) and the number of Iraqis slaughtering their fellow countrymen has taken a momentary dip. Victory, apparently, requires only clearing out insurgents (for a while) from their perches in some neighborhoods, even though Teheran's influence in the country grows and Islamists carve out Taliban-like fiefdoms in Iraq. The goals in Iraq "have visibly been getting smaller," observes John Agresto, a once keen but now disillusioned supporter of the campaign (p. 172). Iraq, he argues contra his fellow conservatives, has been a fiasco. 
"If we call it 'success,' it's only because we've lowered the benchmark to near zero" (p. 191). . . .
  • Topic: War
  • Political Geography: Iraq, America
  • Author: Eric Daniels
  • Publication Date: 09-2008
  • Content Type: Journal Article
  • Journal: The Objective Standard
  • Institution: The Objective Standard
  • Abstract: One of the distinguishing features of American life is the large degree of freedom we have in making choices about our lives. When choosing our diets, we have the freedom to choose everything from subsisting exclusively on junk food to consuming meticulously planned portions of fat, protein, and carbohydrate. When choosing how to conduct ourselves financially, we have the freedom to choose everything from a highly leveraged lifestyle of debt to a modest save-for-a-rainy-day approach. In every area of life, from health care to education to personal relationships, we are free to make countless decisions that affect our long-term happiness and prosperity-or lack thereof. According to Richard Thaler and Cass Sunstein, professors at the University of Chicago and authors of Nudge: Improving Decisions About Health, Wealth, and Happiness, this freedom and range of options is problematic. The problem, they say, is that most people, when given the opportunity, make bad choices; although Americans naturally want to do what is best for themselves, human fallibility often prevents them from knowing just what that is. "Most of us are busy, our lives are complicated, and we can't spend all our time thinking and analyzing everything" (p. 22). Average Americans, say Thaler and Sunstein, tend to favor the status quo, fall victim to temptation, use mental shortcuts, lack self-control, and follow the herd; as a result, they eat too much junk food, save too little, make bad investments, and buy faddish but useless products. Many Americans, according to the authors, are more like Homer Simpson (impulsive and easily fooled) than homo economicus (cool, calculating, and rational). "One of our major goals in this book," they note, "is to see how the world might be made easier, or safer, for the Homers among us" (p. 22). The particular areas where these Homers need the most help are those in which choices "have delayed effects . . . 
[are] difficult, infrequent, and offer poor feedback, and those for which the relation between choice and experience is ambiguous" (pp. 77-78). The central theme of Nudge is the idea that government and the private sector can improve people's choices by manipulating the "choice architecture" they face. As Thaler and Sunstein explain, people's choices are often shaped by the way in which alternatives are presented. If a doctor explains to his patient that a proposed medical procedure results in success in 90 percent of cases, that patient will often make a different decision from the one he would have made if the doctor had told him that one in ten patients dies from the procedure. Free markets, the authors argue, too often cater to and exploit people's tendencies to make less than rational choices. Faced with choices about extended warranties or health care plans or investing in one's education, only the most exceptional and rational people will make the "correct" choices. Most people, the authors argue, cannot avoid the common foibles of bad thinking; thus we ought to adopt a better way of framing and structuring choices so that people will be more likely to make better decisions and thereby do better for themselves. Hence the title: By presenting information in a specific way, "choice architects" can "nudge" the chooser in the "right" direction, even while maintaining his "freedom of choice."
  • Topic: Health
  • Political Geography: America, Chicago
  • Author: Joe Kroeger
  • Publication Date: 09-2008
  • Content Type: Journal Article
  • Journal: The Objective Standard
  • Institution: The Objective Standard
  • Abstract: In the years since the attacks of 9/11, there have been numerous attempts by terrorists to attack Americans on our own soil, but all of these attempts have been foiled. Who is responsible for this remarkable record, and how have they achieved it? These questions are answered in Ronald Kessler's recent book, The Terrorist Watch: Inside the Desperate Race to Stop the Next Attack, which surveys the work of the individuals involved in America's intelligence community since 9/11. In twenty-seven brief chapters, Kessler documents the post-9/11 work of the CIA, FBI, National Security Agency (NSA), National Geospatial-Intelligence Agency (NGA), National Counterterrorism Center (NCTC), and other agencies-showing the organizational, tactical, and technological changes that have occurred, along with their positive results. The book begins by recounting the events of September 11, 2001, from President Bush being informed of the first plane crashing into the World Trade Center, to his "We're at war" declaration, to the initial coordination of efforts among the vice president, the military, and law enforcement and intelligence agencies. Proceeding from there, Kessler shows how the CIA immediately linked some of the hijackers to Al Qaeda and how, a few days later, the president began redirecting the priorities of the FBI and the Justice Department from prosecuting terrorists to preventing attacks. . . .
  • Political Geography: America
  • Author: Eric Daniels
  • Publication Date: 09-2008
  • Content Type: Journal Article
  • Journal: The Objective Standard
  • Institution: The Objective Standard
  • Abstract: According to Joel Waldfogel, a professor of business and public policy at the Wharton School of Business, "a dominant strand of current thinking" regards markets as superior to government when it comes to providing consumers with what they want. When government undertakes the provision of goods, the products offered are limited to those that meet with the approval of the majority, whereas "[m]arkets are thought to avoid the tyranny of the majority because in markets each person can decide what she wants." According to this dominant argument, he writes, "what's available to me in markets depends only on my preferences, not on anyone else's" (p. 2). In his recent book, The Tyranny of the Market, Waldfogel challenges this assumption. When one considers what actually happens in free markets, when one considers the products available therein, says Waldfogel, "it's clear that you can be better off in your capacity as a consumer of a particular product as more consumers share your preferences" (p. 4). In other words, you are more likely to get exactly what you want if your tastes are shared by the majority and less likely to get exactly what you want if your tastes differ from the majority. Thus, Waldfogel contends, when it comes to providing the goods that people want, "the market does not generally avoid the tyranny of the majority" any more than does a democratic political system that allocates goods (p. 6). Waldfogel's goal is to examine "how markets actually work" in order to allow policy makers and citizens to balance the shortcomings of markets against the shortcomings of government and "to determine an appropriate mix in each arena" (p. 36). Toward this end, he leads the reader through a series of examples in which there appears to be a breakdown in the market provision of goods. . . .
  • Topic: Government
  • Author: John David Lewis
  • Publication Date: 09-2008
  • Content Type: Journal Article
  • Journal: The Objective Standard
  • Institution: The Objective Standard
  • Abstract: During World War II, the prime source of information for Americans about the war overseas was the dispatches of foreign correspondents-men who put their lives on the line in war zones to report the truth. George Weller was a giant among such men. Captured by the Nazis and traded for a German journalist, Weller watched the Belgian Congolese Army attack Italians in Ethiopia, saw the invasion of Crete, interviewed Charles de Gaulle in South Africa following an escape through Lisbon, and overcame malaria to report on the war in the Pacific. He was the first foreign correspondent trained as a paratrooper, and he won a Pulitzer Prize for his report of an appendectomy on a submarine. He wrote the book Singapore is Silent in 1942 after seeing the city fall to the Japanese, and he advocated a global system of United States bases in his 1943 book Bases Overseas. After witnessing Japan's surrender on September 2, 1945, he broke General Douglas MacArthur's order against travel to Nagasaki by impersonating an American colonel and taking a train to the bombed-out city. In a period of six weeks, he sent typewritten dispatches totaling some fifty thousand words back to American newspapers through official channels of the military occupation. Under MacArthur's directives, they were censored and never made it into print. Weller died in 2002 thinking his dispatches had been lost. Months later his son, Anthony Weller, found a crate of moldy papers with the only surviving carbon copies. Anthony Weller edited the dispatches and included his own essay about his father, resulting in this priceless addition to our information about World War II in the Pacific, and the birth of the atomic age. The importance of the dispatches, however, extends far beyond the value of the information from Nagasaki. 
George Weller is a voice from a past generation, and the publication of his censored dispatches raises a series of deeply important issues and, in the process, reveals an immense cultural divide between his world and ours today. On September 8, 1945, two days after he arrived in Nagasaki, Weller wrote his third dispatch concerning Nagasaki itself. He described wounded Japanese in two of Nagasaki's undestroyed hospitals, and recorded the question posed by his official guide:

Showing them to you, as the first American outsider to reach Nagasaki since the surrender, your propaganda-conscious official guide looks meaningfully in your face and wants to know: "What do you think?" What this question means is: Do you intend writing that America did something inhuman in loosing this weapon against Japan? That is what we want you to write (p. 37).

What would many reporters today write if asked this question by bombed enemy civilians? . . .
  • Topic: War
  • Political Geography: Japan, America, Germany, Nagasaki
  • Author: Craig Biddle
  • Publication Date: 09-2008
  • Content Type: Journal Article
  • Journal: The Objective Standard
  • Institution: The Objective Standard
  • Abstract: Surveys the promises of John McCain and Barack Obama, shows that these intentions are at odds with the American ideal of individual rights, demonstrates that the cause of such political aims is a particular moral philosophy (shared by McCain and Obama), and calls for Americans to repudiate that morality and to embrace instead a morality that supports the American ideal.
  • Political Geography: America, Europe
  • Author: Craig Biddle
  • Publication Date: 12-2008
  • Content Type: Journal Article
  • Journal: The Objective Standard
  • Institution: The Objective Standard
  • Abstract: No abstract is available.
  • Topic: Economics
  • Political Geography: America
  • Publication Date: 12-2008
  • Content Type: Journal Article
  • Journal: The Objective Standard
  • Institution: The Objective Standard
  • Abstract: No abstract is available.
  • Political Geography: New York, California
  • Author: Craig Biddle
  • Publication Date: 12-2008
  • Content Type: Journal Article
  • Journal: The Objective Standard
  • Institution: The Objective Standard
  • Abstract: Concretizes the selfishness-enabling nature of capitalism and shows why this feature makes it the only moral social system on earth.
  • Topic: Economics
  • Political Geography: America
  • Author: John David Lewis
  • Publication Date: 12-2008
  • Content Type: Journal Article
  • Journal: The Objective Standard
  • Institution: The Objective Standard
  • Abstract: Analyzes the resounding Republican defeat and shows that the party faces a fundamental decision that will determine whether it orchestrates a comeback or stumbles into further defeat.
  • Topic: Education, Government
  • Political Geography: Taliban
  • Author: Raymond C. Niles
  • Publication Date: 12-2008
  • Content Type: Journal Article
  • Journal: The Objective Standard
  • Institution: The Objective Standard
  • Abstract: The Internet is an achievement of historic importance, arguably rivaling or exceeding the invention of the printing press in its capacity to spread human knowledge and entertainment to the farthest corners of the globe. With the introduction of his printing press in 1450, Gutenberg took the books from the hands of cloistered monks and put them into the hands of those who would challenge the orthodoxy of the Church-and into the hands of those who would build the free society that has produced the industrial and technological marvels we enjoy today. In the same manner, the Internet takes encyclopedic knowledge from the libraries and puts it into the homes of people all over the earth. It delivers images of artworks from the Louvre and the Metropolitan Museum of Art to our homes. It makes the wares of locally owned boutiques available to a world of customers. It facilitates discussions between distant scholars and enthusiasts on every possible subject. And, as did the printing press, it can lead to great cultural and political change, by spreading truths that censored media around the world cannot speak. The Internet promotes the open exchange of ideas and information in an unprecedented way. What makes this open exchange of ideas and information possible? According to some, the answer is something called "net neutrality." "Net Neutrality is the reason why the Internet has driven economic innovation, democratic participation, and free speech online," claims one website. And, say its advocates, net neutrality-and thus the Internet itself-is in grave danger: The big phone and cable companies are spending hundreds of millions of dollars lobbying Congress and the Federal Communications Commission to gut Net Neutrality, putting the future of the Internet at risk. . . . The consequences of a world without Net Neutrality would be devastating. Innovation would be stifled, competition limited, and access to information restricted. 
Consumer choice and the free market would be sacrificed to the interests of a few corporate executives. Such claims naturally catch the attention of people who value innovation, competition, and information. And anything that threatens to thwart the free market is certainly cause for alarm. But what exactly is net neutrality? Does it really protect these crucial values? If so, how? And if not, might it actually assault them? To answer these questions, we must first specify the exact nature of the Internet. . . .
  • Author: Gus Van Horn
  • Publication Date: 12-2008
  • Content Type: Journal Article
  • Journal: The Objective Standard
  • Institution: The Objective Standard
  • Abstract: Not long ago, Alan Greenspan was widely regarded as a sort of gnome of Zurich, on whose unique, ineffable powers our prosperity depended. His famously cryptic mumblings could spook markets and spur investors, many of whom believed that his every word carried great weight. One news channel would cite the thickness of his briefcase before certain meetings as an "economic indicator." The "Maestro," as one biographer called him, even appeared on the cover of Time as part of a three-man "Committee to Save the World." And, most remarkably, Greenspan's reputation as a brilliant economist and potential savior of the world was part and parcel of his reputation as a capitalist. Indeed, he had studied under the great "radical for capitalism" Ayn Rand and had written cogent essays in defense of individual rights and free markets. But then, on October 23, 2008, media outlets around the country dropped a bombshell: Alan Greenspan, "lifelong champion of free markets," had declared capitalism dead. The financial crisis, it seems, shook Greenspan to his core and led him to conclude that free markets do not work. As the San Francisco Chronicle reported: Asked by committee Chairman Henry Waxman [D-CA] whether his free-market convictions pushed him to make wrong decisions, especially his failure to rein in unsafe mortgage lending practices, Greenspan replied that indeed he had found a flaw in his ideology, one that left him very distressed. "In other words, you found that your view of the world, your ideology was not right?" Waxman asked. "Absolutely, precisely," replied Greenspan, who stepped down as Fed chief in 2006 after more than eighteen years as chairman. "That's precisely the reason I was shocked, because I have been going for 40 years or more with very considerable evidence it was working exceptionally well." 
But the idea that Greenspan possessed "free-market convictions" and that those convictions are why he failed to rein in unsound lending practices is ridiculous. The very purpose of the Federal Reserve-the central bank at the heart of our troubled, government-controlled economy and the money machine that Greenspan operated for almost twenty years-is to manipulate the market. Such a "bank" would not even exist in a free market, and its precise function in our mixed economy is to engage in unsound lending practices as a means of such manipulation. As explained in The Federal Reserve System: Purposes and Functions, which is available from the Federal Reserve's website, one of the primary functions of the Fed is "conducting the nation's monetary policy by influencing the monetary and credit conditions in the economy." The document elaborates, explaining that the Fed "influences" the rate of inflation by setting the interest rate (known as the "federal funds rate") at which private banks can borrow from the various Federal Reserve banks. When the Fed lowers this rate, it thereby expands credit and increases the supply of fiat money-money that is unmoored to any commodity; money that is just printed paper representing no real value in the marketplace; money that is, essentially, worthless. This constitutes inflation and wreaks havoc on the economy. Once upon a time, Greenspan openly acknowledged the destructive nature of fiat money. "The law of supply and demand is not to be conned," he wrote in his famous 1966 essay "Gold and Economic Freedom": As the supply of money (of claims) increases relative to the supply of tangible assets in the economy, prices must eventually rise. Thus the earnings saved by the productive members of the society lose value in terms of goods. 
When the economy's books are finally balanced, one finds that this loss in value represents the goods purchased by the government for welfare or other purposes with the money proceeds of the government bonds financed by bank credit expansion. Again, "the earnings saved by the productive members of the society lose value . . . represent[ing] the goods purchased by the government for welfare or other purposes." In other words, Greenspan acknowledged in 1966 that one of the primary functions of the Fed is to violate property rights-yours and mine-by printing fiat money and thereby coercively decreasing the value of our hard-earned savings. The Federal Reserve-in all its anti-capitalistic glory-is by its very nature the primary generator of unsound banking. Greenspan knew this in 1966, when he wrote that article; he knew it in 1987, when he accepted his post as chairman of the Fed; he knew it during the eighteen years he manipulated the money supply; and he knows it today. . . .
  • Topic: Economics
  • Author: Brian P. Simpson
  • Publication Date: 12-2008
  • Content Type: Journal Article
  • Journal: The Objective Standard
  • Institution: The Objective Standard
  • Abstract: "We've got to go after the oil companies," says President-elect Barack Obama in response to high oil and gasoline prices. "We've got to go after [their] windfall profits." Explaining the purpose of recently proposed energy legislation, Senate Majority Leader Harry Reid says: "We are forcing oil companies to change their ways. We will hold them accountable for unconscionable price-gouging and force them to invest in renewable energy or pay a price for refusing to do so." Calling for government seizure of private power plants, California Senate Leader John Burton insists: "We have to do something. These people have got us by the throat. They're making more money than God, and we've got to fight back-not with words, but with actions." This attitude toward energy producers, which is practically unanimous among American politicians today, is wreaking havoc not only on the lives and rights of these producers, but on the lives and rights of Americans in general. It leads to laws and regulations that prohibit producers and consumers from acting on their rational judgment with respect to energy. It causes energy shortages, brownouts, and blackouts that thwart everyone's ability to be productive and enjoy life. And it results in higher prices not only for energy, but for every good and service that depends on energy-which means every good and service in the marketplace, from food to transportation to medical care to sporting events to education to housing. Energy producers, like all rational businessmen, are in business to make money. Profits are what motivate them to exert the requisite brain power, to engage in the necessary research, and to invest the massive amounts of money required to produce and deliver the energy we need to light, heat, and cool our homes, and to power the factories, workplaces, and tools required to produce the goods on which our lives depend. Their profit motive is to our benefit. 
Moreover, energy producers, like all human beings, have a moral right to act according to their own judgment so long as they do not violate the rights of others. They have a moral right to use and dispose of the product of their effort as they see fit. They have a moral right to contract with customers by mutual consent to mutual benefit. In other words, they have a moral right to life, liberty, property, and the pursuit of happiness. And it is only by respecting these rights that we can expect energy producers to produce energy. So let us examine the assault on these producers, count the ways in which this assault is both impractical and immoral, and specify what must be done to rectify this injustice. . . .
  • Topic: Government
  • Political Geography: America, California
  • Author: Gena Gorlin
  • Publication Date: 12-2008
  • Content Type: Journal Article
  • Journal: The Objective Standard
  • Institution: The Objective Standard
  • Abstract: Einstein credited Isaac Newton, the father of physics and arguably the founder of scientific certainty, with "the greatest advance in thought that a single individual was ever privileged to make." The compliment is not hyperbole: In his Principia and the discoveries that preceded it, Newton single-handedly deciphered more of the universe's enigmas than perhaps any other scientist in history. He revolutionized mathematics, integrated the previously disparate fields of mechanics and astronomy, and thus opened the door to the science of force and motion as we know it. And yet, ironically, Newton himself remains an enigma to those biographers who attempt to identify the force that moved him-the motive that compelled him to strive for the discovery and validation of scientific truths on such a grand scale. Most biographers shy from examining Newton's motives, asserting that a genius and creative power of Newton's magnitude defies mundane human explanation. One scholar, however, accepted the charge. Frank Manuel, in his biography, A Portrait of Isaac Newton, attempted to diagnose the source of Newton's genius by means of what is, in effect, a retroactive psychoanalysis. In keeping with the neo-Freudian school of psychology, Manuel attempts to demystify Newton's thought and behavior by speculating about repressed insecurities and unconscious defense mechanisms that may have commanded Newton's psyche. Since the book's publication in 1968 and to this very day, biographers and Newton scholars defer to Manuel-with varying degrees of enthusiasm-on the question of what, in Newton's character and soul, could have spawned his inexhaustible passion for discovering the nature of things. Manuel's model of Newton's underlying motives stands, by default, as the definitive account of the psyche whence sprang the Principia. 
For example, James Gleick, in his recent biography Isaac Newton, quotes-without critique or comment-Manuel's interpretation of Newton's relationship with his niece Catherine, whom he raised and nurtured into adulthood: "'In the act of fornication between his friend Halifax and his niece was Newton vicariously having carnal intercourse with his mother?'" Such rhetorical suggestions offered in explanation of Newton's words and actions abound in Manuel. And because other Newton scholars have defaulted on the task of evaluating Newton's motives, such "suggestions" have stood unchallenged and unrefuted to this day-coloring the legacy and tainting the name of one of history's greatest scientists. It is easy to sympathize with biographers who struggle in vain to knit together the apparently disparate threads of Newton's psychological life. By common accounts, Newton was a man of perplexing contradiction-described alternately as an arrogant, self-obsessed egomaniac and then as a neurotic "recluse" crippled by "searing" self-doubt. On one hand, he is the man who proclaimed his own theory of light to be "the oddest if not the most considerable detection which has hitherto been made in the operations of nature"-which Manuel interprets as an instance of Newton's "fantasies of omnipotence and omniscience and his self-image as the perfect one." On the other hand, Newton himself, denying a friend's compliments of his preternatural genius, mused that his success was the outcome of "nothing but industry and patient thought." And in the twilight of his years Newton reflected on himself as "only like a boy playing on the sea-shore, and diverting myself in now and then finding a smoother pebble or a prettier shell than ordinary, whilst the great ocean of truth lay all undiscovered before me." 
These are not the words of a "narcissist" or an "egomaniac" who is "completely wrapped up in himself," as is commonly postulated-but of a man wrapped up in the quest for truth, and humbled by the vastness of its terrain. Indeed, Newton's close friend John Locke described him as "a nice man to deal with and a little too apt to raise in himself suspicions where there was no ground." This seeming contradiction between Newton's supreme "arrogance" on the one hand and his apparent self-doubt on the other is what led Manuel to postulate that Newton was driven by deep-seated unconscious insecurities. But is there really any contradiction? In truth, when Newton was certain of his conclusions, he was unshakably sure of himself-and did not suffer criticism that he considered misguided, let alone dishonest. Yet when he was uncertain, he did not rest content until he copiously checked his conclusions against the facts. For instance, in 1666, he tested his theory of an inverse-square gravitational force, likely formulated in the early 1660s by deduction from Kepler's elliptical theory, against observations of the moon's motions. He found that the motions agreed "pretty nearly," but not nearly enough-and because he could not explain the discrepancy, he set aside the theory until he could amass further evidence. Whereas his fellow scientists, from Ptolemy to Copernicus to Galileo, sought nothing further than approximate agreement between their theories and the empirical phenomena, Newton was "extreme" in his demands for accuracy. In Manuel's account, such polar extremities of behavior bear the elusive signature of a "neurotic." For Manuel, Newton's apparent "swings" of extreme self-assurance and extreme self-doubt are symptoms of a deep neurosis rooted in his childhood. In true Freudian fashion, Manuel points to Newton's abandonment by his mother when she remarried and sent him to live with his grandmother. 
He also points to Newton's puritanical education, which allegedly inculcated in him a powerful fear of wrongdoing. These traumatic experiences, according to Manuel's account, begat a combined longing for attachment and fear of punishment that molded his scientific thought. Manuel even goes so far as to speculate that Newton's discovery of the gravitational force was inspired by his childhood anxieties: Newton "knew of the common metaphoric description of the attractive power of a magnet as love" and of the "'sociability' of liquids." Consequently, Manuel speculates, Newton's "longing for the absent ones, his dead father and his remarried mother" inspired his formulation of gravity as "a sort of an impulse or attraction." And as for Newton's vehemence in defending the certainty of his conclusions-this Manuel explains as a wall of defense against the puritanical guilt constantly clamoring to invade his mind. In one instance, Manuel cites a fervent (and thus, in his view, feverish) letter in which Newton defends his theory of light against a religiously motivated attack by a group of Jesuit scholars. The letter was penned in response to a charge by Anthony Lucas, a Jesuit who leveled a cavalier accusation against Newton's measuring accuracy, claiming that Newton had incorrectly reported the length of an image of the light spectrum. Newton had previously refrained from defending himself against the unscientific accusations of the Jesuits, writing that he did not wish to become a "slave to philosophy" (which meant, in his terms, to empty sophistic bickering) by engaging them in argument. But now, he was infuriated: Lucas had challenged the accuracy of Newton's experimental data, without bothering to supply any evidence in support of his challenge. Enraged, Newton wrote to Henry Oldenburg, secretary of the Royal Society of London: "'Tis the truth of my experiments which is the business at hand. 
On this my Theory depends, which is of more consequence, the credit of my being wary, accurate and faithful in the reports I have made or shall make of experiments in any subject, seeing that a trip in any one will bring all the rest into suspicion." Ultimately the Royal Society duplicated Newton's experiments and formally refuted Lucas's objections, such that, in Newton's terms, he stood "convicted" by the "trial of the Royal Society." Lucas had miscalculated in groundlessly questioning the truth of Newton's calculations; his error was in underestimating Newton's reverential devotion to that truth. In Manuel's interpretation, such reverence bespeaks neurosis. Interpreting the incident, Manuel writes, "Ambivalent neurotics have a craving for certainty. The more searing the doubt the more profound the need for a safe haven. Newton had two such refuges, a great blessing for a man in his state of everlasting tension: one was the Bible . . . the other was mathematical proof." He proceeds to explain the neurotic basis for Newton's intolerance of doubt: "Scientific error was assimilated with sin, for it could only be the consequence of sloth on his part and a failure in his divine service. For Newton a sin was not an act of human frailty that could be forgiven, but a sign that the culprit was possessed by evil." Manuel thus diagnoses Newton's preoccupation with scientific certainty and his impatience with doubt as symptoms of a psychotic insecurity, which stems, in turn, from his oppressive puritanical upbringing. But such a diagnosis seems quite peculiar if one considers that Newton is right. As is the case with his theory of light, Newton's theories do stand or fall on the truth of his evidential arguments, because true theories admit of no contradictory evidence. If an objector such as Anthony Lucas were to discover an error in Newton's empirical measurements, he would indeed cast doubt on Newton's entire theory. 
Thus, in vehemently defending the truth of his experiments against those who launch a frivolous attack against it, Newton reveals his "religious" devotion, not to his puritanical schoolteachers or to a wrathful God who censures his every step, but to the real, physical world-and to his understanding thereof. It is not Newton's departure from reality, then, but rather the intensity of his devotion to it that Manuel finds psychotic. Psychosis is generally defined as a state of delusion, a split from reality. A preoccupation with "divine service," as Manuel describes it, implies obeisance to an invisible God at the expense of any rational judgments grounded in this-worldly evidence. But Newton's alleged psychosis consists precisely in obeying the evidence of this world as he searches for scientific truth; the preoccupation Manuel refers to is not with an otherworldly authority, but with physical reality. Manuel does not seem to recognize this distinction, as he repeatedly lumps together Newton's reverence for the truth with his deference to God and his alleged cringing fear of divine punishment. Manuel's psychoanalytic model of the mind treats any pervasive behavior or thought pattern as symptomatic of childhood trauma or some other brand of prior conditioning. What this model does not seem to admit as a possibility is a mind not buoyed along by external circumstances, but rather governing itself, holding the truth as its sole guide and master-a mind conscientiously committed to the pursuit of truth as its primary and fundamental motive. And yet such a scenario should not seem so far-fetched, given that grasping the truth is the mind's proper and primary function. Viewed in this light, Newton's "craving for certainty" is not a symptom of illness but of exemplary mental health. Indeed, the craving for certainty is a venerable virtue-for to crave certainty is to crave the truth, and truth is infinitely valuable. 
Given the advances in science enabled by the certainty of Newton's laws, and the vast benefits ultimately conferred on human life by those advances, the objective value of Newton's "preoccupation" with truth is indisputable. . . .
  • Topic: History
  • Author: David Harriman
  • Publication Date: 12-2008
  • Content Type: Journal Article
  • Journal: The Objective Standard
  • Institution: The Objective Standard
  • Abstract: Author's note: The following is excerpted from Chapter 6 of my book in progress, "The Inductive Method in Physics." In contrast to perception, thinking is a fallible process. This fact gives rise to our need for the method of logic. Logic, when properly applied, enables us to arrive at true conclusions. But it comes with no guarantee that we will apply the method correctly. The laws of deduction were identified by Aristotle more than two millennia ago, and yet people still commit deductive fallacies. If one remains attentive to the evidence, however, further use of logic leads to the correction of these errors. The same is true of false generalizations reached by induction. Although even the best thinkers can commit inductive errors, such errors wither and die in the light shed by continued application of observation and logic. During the past century, however, many philosophers have rejected the validity of induction and argued that every generalization is an error. For example, Karl Popper claimed that all the laws of Kepler, Galileo, and Newton have been "falsified"; in his view, no laws or generalizations have ever been or can ever be proven true. By demanding that a true generalization must apply with unlimited precision to an unlimited domain, Popper upheld a mystical view of "truth" that is forever outside the reach of man and accessible only to an omniscient god. In the end, he was left with two types of generalizations: those that have been proven false and those that will be proven false. He was then accused by later philosophers of being too optimistic; they insisted that nothing can be proven, not even a generalization's falsehood. Such skeptics commit-on a grand scale-the fallacy of dropping context. The meaning of our generalizations is determined by the context that gives rise to them; to claim that a generalization is true is to claim that it applies within a specific context. 
The data subsumed by that context are necessarily limited in both range and precision. Galileo, for example, committed no error when he identified the parabolic nature of trajectories. Obviously, he was not referring to the 6,000-mile path of an intercontinental ballistic missile (to which his law does not apply). He was referring to terrestrial bodies that could be observed and studied in his era-all of which remained close to the surface of the earth, traveled perhaps a few hundred feet, and moved in accordance with his law. Similarly, when Newton spoke of bodies and their motion, he was not referring to the movement of an electron in an atom or of a proton in a modern accelerator. He was referring to observable, macroscopic bodies, ranging from pebbles to stars. The available context of knowledge determines the referents of the concepts that are causally related in a generalization. The context also includes the accuracy of the data integrated by a law. For example, Kepler's laws of planetary motion are true; they correctly identify causal relationships that explain and integrate the data available to Kepler. By Newton's era, however, the measurement errors in astronomical data had been reduced by more than a factor of ten, and today they have been reduced by another factor of ten. In order to explain the more accurate data, one must grasp not only that the sun exerts a force on the planets, but also that the planets exert forces on each other and on the sun. The truths discovered by Kepler were essential to these later discoveries; they made it possible to identify deviations of the new data from the original laws, which in turn made it possible to identify the additional causal factors and develop a more general theory. In cases where the data are insufficient to support a conclusion, it is important to look closely at the exact nature of the scientist's claim. 
He does not commit an error simply by proposing a hypothesis that is later proven wrong, provided that he correctly identified the hypothetical status of the idea. If he can cite some supporting evidence, and he has not overlooked any data that contradict it, and he rejects the idea when counterevidence is discovered, then his thinking is flawlessly logical. An example is provided by the work of Albert Ladenburg, a 19th-century German chemist, who proposed a triangular prism structure of the benzene molecule.1 Ladenburg's hypothesis was consistent with the data available in the 1860s, but it was rejected a few years later when it clashed with Jacobus Henricus van't Hoff's discovery of the symmetrical arrangement of carbon bonds. In such cases, the scientist's thinking is guided by the evidence at every step, and he deserves nothing but praise from the logician. A true generalization states a causal relationship that has been induced from observational data and integrated within the whole of one's knowledge (which, in terms of essentials, spans the range of facts subsumed by the generalization). A scientist makes an error when he asserts a generalization without achieving such an integration. In such cases, the supporting evidence is insufficient, and often the scientist has overlooked counterevidence. I make no attempt here to give an exhaustive list of the essential inductive fallacies. Rather, I have chosen five interesting cases in which scientists have investigated a complex phenomenon and reached false generalizations by deviating from the principles of induction. In each case, I examine the context of knowledge available to the scientist and seek to identify the factors that cast doubt on his conclusion. . . .
  • Author: Eric Daniels
  • Publication Date: 12-2008
  • Content Type: Journal Article
  • Journal: The Objective Standard
  • Institution: The Objective Standard
  • Abstract: Seventy-five years have elapsed since Franklin Delano Roosevelt introduced the flurry of government programs he called the New Deal. In the years since, most historians have lavished praise on FDR, claiming that his bold leadership helped to pull America out of the Great Depression. Even those who acknowledge the failure of particular Roosevelt-era programs claim that FDR instilled hope and confidence in the American people, and that his economic failures were the result of his not going far enough in his policies and not spending enough money. Today, amidst calls for increasing government regulation of the financial industry and increasing government spending through stimulus packages, the New Deal is making a comeback. In light of the recent mortgage crisis and economic downturn, pundits are calling for a revival of 1930s-style policies. Daniel Gross claimed at Slate.com that New Deal reforms were "saving capitalism again." Newly minted Nobel economist Paul Krugman issued calls in the New York Times for President-elect Obama to mimic and expand FDR's response to the Great Depression. And a recent Time cover called for a "New New Deal"-and featured an iconic photo of FDR, with Obama's face and hands substituted. As the Obama administration begins to implement its economic plan, Americans would do well to reexamine the history of the original New Deal and its effects. Though most historians rank FDR as a great president, some, including Burton Folsom Jr., boldly dare to ask if "the New Deal, rather than helping to cure the Great Depression, actually help[ed to] prolong it" (p. 7). According to Folsom, a professor of history at Hillsdale College, the answer is clearly yes. In New Deal or Raw Deal? How FDR's Economic Legacy Has Damaged America, he challenges the myth that FDR's New Deal represents a shining moment in American history. 
As long as the mythology surrounding the New Deal remains intact, he notes, "the principles of public policy derived from the New Deal will continue to dominate American politics" (p. 15), costing Americans billions of dollars and further damaging the economy. . . .
  • Topic: Economics, Government, History
  • Political Geography: America
  • Author: Gus Van Horn
  • Publication Date: 12-2008
  • Content Type: Journal Article
  • Journal: The Objective Standard
  • Institution: The Objective Standard
  • Abstract: In August 1919, three white men brutally beat John R. Shillady in broad daylight outside his hotel. Shillady, also white, had come to Austin, Texas, as executive secretary of the NAACP to persuade state officials not to suppress its local branch. One of his attackers, a county judge, claimed that "it was my duty to stop him" because Shillady was there to "sow discontent among the Negroes" (pp. 105-106). In 1920, Shillady would resign from the NAACP, expressing despair for his cause: "I am less confident than heretofore . . . of the probability of overcoming, within a reasonable period, the forces opposed to Negro equality" (p. 109). And yet, not even a century later, the United States has elected its first black president-in an election in which race was hardly an issue. How did racial equality in America progress so far in so short a time? This is the remarkable story that Adam Fairclough relates in Better Day Coming: Blacks and Equality, 1890-2000. Fairclough succeeds in making his introduction to the struggle for black equality accessible to the general reader in two ways. First, he concentrates on events in the South, wherein particularly harsh forms of racial domination made it the logical focus of black efforts to achieve equality. Second, he follows the lead of fellow historian John W. Cell and classifies the approaches taken by various figures in his narrative as either "militant confrontation" (defiantly opposing racial oppression), "separatism" (working toward the creation of an all-black society here or abroad), or "accommodation" (gradually securing improvements from within the system of white supremacy) (pp. xi-xii). It is from this perspective that the book's chapters examine prominent individuals, organizations, events, and periods of the civil rights movement. 
Fairclough begins his narrative at a time when blacks were "more powerless than at any other time since the death of slavery" and had been "purged from the voting rolls" of the former Confederacy (pp. 15-17). He proceeds to examine the many different ways in which blacks fought against discrimination and oppression: from the intransigent, confrontational approach of Ida B. Wells, who campaigned against lynching in the 1890s; to the accommodation of Booker T. Washington, whose emphasis on black self-improvement over confrontation is characterized by Fairclough as "a tactical retreat in order to prepare the way for a strategic advance" (p. 63); to the separatism of Marcus Garvey, who proposed that blacks fight for an independent, united Africa (p. 126). Fairclough continues this kind of analysis throughout subsequent chapters, where we learn, among other things, about the involvement of the labor movement and the Communist party in the civil rights movement during the 1930s, the evolution of the NAACP's strategy to include legal challenges to discrimination in education after World War II and then mass civil disobedience after 1955, and the rise and fall of the "Black Power" movement. . . .
  • Topic: War
  • Political Geography: United States, America
  • Author: Craig Biddle
  • Publication Date: 09-2008
  • Content Type: Journal Article
  • Journal: The Objective Standard
  • Institution: The Objective Standard
  • Abstract: No abstract is available.
  • Author: Raymond C. Niles
  • Publication Date: 09-2008
  • Content Type: Journal Article
  • Journal: The Objective Standard
  • Institution: The Objective Standard
  • Abstract: Surveys the history and achievements of America's electricity entrepreneurs, shows how government interference in the transmission grid has hampered their enterprises from the outset to the present day, and indicates what America must do to liberate the grid and enable a new wave of entrepreneurs to supply this vital product commensurate with the country's demand.
  • Topic: Economics, Government
  • Political Geography: New York, America
  • Author: Alex Epstein
  • Publication Date: 09-2008
  • Content Type: Journal Article
  • Journal: The Objective Standard
  • Institution: The Objective Standard
  • Abstract: Who were we that we should succeed where so many others failed? Of course, there was something wrong, some dark, evil mystery, or we never should have succeeded!1 -John D. Rockefeller The Standard Story of Standard Oil In 1881, The Atlantic magazine published Henry Demarest Lloyd's essay "The Story of a Great Monopoly"-the first in-depth account of one of the most infamous stories in the history of capitalism: the "monopolization" of the oil refining market by the Standard Oil Company and its leader, John D. Rockefeller. "Very few of the forty millions of people in the United States who burn kerosene," Lloyd wrote,
  • Topic: Government
  • Political Geography: United States, New York
  • Author: David Harriman
  • Publication Date: 09-2008
  • Content Type: Journal Article
  • Journal: The Objective Standard
  • Institution: The Objective Standard
  • Abstract: Author's note: The following is adapted from a chapter of my book in progress, "The Inductive Method in Physics." Whereas my article "The 19th-Century Atomic War" (TOS, Summer 2006) focused on the opposition to the atomic theory that arose from positivist philosophy, this article focuses on the evidence for the atomic theory and the epistemological criteria of proof. It is necessary to repeat some material from my earlier articles in TOS, but the repetition is confined mainly to the first few pages below.
  • Topic: Government
  • Political Geography: New York
  • Author: Larry Salzman
  • Publication Date: 09-2008
  • Content Type: Journal Article
  • Journal: The Objective Standard
  • Institution: The Objective Standard
  • Abstract: Conservatives today have considerable influence on America's legal culture. They are welcome on law school faculties at even the most elite institutions, and they man dozens of think tanks, policy centers, and public interest law foundations pressing varying brands of conservative doctrine on the courts-with a degree of success rivaling competing liberal organizations. Shrill leftists allege a "vast right-wing conspiracy," and senators now begin judicial nomination hearings with hand-wringing warnings about the behind-the-scenes influence of "Federalist Society lawyers."
  • Political Geography: America
  • Author: John David Lewis
  • Publication Date: 09-2008
  • Content Type: Journal Article
  • Journal: The Objective Standard
  • Institution: The Objective Standard
  • Abstract: War is one of man's most destructive activities (only dictatorship has ruined more lives), and it is not surprising that thousands of books have been written about it. Yet, paradoxically, books on war itself-books concerned with war as a phenomenon, rather than focused on strategy, tactics, or some particular war-have been relatively few. This is due in part to the focus by modern scholars on the minutiae of human affairs, and their reluctance to deal with broad generalizations; but the failure to come to grips with the abstract principles of war goes back to the dawn of historical writing. The ancient Greek historian Thucydides, a soaring intellect obsessed by the great war between Athens and Sparta, identified "honor, security and interest" as causally important principles that motivate men "for all time." But even Thucydides did not examine the philosophical foundations of these factors; he took them as given in human nature, which left the study of war mired in the vagaries of human desires and without philosophical grounding.1 As a result, important questions remained unanswered: What are the principles of war; what are their philosophical foundations, and what methods of waging war do they imply? In ancient China, a thriving culture of thinkers tried to answer such questions. They derived principles of warfare from ideas that were fundamental to their own philosophies and applied those principles to the practical needs of military commanders. 
The extant remains of these works have been compiled into the so-called seven Chinese military classics, the best preserved of which is Art of War by Sun-tzu, who lived sometime between 450 and 250 BC, about the time of classical Greece.2 This was approximately the "Warring States" period of Chinese history, when China was divided among military warlords, iron was first used in weapons, armies grew to more than one hundred thousand men, and commanders needed expert guidance to help them organize their huge forces. Ralph Sawyer has produced a lively translation, with a historical essay and explanatory notes, of Sun-tzu's classic work. Sawyer also includes new supplementary material, found in graves and carved on bamboo stalks, that adds to our knowledge of ancient Chinese thought. . . .
  • Topic: War
  • Author: John P. McCaskey
  • Publication Date: 09-2008
  • Content Type: Journal Article
  • Journal: The Objective Standard
  • Institution: The Objective Standard
  • Abstract: The 19th-century philosopher John Stuart Mill is widely regarded as one of history's leading proponents of inductive science and of political liberty. Yet, oddly, philosophers working in his train have been remarkably unsuccessful in saying exactly what is wrong with the scientific skepticism or the political tyrannies of the past one hundred and fifty years. Is it possible that Mr. Mill was not such a good guy after all? This question is not the stated theme of Laura Snyder's Reforming Philosophy, but it is the underlying spirit of this excellent work of scholarly intellectual history.