You searched for: Content Type: Journal Article; Publishing Institution: The Objective Standard; Political Geography: America

Search Results

  • Author: Eric Daniels
  • Publication Date: 09-2008
  • Content Type: Journal Article
  • Journal: The Objective Standard
  • Institution: The Objective Standard
  • Abstract: On June 23, 2005, the United States Supreme Court's acquiescence in a municipal government's use of eminent domain to advance "economic development" goals sent shockwaves across the country. When the Court announced its decision in Kelo v. City of New London, average homeowners realized that their houses could be condemned, seized, and handed over to other private parties. They wanted to know what had gone wrong, why the Constitution and Fifth Amendment had failed to protect their property rights. The crux of the decision, and the source of so much indignation, was the majority opinion of Justice John Paul Stevens, which contended that "economic development" was such a "traditional and long accepted function of government" that it fell under the rubric of "public use." If a municipality or state determined, through a "carefully considered" planning process, that taking land from one owner and giving it to another would lead to increased tax revenue, job growth, and the revitalization of depressed urban areas, the Court would allow it. If the government had to condemn private homes to meet "the diverse and always evolving needs of society," Stevens wrote, so be it. The reaction to the Kelo decision was swift and widespread. Surveys showed that 80 to 90 percent of Americans opposed the decision. Politicians from both parties spoke out against it. Such strange bedfellows as Rush Limbaugh and Ralph Nader were united in their opposition to the Court's ruling. Legislatures in more than forty states proposed eminent domain "reforms," and most then passed them. In the 2006 elections, nearly a dozen states considered anti-Kelo ballot initiatives, and ten such measures passed. On the one-year anniversary of the decision, President Bush issued an executive order that barred federal agencies from using eminent domain to take property for economic development purposes (even though the primary use of eminent domain is by state and local agencies).
The "backlash" against the Court's Kelo decision continues today by way of reform efforts in California and other states. Public outcry notwithstanding, the Kelo decision did not represent a substantial worsening of the state of property rights in America. Rather, the Kelo decision reaffirmed decades of precedent, precedent unfortunately rooted in the origins of the American system. Nor is eminent domain the only threat to property rights in America. Even if the federal and state governments abolished eminent domain tomorrow, property rights would still be insecure, because the cause of the problem is more fundamental than law or politics. In order to identify the fundamental cause of the property rights crisis, we must observe how the American legal and political system has treated property rights over the course of the past two centuries and take note of the ideas offered in support of its rulings and regulations. In so doing, we will see that the assault on property rights in America is the result of a long chain of historical precedent moored in widespread acceptance of a particular moral philosophy.

Property, Principle, and Precedent

In the Revolutionary era, America's Founding Fathers argued that respect for property rights formed the very foundation of good government. For instance, Arthur Lee, a Virginia delegate to the Continental Congress, wrote that "the right of property is the guardian of every other right, and to deprive a people of this, is in fact to deprive them of their liberty." In a 1792 essay on property published in the National Gazette, James Madison expressed the importance of property to the founding generation. "Government is instituted to protect property of every sort," he explained, "this being the end of government, that alone is a just government, which impartially secures to every man, whatever is his own."
Despite this prevalent attitude (along with the strong protections for property contained in the United States Constitution's contracts clause, ex post facto clause, and the prohibition of state interference with currency), the founders accepted the idea that the power of eminent domain, the power to forcibly wrest property from private individuals, was a legitimate power of sovereignty resting in all governments. Although the founders held that the "despotic power" of eminent domain should be limited to taking property for "public use," and that the victims of such takings were due "just compensation," their acceptance of its legitimacy was the tip of a wedge. The principle that property rights are inalienable had been violated. If the government can properly take property for "public use," then property rights are not absolute, and the extent to which they can be violated depends on the meaning ascribed to "public use." From the earliest adjudication of eminent domain cases, it became clear that the term "public use" would cause problems. Although the founders intended eminent domain to be used only for public projects such as roads, 19th-century legislatures began using it to transfer property to private parties, such as mill and dam owners or canal and railroad companies, on the grounds that they were open to public use and provided wide public benefits. Add to this the fact that, during the New Deal, the Supreme Court explicitly endorsed the idea that property issues were to be determined not by reference to the principle of individual rights but by legislative majorities, and you have the foundation for all that followed. . . .
  • Topic: Development, Economics
  • Political Geography: United States, America, London
  • Author: Stella Daily
  • Publication Date: 09-2008
  • Content Type: Journal Article
  • Journal: The Objective Standard
  • Institution: The Objective Standard
  • Abstract: This article is dedicated to Anna Tomalis, a young girl who died of liver cancer on August 15, 2008. Anna's parents desperately sought experimental treatment that might have saved her life, but were delayed for months by FDA bureaucracy. Anna finally received approval to obtain treatment through a clinical trial in July, but died after receiving just one round of treatment. She was thirteen years old. Abigail Burroughs was not the typical cancer patient: She was just nineteen years old when she was diagnosed with squamous cell cancer that had spread to her neck and lungs. Her prognosis was poor, but a then-experimental drug, Erbitux, offered the hope of saving her life. Abigail was denied that hope by the Food and Drug Administration. Because the drug was considered experimental, she could receive it only as part of a clinical trial, and Abigail was ineligible to participate in any trials at the time. Despite the best efforts of her family, friends, and doctor, Abigail was unable to receive the treatment that might have saved her life. At twenty-one years old, Abigail died of her disease. Abigail's father, Frank Burroughs, thought other patients with life-threatening illnesses should not be denied the ability to try any treatment that might give them a chance. In his daughter's name, he formed the Abigail Alliance for Better Access to Developmental Drugs, which sued the FDA in 2003. The group argued that the FDA's restrictions on access to experimental treatments constitute a violation of the right to self-defense as well as of the Fifth Amendment right not to be deprived of life, liberty, or property without due process of law. In August 2007, the U.S. Court of Appeals for the District of Columbia Circuit struck a blow against the Abigail Alliance, and against individual rights, when it ruled that patients, even the terminally ill, do not have the right to receive treatment that has not been approved by the FDA.
Erbitux has since been approved by the FDA to treat cancer of the head and neck (too late, of course, for Abigail Burroughs). How has America come to a point where the government denies dying patients the right to try to save their own lives? To answer that question, let us begin with a brief history of the Food and Drug Administration.

A Brief History of the FDA

Prior to the 20th century, the government did not regulate pharmaceutical products in the United States. Although Congress had considered federal regulations on food and drug safety as early as 1879, it had refrained from passing any legislation in this regard. However, with the muckraking journalism of the early 1900s, and especially with the publication of Upton Sinclair's novel The Jungle, which portrayed unsavory practices in the meatpacking industry, the American public clamored for laws to ensure the safe production of food and drugs. This public outcry pushed Congress to pass federal legislation in 1906. Where it applied to drugs specifically, the resulting Food and Drugs Act required that products be sold only at certain levels of purity, strength, and quality, and that ingredients considered dangerous (such as morphine or alcohol) be listed on the product's label. Violators would be subject to seizure of goods, fines, or imprisonment. Thus, in order to enforce the Act, the Food and Drug Administration was born. In its early years, the agency focused primarily on food rather than on pharmaceuticals, but in 1937 it increased its focus on drugs after a new formulation of sulfanilamide, a drug that had previously been successfully used to treat certain bacterial infections, proved to be deadly. The drug's manufacturer, S. E. Massengill Company, had dissolved an effective drug in a toxic solvent. More than one hundred people, babies and children among them, died as a result of taking Massengill's product, known as Elixir Sulfanilamide.
Under the 1906 Food and Drugs Act, the FDA was not authorized to prosecute Massengill for selling an unsafe drug, and the agency had the power to recall Elixir Sulfanilamide only via a technicality. Because "elixir" was defined as a drug dissolved in alcohol, and because Massengill's formulation used the nonalcoholic solvent ethylene glycol, the product was technically mislabeled, bringing it under FDA jurisdiction and enabling the agency to recall the product. The public and legislators wanted more: They wanted the FDA not only to recall mislabeled products, but to prevent the sale of unsafe drugs in the first place. Thus, popular demand gave rise to the Food, Drug, and Cosmetics Act of 1938, which greatly expanded the FDA's authority. The most important change brought about by this Act was a shift in the burden of proof. Rather than prosecuting a drugmaker after the fact for having fraudulently marketed a product, the FDA would now require proof of safety before a drug could be marketed at all. (Note that this required manufacturers to prove a negative, i.e., that a given drug would not harm consumers.) After World War II, pharmaceutical companies came under still more scrutiny. Then, as now, complaints about the cost of drugs reached Congress, and in 1961 Senator Estes Kefauver led the charge in an investigation not only of drug pricing, but of the relationship between the drug industry and the FDA. Kefauver sought to pass legislation that would increase the agency's authority over drug production, distribution, and advertising. Whereas previously proof of safety alone was required to gain FDA approval, the proposed law would require drug manufacturers also to prove the efficacy of their products.
Kefauver's bill might have languished in congressional debate but for the emergence, at that time, of data showing that thalidomide, which was then sold as a sleep aid and antinausea medication for pregnant women, caused severe birth defects in the children of women who took it. Thalidomide had not yet been approved for use in the United States, owing to an FDA reviewer's concerns over a different side effect noted in the drug's application for approval. The drug was widely used in other countries, however, and the babies of many women who used it were born with grotesquely deformed limbs. As their harrowing images flooded the media, Americans realized they had narrowly escaped inflicting these deformities on their own children. The resulting public outcry led to Kefauver's bill being made law in 1962. This law served as the cornerstone for the wide powers that the FDA acquired thereafter, from requiring companies to include warnings in drug advertisements to dictating the way companies must investigate their own experimental compounds. Thus, although the scope and power of the FDA were modest at the agency's inception, its scope widened and its power increased markedly in the decades that followed. Now, a century later, the agency's purview includes foods and drugs for humans and animals, cosmetics, medical devices (including everything from breast implants to powered wheelchairs), blood and tissues, vaccines, and any products deemed to be radiation emitters (including cell phones and lasers). And the agency's power is nothing short of enormous. . . .
  • Topic: Health
  • Political Geography: America
  • Author: Elan Journo
  • Publication Date: 09-2008
  • Content Type: Journal Article
  • Journal: The Objective Standard
  • Institution: The Objective Standard
  • Abstract: The measure of success in the Iraq war has undergone a curious progression. Early on, the Bush administration held up the vision of a peaceful, prosperous, pro-Western Iraq as its benchmark. But the torture chambers of Saddam Hussein were replaced by the horrors of a sadistic sectarian war and a fierce insurgency that consumed thousands of American lives. And the post-invasion Iraqi regime, it turns out, is led by Islamist parties allied with religious militias and intimately tied to the belligerent Iranian regime. The benchmark, if we can call it that, then shrank to the somewhat lesser vision of an Iraqi government that can stand up on its own, so that America can stand down. But that did not materialize, either. So we heard that if only the fractious Sunni and Shiite factions in the Iraqi government could have breathing space to reconcile their differences, and if only we could do more to blunt the force of the insurgency, that would be progress. To that end, in early 2007, the administration ordered a "surge" of tens of thousands more American forces to rein in the chaos in Iraq. Today, we hear John McCain and legions of conservatives braying that we are, in fact, winning (some go so far as to say we have already won). Why? Because the "surge" has reduced the number of attacks on U.S. troops to the levels seen a few years ago (when the insurgency was raging wildly) and the number of Iraqis slaughtering their fellow countrymen has taken a momentary dip. Victory, apparently, requires only clearing out insurgents (for a while) from their perches in some neighborhoods, even though Teheran's influence in the country grows and Islamists carve out Taliban-like fiefdoms in Iraq. The goals in Iraq "have visibly been getting smaller," observes John Agresto, a once keen but now disillusioned supporter of the campaign (p. 172). Iraq, he argues contra his fellow conservatives, has been a fiasco. 
"If we call it 'success,' it's only because we've lowered the benchmark to near zero" (p. 191). . . .
  • Topic: War
  • Political Geography: Iraq, America
  • Author: Eric Daniels
  • Publication Date: 09-2008
  • Content Type: Journal Article
  • Journal: The Objective Standard
  • Institution: The Objective Standard
  • Abstract: One of the distinguishing features of American life is the large degree of freedom we have in making choices about our lives. When choosing our diets, we have the freedom to choose everything from subsisting exclusively on junk food to consuming meticulously planned portions of fat, protein, and carbohydrate. When choosing how to conduct ourselves financially, we have the freedom to choose everything from a highly leveraged lifestyle of debt to a modest save-for-a-rainy-day approach. In every area of life, from health care to education to personal relationships, we are free to make countless decisions that affect our long-term happiness and prosperity (or lack thereof). According to Richard Thaler and Cass Sunstein, professors at the University of Chicago and authors of Nudge: Improving Decisions About Health, Wealth, and Happiness, this freedom and range of options are problematic. The problem, they say, is that most people, when given the opportunity, make bad choices; although Americans naturally want to do what is best for themselves, human fallibility often prevents them from knowing just what that is. "Most of us are busy, our lives are complicated, and we can't spend all our time thinking and analyzing everything" (p. 22). Average Americans, say Thaler and Sunstein, tend to favor the status quo, fall victim to temptation, use mental shortcuts, lack self-control, and follow the herd; as a result, they eat too much junk food, save too little, make bad investments, and buy faddish but useless products. Many Americans, according to the authors, are more like Homer Simpson (impulsive and easily fooled) than homo economicus (cool, calculating, and rational). "One of our major goals in this book," they note, "is to see how the world might be made easier, or safer, for the Homers among us" (p. 22). The particular areas where these Homers need the most help are those in which choices "have delayed effects . . .
[are] difficult, infrequent, and offer poor feedback, and those for which the relation between choice and experience is ambiguous" (pp. 77-78). The central theme of Nudge is the idea that government and the private sector can improve people's choices by manipulating the "choice architecture" they face. As Thaler and Sunstein explain, people's choices are often shaped by the way in which alternatives are presented. If a doctor explains to his patient that a proposed medical procedure results in success in 90 percent of cases, that patient will often make a different decision from the one he would have made if the doctor had told him that one in ten patients dies from the procedure. Free markets, the authors argue, too often cater to and exploit people's tendencies to make less than rational choices. Faced with choices about extended warranties or health care plans or investing in one's education, only the most exceptional and rational people will make the "correct" choices. Most people, the authors argue, cannot avoid the common foibles of bad thinking; thus we ought to adopt a better way of framing and structuring choices so that people will be more likely to make better decisions and thereby do better for themselves. Hence the title: By presenting information in a specific way, "choice architects" can "nudge" the chooser in the "right" direction, even while maintaining his "freedom of choice."
  • Topic: Health
  • Political Geography: America, Chicago
  • Author: Joe Kroeger
  • Publication Date: 09-2008
  • Content Type: Journal Article
  • Journal: The Objective Standard
  • Institution: The Objective Standard
  • Abstract: In the years since the attacks of 9/11, there have been numerous attempts by terrorists to attack Americans on our own soil, but all of these attempts have been foiled. Who is responsible for this remarkable record, and how have they achieved it? These questions are answered in Ronald Kessler's recent book, The Terrorist Watch: Inside the Desperate Race to Stop the Next Attack, which surveys the work of the individuals involved in America's intelligence community since 9/11. In twenty-seven brief chapters, Kessler documents the post-9/11 work of the CIA, FBI, National Security Agency (NSA), National Geospatial-Intelligence Agency (NGA), National Counterterrorism Center (NCTC), and other agencies, showing the organizational, tactical, and technological changes that have occurred, along with their positive results. The book begins by recounting the events of September 11, 2001, from President Bush being informed of the first plane crashing into the World Trade Center, to his "We're at war" declaration, to the initial coordination of efforts among the vice president, the military, and law enforcement and intelligence agencies. Proceeding from there, Kessler shows how the CIA immediately linked some of the hijackers to Al Qaeda and how, a few days later, the president began redirecting the priorities of the FBI and the Justice Department from prosecuting terrorists to preventing attacks. . . .
  • Political Geography: America
  • Author: John David Lewis
  • Publication Date: 09-2008
  • Content Type: Journal Article
  • Journal: The Objective Standard
  • Institution: The Objective Standard
  • Abstract: During World War II, the prime source of information for Americans about the war overseas was the dispatches of foreign correspondents: men who put their lives on the line in war zones to report the truth. George Weller was a giant among such men. Captured by the Nazis and traded for a German journalist, Weller watched the Belgian Congolese Army attack Italians in Ethiopia, saw the invasion of Crete, interviewed Charles de Gaulle in South Africa following an escape through Lisbon, and overcame malaria to report on the war in the Pacific. He was the first foreign correspondent trained as a paratrooper, and he won a Pulitzer Prize for his report of an appendectomy on a submarine. He wrote the book Singapore Is Silent in 1942 after seeing the city fall to the Japanese, and he advocated a global system of United States bases in his 1943 book Bases Overseas. After witnessing Japan's surrender on September 2, 1945, he broke General Douglas MacArthur's order against travel to Nagasaki by impersonating an American colonel and taking a train to the bombed-out city. In a period of six weeks, he sent typewritten dispatches totaling some fifty thousand words back to American newspapers through official channels of the military occupation. Under MacArthur's directives, they were censored and never made it into print. Weller died in 2002 thinking his dispatches had been lost. Months later his son, Anthony Weller, found a crate of moldy papers with the only surviving carbon copies. Anthony Weller edited the dispatches and included his own essay about his father, resulting in this priceless addition to our information about World War II in the Pacific, and the birth of the atomic age. The importance of the dispatches, however, extends far beyond the value of the information from Nagasaki.
George Weller is a voice from a past generation, and the publication of his censored dispatches raises a series of deeply important issues and, in the process, reveals an immense cultural divide between his world and ours today. On September 8, 1945, two days after he arrived in Nagasaki, Weller wrote his third dispatch concerning Nagasaki itself. He described wounded Japanese in two of Nagasaki's undestroyed hospitals, and recorded the question posed by his official guide:

"Showing them to you, as the first American outsider to reach Nagasaki since the surrender, your propaganda-conscious official guide looks meaningfully in your face and wants to know: 'What do you think?' What this question means is: Do you intend writing that America did something inhuman in loosing this weapon against Japan? That is what we want you to write." (p. 37)

What would many reporters today write if asked this question by bombed enemy civilians? . . .
  • Topic: War
  • Political Geography: Japan, America, Germany, Nagasaki
  • Author: Craig Biddle
  • Publication Date: 09-2008
  • Content Type: Journal Article
  • Journal: The Objective Standard
  • Institution: The Objective Standard
  • Abstract: Surveys the promises of John McCain and Barack Obama, shows that these intentions are at odds with the American ideal of individual rights, demonstrates that the cause of such political aims is a particular moral philosophy (shared by McCain and Obama), and calls for Americans to repudiate that morality and to embrace instead a morality that supports the American ideal.
  • Political Geography: America, Europe
  • Author: Craig Biddle
  • Publication Date: 12-2008
  • Content Type: Journal Article
  • Journal: The Objective Standard
  • Institution: The Objective Standard
  • Abstract: No abstract is available.
  • Topic: Economics
  • Political Geography: America
  • Author: Craig Biddle
  • Publication Date: 12-2008
  • Content Type: Journal Article
  • Journal: The Objective Standard
  • Institution: The Objective Standard
  • Abstract: Concretizes the selfishness-enabling nature of capitalism and shows why this feature makes it the only moral social system on earth.
  • Topic: Economics
  • Political Geography: America
  • Author: Brian P. Simpson
  • Publication Date: 12-2008
  • Content Type: Journal Article
  • Journal: The Objective Standard
  • Institution: The Objective Standard
  • Abstract: "We've got to go after the oil companies," says President-elect Barack Obama in response to high oil and gasoline prices. "We've got to go after [their] windfall profits." Explaining the purpose of recently proposed energy legislation, Senate Majority Leader Harry Reid says: "We are forcing oil companies to change their ways. We will hold them accountable for unconscionable price-gouging and force them to invest in renewable energy or pay a price for refusing to do so." Calling for government seizure of private power plants, California Senate Leader John Burton insists: "We have to do something. These people have got us by the throat. They're making more money than God, and we've got to fight back, not with words, but with actions." This attitude toward energy producers, which is practically unanimous among American politicians today, is wreaking havoc not only on the lives and rights of these producers, but on the lives and rights of Americans in general. It leads to laws and regulations that prohibit producers and consumers from acting on their rational judgment with respect to energy. It causes energy shortages, brownouts, and blackouts that thwart everyone's ability to be productive and enjoy life. And it results in higher prices not only for energy, but for every good and service that depends on energy, which means every good and service in the marketplace, from food to transportation to medical care to sporting events to education to housing. Energy producers, like all rational businessmen, are in business to make money. Profits are what motivate them to exert the requisite brain power, to engage in the necessary research, and to invest the massive amounts of money required to produce and deliver the energy we need to light, heat, and cool our homes, and to power the factories, workplaces, and tools required to produce the goods on which our lives depend. Their profit motive is to our benefit.
Moreover, energy producers, like all human beings, have a moral right to act according to their own judgment so long as they do not violate the rights of others. They have a moral right to use and dispose of the product of their effort as they see fit. They have a moral right to contract with customers by mutual consent to mutual benefit. In other words, they have a moral right to life, liberty, property, and the pursuit of happiness. And it is only by respecting these rights that we can expect energy producers to produce energy. So let us examine the assault on these producers, count the ways in which this assault is both impractical and immoral, and specify what must be done to rectify this injustice. . . .
  • Topic: Government
  • Political Geography: America, California