No matter who is appointed to replace retiring members of the Supreme Court, the larger issues will remain unchanged, as they have been for nearly seven decades — the New Deal Supreme Court has become a permanent fixture in our country.
Changes brought about by Franklin Roosevelt’s Court solidified the trends that had occurred since the Progressive Era, trends that could have come about only through viewing the U.S. Constitution in a way fundamentally different from what the Framers intended. As Roosevelt “brain truster” Rexford G. Tugwell put it, the Constitution was a document that was written not to promote and expand a welfare state but instead to protect Americans from their own government.
That the New Deal justices were able to absolutely subvert the Constitution — and with it, the rule of law — and do it without meaningful opposition from Congress and the Fourth Estate constitutes one of the darkest chapters in American history. A nation that was conceived in liberty and limited government has become a country where almost no meaningful limits are placed on those who are in authority, all with the approval of the courts, which were supposed to be one of the bulwarks against such action.
In this last part of my series, I first reappraise the judicial and legal philosophy that has guided the Court since the mid-1930s and show how radically it departs from how the Founders viewed the law. This ethos holds that government — including the judicial branch — is the entity that must “change society.” Thus, there exists a “compelling government interest” that began with tampering with economic institutions and ultimately spread to the rest of society.
Second, I examine some of the decisions and their aftermath to demonstrate how the Court’s legal philosophy led to the continued expansion of federal powers and the misuse of the Constitution’s Commerce Clause.
The great Austrian economist Ludwig von Mises once wrote that, while he set out to be a “reformer,” instead he became the “historian of decline.” Likewise, this series has been a history of decline of what was once a bastion of Western civilization: law and liberty. However, I do not write pessimistically, for as long as there are people who deeply treasure individual rights, private property, and liberty, there will be the possibility of uprooting the collectivist mindset that permeates the political, economic, and legal landscape in the United States.
Miller and gun control
Although the Second Amendment begins with a preamble on militias, it is clear that the Framers considered gun ownership to be a basic right. Unfortunately, as was noted in part two of this series, the Supreme Court had already weakened the Second Amendment by claiming it protected a collective right, not an individual one. Such reasoning would guide the Court in future gun-control decisions.
In 1939, the Court ruled in the last gun-control case to be heard by the Supreme Court, United States v. Miller, which involved a man charged under the National Firearms Act of 1934 for possessing a sawed-off shotgun whose barrel was shorter than the required 18 inches. In ruling for the government, the Court held that the Second Amendment protected a collective right, not an individual right, of Americans to own firearms. Furthermore, the Court tied the right to the Second Amendment’s preamble about the necessity of a well-regulated militia, in essence saying that individual rights extended only to weapons that would be considered useful for a militia, and that the state could heavily regulate those “rights.”
Although the Court’s decision in Miller was limited, it was collectivist in nature and did not question the power of the federal government to restrict individual ownership of firearms. Since then, gun-control laws have proliferated on the state and national levels, and most Americans who are gun owners can easily run afoul of those statutes. At last count, there were about 20,000 laws on the books at all levels of government to regulate use and ownership of firearms, all to restrict a right that the Framers clearly believed was enjoyed by the people.
Furthermore, one must tie the proliferating anti-gun-ownership laws to the growing power of the police. Self-defense, once seen as necessary and even honorable, has been relegated to “vigilantism,” which is given a negative connotation. The current ethos is that home and personal defense should be left to the authorities, who ostensibly are better at such things.
(Unfortunately, the Supreme Court has also ruled that the police are under no legal obligation to protect anyone. Thus, people who protect their lives and property with personal weapons are subject to prosecution, but waiting for the police to show up can be fatal.)
The attack on private property
The Court’s continuing assault on the institution of private property came in a multi-pronged manner. First, the principle of “no discrimination” was used to constitutionally trump private-property rights. Second, the Court strengthened the power of government to seize private property under the notion that anything the government contends will increase tax revenue constitutes a “public purpose.”
The year 1954 is significant not only for Brown v. Board of Education, which helped launch the modern “civil rights movement,” but also for Berman v. Parker, which held that the seizing and tearing down of “blighted areas” for “urban renewal” met the Fifth Amendment guidelines for eminent domain. The Progressives of the late 19th and early 20th centuries had “slum clearance” as one of their main goals for urban planning, and the Court agreed, albeit nearly 50 years after the Progressive Era.
Fast-forward 51 years to the Kelo v. New London decision, in which the Court ruled that areas did not even have to be “blighted” in order to be condemned. All that was needed was a “formula” demonstrating that the municipality or political entity in question could raise more tax revenue by taking private property and selling it at less-than-market prices to developers who would then build shopping centers, condominiums, or something similar. To put it another way, the U.S. Supreme Court said, in effect, that there really is no such thing as private property, at least the kind of private property that existed at the time of the framing of the Constitution.
Another example of the Court’s intrusion into the institution of private property has come with the civil-rights laws. While the Brown v. Board of Education decision of 1954 is outside the scope of this article, the decision emboldened both the Court and Congress not only to expand the definitions of racial discrimination but also, in effect, to “nationalize” businesses by declaring them to be “public” entities, falling under the jurisdiction of Congress through the Commerce Clause.
Ten years after Brown, Congress passed the landmark 1964 Civil Rights Act. Resorting (once again) to its new interpretation of the Commerce Clause, Congress expanded the reach of “no discrimination” to private property, but that was not all that the act involved. During the New Deal, as noted in part three of this series, Congress ceded much of its power to the executive branch, including the final interpretation of the laws Congress passed.
This situation had special meaning with regard to the Civil Rights Act, which had very specific language forbidding the use of racial quotas and the imposition of special racial tests in situations governed by the Act (such as employment, college admissions, and the like). However, as Paul Craig Roberts and Lawrence M. Stratton point out in their book The Tyranny of Good Intentions, when the Court reviewed a challenge to the use of racial quotas, it deferred to the administration, specifically the Equal Employment Opportunity Commission.
The EEOC decided to “redefine discrimination as unintentional statistical group disparities” which could be remedied only through racial quotas, as Roberts and Stratton point out. For example, a business could be charged with racial discrimination if the percentage of minority employees on its payroll did not match the racial makeup of the surrounding area.
The Supreme Court, in unanimously approving this definition of racial discrimination, said, “The administrative interpretation of the act by the enforcing agency is entitled to great deference.” Thus, the government was able to use a law that unequivocally outlawed racial quotas to create a system based on racial quotas, all with the enthusiastic approval of the Supreme Court.
As pointed out in part three, the Court further intruded on private-property rights with its Wickard decision, which upheld federal limits on the amount of wheat a farmer could grow on his own property, despite the fact that the wheat was intended for personal consumption. While Wickard generally is discussed in the context of agriculture and interstate commerce (indeed, the Court used the Commerce Clause of the Constitution to justify its decision), it can also be termed an assault on private property.
Keep in mind that at the time of the decision, the United States was at war with Germany and Japan and much of the agricultural harvest was being diverted to war uses. To deal with the huge shortages of fresh vegetables, Americans started what were termed “Victory Gardens,” turning spaces that might have been devoted to flowers, grass, or even weeds into garden plots. The logic of Wickard easily could have been applied to the “Victory Gardens,” although it is clear that such a move by the authorities would have been tremendously unpopular and would have seriously damaged the home-front morale.
Changes in the purpose of law
Thus, we have a clear example of laws on the books that would be enforced selectively, and the kinds of laws that lent themselves to this kind of abuse were an unfortunate legacy of the New Deal and the Roosevelt Supreme Court. The Framers would have been horrified to see that the Commerce Clause — put into place in order to guarantee free trade among the states — was being used, in effect, to criminalize a farmer’s growing wheat on his own property for his own consumption. Paul Rosenzweig writes,
At its inception, criminal law was directed at conduct that society recognized as inherently wrongful and, in some sense, immoral. These acts were wrongs in and of themselves (malum in se), such as murder, rape, and robbery. In recent times the reach of the criminal law has been expanded so that it now addresses conduct that is wrongful not because of its intrinsic nature but because it is a prohibited wrong (malum prohibitum) — that is, a wrong created by a legislative body to serve some perceived public good. These essentially regulatory crimes have come to be known as “public welfare” offenses.
While no one went to jail in the Wickard case, it clearly fell within the “public welfare” category that Rosenzweig describes. In this case, the government wanted to keep wheat prices high in order to provide benefits to a certain political constituency, and anything that might interfere with that goal — and one doubts that it could be considered a “lofty” goal at that — was made illegal. In fact, since the New Deal Supreme Court effectively revolutionized law in this country (or at least provided the push to topple the legal order created by the nation’s Founders), the growth of laws — both civil and criminal — at all levels of government has occurred within the “public welfare” classification.
The emphasis on the “public purpose” of the law is yet another example of how the courts have become collectivist in their focus. As Rosenzweig pointed out, in the past criminal law ultimately dealt with harm imposed wrongfully on individuals. Today, however, the victim is “society”; even if no individual is harmed — as was the situation in Wickard — the modern way of interpreting the law holds that a wrong has still been committed.
War and the Constitution
The last time Congress declared war was immediately after the attack on Pearl Harbor in December 1941. It is obvious, however, that World War II was not the last war involving U.S. troops. Since the end of hostilities in the late summer of 1945, soldiers from this country have been involved in wars or military actions in Korea, Vietnam, Cambodia, Lebanon, Panama, Grenada, Kuwait, Somalia, Bosnia, Afghanistan, and Iraq.
Korea, Vietnam, Cambodia, Grenada, Panama, Kuwait, Afghanistan, and Iraq involved full-scale invasions, and many conflicts have included the use of bombers and air fighters, not to mention the navy.
However, while thousands of American troops have died in these wars, the thing they have in common is that Congress has not declared war in any of them. All of the post–World War II wars that have involved American troops have been executive wars. The president has decided to commit troops, and the Pentagon has dutifully carried out his orders. Furthermore, the presidents involved have come from both political parties, so this development is not a partisan issue.
While the Constitution gave Congress the power to declare war, the executive branch has in effect usurped this power as well, with little protest from the legislative branch and with the clear approval of the Supreme Court. Before the Great Depression, it would have been almost unthinkable for the president to openly involve U.S. troops in continuous wartime operations without a declaration of war from Congress; the New Deal thinking changed all of that.
While Roosevelt sought and received a declaration of war from Congress following Pearl Harbor, he already had committed American goods, equipment, and personnel to the British side for about two years, not to mention having U.S.-funded troops (the Flying Tigers) attacking Japanese forces in China. Even Woodrow Wilson would not have been able to do such a thing.
Thus, each time a U.S. soldier dies in Iraq or Afghanistan, his death has indirectly come about because Congress agreed to transfer much of its legislative power to the executive branch, clearly in violation of the Constitution’s nondelegation doctrine. Unfortunately, the Supreme Court declines to enforce constitutional provisions on war, and, therefore, this succession of presidential wars since 1950 has been the unhappy result.
This article has covered only a small portion of the post–New Deal Supreme Court’s crimes against the Constitution. For lack of space, I have not dealt with the Court’s rulings on asset forfeiture, which has accompanied the government’s “war on drugs,” nor have I dealt with the various Court assaults on free speech, religious beliefs, and civil liberties.
Fully gauging the effect that the New Deal has had on our lives today through the Supreme Court would require a volume so large that writing it would deforest North America. However, two consistent themes have emerged in the past seven decades. The first is that private property is considered to be an anachronism, useful only insofar as it serves as a mechanism to raise tax revenues for government. The second is that the U.S. Supreme Court and all U.S. courts, federal and state, are expected to be movers and arbiters of social change. To put it bluntly, the courts see themselves as having a mission to implement the policies of the Progressive Era. Unfortunately, what the political classes see as being “progressive” actually is little more than a regression into tyranny in which the state has absolute power.
This article originally appeared in Freedom Daily.