New Individualist Review, Volume 2, Number 1, Spring 1962
The Online Library of Liberty
A project of Liberty Fund, Inc.
New Individualist Review, editor-in-chief Ralph Raico, introduction by Milton Friedman (Indianapolis: Liberty Fund, 1981).
About Liberty Fund:
Liberty Fund, Inc. is a private, educational foundation established to encourage the study of the ideal of a society of free and responsible individuals.
The copyright to this publication is held by Liberty Fund, Inc. New Individualist Review may not be reproduced in any publication, journal, or periodical without the written consent of J. M. Cobb, J. M. S. Powell, or David Levy.
Fair use statement:
This material is put online to further the educational goals of Liberty Fund, Inc. Unless otherwise stated in the Copyright Information section above, this material may be used freely for educational and academic purposes. It may not be used in any way for profit.
VOLUME 2, NUMBER 1, SPRING 1962
A. J. P. TAYLOR AND THE CAUSES OF WORLD WAR II
HARRY ELMER BARNES
THE NEW CONSERVATISM
JAMES M. O’CONNELL
SIN AND THE CRIMINAL LAW
ROBERT M. HURT
THE SHORTCOMINGS OF RIGHT-WING FOREIGN POLICY
JOHN P. McCARTHY
NEW INDIVIDUALIST REVIEW is published quarterly (Spring, Summer, Autumn, Winter) by New Individualist Review, Inc., at Ida Noyes Hall, University of Chicago, Chicago 37, Illinois.
Opinions expressed in signed articles do not necessarily represent the views of the editors. Editorial, advertising, and subscription correspondence and manuscripts should be sent to NEW INDIVIDUALIST REVIEW, Ida Noyes Hall, University of Chicago, Chicago 37, Illinois. All manuscripts become the property of NEW INDIVIDUALIST REVIEW.
Subscription rates: $2.00 per year (students $1.00). Add $1.00 for foreign subscriptions.
Copyright 1962 by New Individualist Review, Inc., Chicago, Illinois. All rights reserved. Republication of less than 200 words may be made without specific permission of the publisher, provided New Individualist Review is duly credited and two copies of the publication in which such material appears are forwarded to New Individualist Review.
Editors-in-Chief • Ronald Hamowy • Ralph Raico
Associate Editors • Robert M. Hurt • John P. McCarthy
Robert Schuettinger • John Weicher
Business Manager • Sam Peltzman
Editorial Assistants • J. Edwin Malone • Jerome Heater
Editorial Advisors • Milton Friedman • F. A. Hayek • Richard M. Weaver
University of Chicago
COLLEGE AND UNIVERSITY REPRESENTATIVES
UNIVERSITY OF ARIZONA
UNIVERSITY OF CALIFORNIA (Riverside)
CITY COLLEGE OF NEW YORK
CENTRE COLLEGE OF KENTUCKY
UNIVERSITY OF DETROIT
GROVE CITY COLLEGE
UNIVERSITY OF IDAHO
UNIVERSITY OF ILLINOIS
UNIVERSITY OF INDIANA
STATE UNIVERSITY OF IOWA
UNIVERSITY OF KANSAS
UNIVERSITY OF KENTUCKY
LOYOLA UNIVERSITY (Chicago)
UNIVERSITY OF MINNESOTA
NORTHWESTERN UNIVERSITY (Chicago)
NORTHWESTERN UNIVERSITY (Evanston)
OREGON STATE COLLEGE
SETON HALL UNIVERSITY
TEXAS CHRISTIAN UNIVERSITY
UNIVERSITY OF VIRGINIA
UNIVERSITY OF WISCONSIN
YALE LAW SCHOOL
UNIVERSITY OF EDINBURGH
UNIVERSITY OF FRANKFURT
UNIVERSITY OF PARIS
A. J. P. Taylor and the Causes of World War II
IT IS A privilege and pleasure to be invited to appraise the epoch-making book of Professor A. J. P. Taylor on The Origins of the Second World War for the readers of a magazine made up mainly of earnest members of the younger generation who are seeking to understand the novel and complicated world into which maturity has cast their lot. No field of study could be more useful in promoting such aspirations for rational orientation than that of history. Unless we know how we got here, we are bound to be confused as to how to deal with the present or to plan for the future.
Those who are now coming to maturity are greatly handicapped in regard to historical information and realism as compared to my own generation. The 1920’s and early 1930’s were an era of iconoclasm and debunking, well symbolized by Mencken and Nathan and the American Mercury, the writings of Theodore Dreiser, Sinclair Lewis, Scott Fitzgerald, and the like. It was difficult in those days to maintain an intellectual blackout anywhere, even in the realm of historical writing. My first ardent attack on any form of historical blackout appeared in the first number of the Mercury at Mencken’s suggestion, even insistence.
The iconoclastic trend in history took the form of what has come to be known as “Revisionism,” which was devoted to wiping out the vestiges of the wartime propaganda of the previous decade. It got its name because it was hoped that the facts this movement revealed relative to the causes of the first World War would lead to the revision of the notorious Treaty of Versailles. Had this been done, there would have been no second World War, although there might have been a militant lineup of Western Europe against Soviet Russia.
The generation which was born or has been educated since 1936 or thereabouts is, historically speaking, a lost generation—a group of youthful Rip van Winkles. By 1937, the majority of American liberal intellectuals were adopting the internationalist ideology of the Popular Front and “collective security,” which Litvinov had so successfully propagated at Geneva. Nearly all liberals, and a surprising number of conservatives, jumped on the interventionist and anti-German bandwagon then being chartered and steered by President Roosevelt and Harry Hopkins. The great majority of American historians belonged to the liberal camp and became ardent interventionists.
From this time onward, most history teaching and writing in this country, in dealing with recent world events, increasingly took on the form of a fanciful, and in part unconsciously malicious fairy-tale. It presented the pattern of the late 1930’s and the 1940’s as a planetary crusading arena in which a triumvirate of St. Georges—Franklin D. Roosevelt, Winston Churchill and Joseph Stalin—were bravely united in a holy war to slay the Nazi dragon. Even before the latter had shot himself in a Berlin bunker, Roosevelt and Churchill had begun to suspect that their erstwhile Soviet fellow crusader for freedom, justice and peace was more of a menace to utopia than the Nazi “madman.” In due time, even his successor was revealed to be a threat to the Free World, although he had snatched Stalin from the Kremlin display window and buried him like any ordinary mortal.
In the 1920’s, the evidence of the mistakes which the United States had made in its first crusade in Europe under the leadership of Woodrow Wilson were frankly brought forth and displayed before the American educational world and reading public. Not so with the far greater blunders of our second global crusade. The disagreeable facts were consigned to the Orwellian “memory hole,” and the few books which sought to present the salutary truth were either ignored or viciously derided. The generation which grew up during this ill-fated crusading era has been thoroughly brainwashed in regard to the historical basis of world affairs and the role of the United States therein. It has passed little if any beyond the intellectual and informational confines of President Roosevelt’s colorful but misleading “Day of Infamy” rhetoric.
It has long since been observed that historical truth is the first casualty of a war. American historiography was sadly ailing before September, 1939, and was mortally ill by Pearl Harbor, in December, 1941. The great majority of historians ardently supported intervention in the European maelstrom. A surprisingly large group accepted posts involved in the war effort and propaganda, a number of them of much prominence and responsibility. Hence, they had a powerful vested interest in preserving and defending the dragon-killing legend.
Most historians were ardently inflamed by the emotions engendered by the wartime propaganda. Many of them, no doubt, were honestly convinced of the soundness of this interventionist and crusading propagandism. Those few who had kept their heads and really knew the score were wise enough to keep their counsel to themselves in order to hold their posts and have some assurance of promotion. Whatever the reasons for the debacle, it is certain that historical standards and products at all affected by recent world events declined to a lower level, so far as integrity and objectivity are concerned, than at any period since the close of the Counter-Reformation. For anything comparable in this country one would have to look back to the political tracts of the period of the Civil War and Reconstruction.
In the 1920’s there was a strong reaction against the military obsession for intervention in foreign quarrels. For more than a decade a trend towards peace, isolation and anti-militarism ensued. Historical writing and teaching rather generally adjusted to this climate of intellectual opinion. Revisionism sprang up and, by and large, had won the battle against the bitter-enders of the previous decade before the end of the 1920’s. Leading revisionist historians, such as Sidney Bradshaw Fay and Charles Callan Tansill, were lavishly praised by members of their craft. The journalistic culmination of revisionist spirit and lore, Walter Millis’ Road to War, became one of the outstanding best sellers of the 1930’s.
THERE WAS NO such cooling-off period or escape from militant emotions after V-J Day in 1945. Along with the perpetuation of propaganda in the guise of history came a powerful effort to prevent those who had some real regard for historical truth from getting their facts and thoughts before the American public. This project has come to be known as the “Historical Blackout.” It involved a comprehensive effort since the outbreak of the second World War to suppress the truth relative to the causes and merits of the great conflict that began in 1939 and the manner in which the United States entered it. This has consisted in ignoring or suppressing facts that ran counter to the wartime propaganda when writing books on these subjects, and in suppressing, ignoring or seeking to discredit those books which have taken account of such facts.
It has often been asserted that this historical blackout is today a sinister and deliberate plot to obstruct the truth and degrade history. This is undoubtedly the truth with respect to the program and activities of some minority groups and ideological organizations which have a special vested interest in perpetuating the wartime mythology. But, for the most part, it is more the unconscious product of nearly three decades of indoctrination that grew out of interventionist and wartime propaganda. Even most professional historians who began their teaching career after 1937 have automatically come to accept as truth the distortions of pre-war and wartime interventionism. The current blackout is more an automatic reaction to brainwashing than a perverse conspiracy. But this does not make it any less difficult to resist or overcome.
This situation following the second World War is, thus, a complete reversal of what happened after the first World War when Revisionism carried the day in the historical forum in less than a decade after the Armistice of November 11, 1918. Even some of the outstanding leaders of Revisionism after the first World War, such as Sidney B. Fay and William L. Langer, recanted their Revisionism, succumbed to the historical blackout, and gave warm support to the dragon-slaying fantasy. In only about a year and a half after the Armistice of 1918 Fay had blasted for all time the myth of the unique guilt of a Hohenzollern gorilla, as the Kaiser had been portrayed during the conflict. Within a decade after the close of the War a veritable library of revisionist books had been produced on responsibility for the calamity of 1914.
Despite the fact that the documentary material to support Revisionism after the second World War is more profuse, cogent and convincing than after 1918, as of 1962 not a single volume by an American scholar devoted exclusively to the causes of the second World War has been published in the United States—some twenty-three years after the outbreak of the War and seventeen years after its close.
To be sure one book related to the field was published, Back Door to War, by Charles Callan Tansill, now dean of diplomatic historians. It has about as much material on responsibility for 1939 as Professor Taylor’s book, is more thoroughly documented, and arrives at much the same conclusions as Taylor. But the Tansill book was designed primarily to indicate by impressive documentation how, as Clare Boothe Luce had expressed it, President Roosevelt had lied the United States into war from 1937 to 1941. Hence, there was much more interest in the antecedents of Pearl Harbor than in the responsibility for the European War in 1939, and Tansill’s extensive and valuable material on the latter was generally ignored. There have been a number of important and distinguished books by American writers which have supplemented Tansill’s account of American entry into the second World War but for the most part they have been ignored or smeared, and the dragon-slaying fiction still remains almost immaculate and impregnable.
Professor Tansill’s book, America Goes to War, which was published in 1938, and is far and away the best account of American entry into war in 1917, was declared by Dr. Henry Steele Commager to be “the most valuable contribution to the history of the pre-war years in our literature and one of the most notable achievements of historical scholarship of this generation.” His Back Door to War is an equally learned, scholarly and erudite account of our entry into the second World War, but orthodox historians have been inclined to dismiss it as merely superficial counter-propaganda. Even Charles Austin Beard, dean of American historians and political scientists, was ruthlessly smeared for presuming to protect Clio’s chastity.
Although several impressive books by informed experts, including Tansill, have detailed the facts about the Pearl Harbor disaster and scandal, Professor Foster Rhea Dulles, writing in the most formidable historical series recently launched in the United States and co-edited by Professor Commager, declared that “there is no evidence to support such charges.”
Publishers who might wish to make available the truth about the second World War are intimidated by the more powerful Book Clubs, which are without exception dominated by those supporting the historical blackout. The most influential advisory service, which has great weight in recommending book purchases by public libraries and book stores, makes a specialty of deriding and discouraging the purchase of revisionist books. The fable of the dragon-killers remains almost inviolate, so far as the general public is concerned.
William Henry Chamberlin’s America’s Second Crusade, the only substantial but popular account of our entry into the second World War, was highly comparable to Millis’ Road to War on 1917. But, whereas Millis’ book sold a quarter of a million copies, a year after the Chamberlin book was published there was still not one copy listed in the New York Public Library or in any of its many branches. It need not be alleged that all those who operate book clubs and book services deliberately aim to pervert or frustrate historical truth relative to world affairs. They are presumably supporters of truth in theory. They just do not know what it is. They are emotionally congenial to the wartime legends and most historians they know seem to agree with them. Both have been brainwashed for a generation.
The essence of what has preceded is that the generation which has gained its historical knowledge and perspectives since the late 1930’s has been deprived, cheated and handicapped by the distortion and suppression of historical facts relative to world affairs. This is especially unfortunate because of the transcendent role of world relations and policies in the everyday life, interests, decisions and destiny of the American citizen of today. This handicap is true even if a person has been a history major in college. Indeed, it is likely that he will have been more victimized by historical errors as a result of more copious and intensive indoctrination with historical fiction than one who has specialized in literature, art or music.
The importance of Professor Taylor’s highly controversial volume lies in the prospect that it will prove unusually potent in blasting through the historical blackout. Through a fortunate combination of circumstances, the book has shaken up Britain more than any other historical work in the field of world affairs since the writings of E. D. Morel just about forty years ago. It may be hoped that the American edition can do as well in producing a flash of light which will penetrate the historical blackout of nearly a generation’s duration.
For the generation represented by most of the readers of this article, the great value of the Taylor book is that it can be the logical starting-point for them in recovering the all-important lost pages of history, out of which they have been cheated by brainwashing and the historical blackout. Those who are stimulated to continue the process will find most useful J. J. Martin’s Liberal Opinion and World Politics, 1931-1941 (Devin-Adair); C. C. Tansill’s Back Door to War (Regnery); G. N. Crocker’s Roosevelt’s Road to Russia (Regnery); W. H. Chamberlin’s Beyond Containment (Regnery); and John Lukacs’ A History of the Cold War (Doubleday). These carry the story consecutively from the Hoover Administration to that of Kennedy.
HAVING THUS presented at some length the background and setting of Professor Taylor’s book, we may now consider the nature and significance of the book itself. First and foremost, it is the first book to be published in any language which is exclusively devoted to the task of debunking the dragon-slaying travesty which has colored and distorted historical perspective for nearly a quarter of a century.
It is probable that no living historian could be more appropriate as an effective and convincing author of such a book. In the first place, he is an English scholar. Due to Rhodes scholarships and other allied items which promote Anglophilism in the United States, there is a special aura attaching to English historians, their scholarship and their implied words of wisdom. This gives Taylor and his book special prestige in this country. Then, he is easily the best known and most popular of contemporary British historians. Further, he is the author of a number of substantial historical works dealing with contemporary history and diplomatic relations, most of them devoted in part at least to recent German history. In other words, he is a specialist in the field covered by his book under review here, which is not the case with such bitter critics as A. L. Rowse and Hugh R. Trevor-Roper, the former a specialist in Tudor history and poetry and the latter in Stuart ecclesiastical history and, also, poetry.
In all of his previous books, Taylor has invariably shown a rather strong antipathy to German politics and leaders. Hence, he could not logically be suspected of any pro-German sympathies or any desire to clear Hitler or any other German politician of political errors or public crimes which could be supported by reliable documentation. Finally, he has been closely associated with British left-wing activities, the Labor Party, disarmament, and other attitudes and policies which make it quite impossible for him to be imagined as having any sympathy with totalitarianism of any sort, least of all with that of National Socialist Germany in the 1930’s. Clement Attlee and the Laborites were, if anything, more vehement in their hatred of Hitler and so-called appeasement than the Tories who were in power in Britain in 1938-1939.
Hence, it would be difficult to conceive of any historian who could give greater assurance that his criticisms of the dragon-slaying hypothesis are no more than those which historical accuracy and reliable documentation makes necessary. They are a product of historical integrity and professional courage, probably more of the latter than has been displayed by any other historian of our generation. It is interesting to note that since his book on the causes of the second World War has appeared, a number of critical reviewers have accused Taylor of being a publicity-seeking vendor of sensationalism who must not be taken seriously as a historian. But these same critics were actually the very ones who had previously lauded his profound scholarship when his books reflected a strong hostility to Germany and its policies.
While indicating Professor Taylor’s attitude towards Germany, and especially the Germany of the 1930’s and Hitler, it may be well to make clear my own approach to such matters. As a lifelong exponent of freedom of thought and political action, and a veteran critic of any racial theory of history, it will be a little difficult to hang any pro-Hitler label on me. Further, I probably lost more in the way of prestige, influence and contacts in Germany than any other American intellectual as a result of the rise of Hitler and National Socialism, surely far more than any other historian. William L. Shirer and Dorothy Thompson were catapulted into fame and fortune by the ascendency of the Nazis and should have been exceedingly grateful for the emergence of Hitler.
My contention is that there are enough valid reasons for repudiating the social system represented by National Socialism without resorting to the most extensive, lurid and indefensible body of lies and distortions which have ever degraded so-called historical science and have caused Clio to bed down with the Gadarenes for a quarter of a century. My extensive revisionist labors in the 1920’s and early 1930’s were designed to encourage the revision of the Treaty of Versailles and prevent the rise and ascendency of Hitler or anybody like him.
AFTER THESE preliminary observations, which are indispensable for judging the importance and validity of Professor Taylor’s work, we can now get down to the outstanding facts and conclusions which are expressed in the book.
The vital core of the volume is the contention that Hitler did not wish a war, either local, European, or world, from March, 1933, right down into September, 1939. His only fundamental aim in foreign policy was to revise the unfair and unjust Treaty of Versailles, and to do this by peaceful methods.
This is a most remarkable and unusual contention, however well defended in the book. Hitherto, even those who have sympathized heartily with the justice and need of revising the Versailles Treaty have, nevertheless, usually maintained that, even if Hitler’s revisionist program was justified in its general objectives, he carried it out in a reprehensibly brusque, provocative and challenging manner, gladly or casually risking war in each and every move he made to achieve the revision of the Versailles system. In other words, even if his goal were justifiable, his methods of seeking to obtain it were unpardonably violent, deceitful and inciting.
Professor Taylor repudiates and refutes this interpretation as thoroughly as he does the charge that Hitler wished to provoke war at any time. He holds that Hitler was unusually cautious and unprovocative in every outstanding step he took to undermine Versailles. He let others create situations favorable to achieving his ends and then exploited them in a non-bellicose manner.
One thing is certain, even if one takes a most hostile attitude towards Hitler and Professor Taylor’s thesis. This is that the Allies had some thirteen years in which to revise the Treaty of Versailles in a voluntary and peaceful manner. But they did nothing about it, although one of the main ostensible functions of the League of Nations was stated to be carrying forward a peaceable revision of Versailles. Professor Sidney B. Fay had proved by 1920 that the war-guilt clause of the Treaty of Versailles, proclaiming that Germany and her allies were solely responsible for the first World War, had no valid historical foundation whatever.
Professor Fay and the rest of us revisionists of the 1920’s hoped that the facts we brought forth had completely undermined the war-guilt clause, and would lead to the revision of the Treaty in political fact. But they did not, and the failure to do so accounts for the rise of Hitler and all the many results for good or evil which ensued.
After he came into power, Hitler waited patiently for some years for the Allies to make some practical move to revise the Versailles system before he occupied the Rhineland on March 7, 1936. Even on the heels of this action he publicly proposed on March 31, 1936, what Francis Neilson has called “the most comprehensive non-aggression pact ever to be drawn up.” But the Allies made no cooperative response whatever; they totally ignored it.
In the meantime, Hitler had barely attained power when, on May 17, 1933, he proposed the most sweeping disarmament plan set forth by any country between the two World Wars, but neither Britain nor France took any formal notice whatever of it. Even after he had introduced conscription in March, 1935, in response to the expansion of military conscription in France, Hitler declared that “the German Government is ready to take an active part in all efforts which may lead to a practical limitation of armaments.” This proposal received no more response from Britain, France or the United States than that of May, 1933. Hence, if Hitler was to revise Versailles at all, it had become completely evident by March, 1936, that it must be by unilateral action.
We may now consider what Professor Taylor concludes about the moves whereby Hitler accomplished all of his revisionist program except for the settlement with Poland, the failure of which, due to British support of Polish intransigence, brought on the European war in September, 1939. In doing so, we should always keep in mind Taylor’s fundamental assumption about Hitler, to the effect that he was not a fanatical and bellicose psychopath—a veritable madman intent upon war—but a shrewd and rational statesman, notably in his handling of foreign affairs.
It will hardly be necessary for any sane person to emphasize the fact that Professor Taylor does not seek to present Hitler as any combination of Little Lord Fauntleroy, George Washington at the cherry tree, Clara Barton and Jane Addams. He could be devious, shrewd, inconsistent, self-contradictory, cruel and brutal, although he did balk at saturation bombing of civilians until he was compelled to do so in retaliation. The main point here is that, unlike Churchill, Roosevelt and Stalin, he did not wish to have a war break out in 1939.
Professor Taylor takes up in order the main items and acts which have been exploited for decades by Hitler’s critics and orthodox historians to demonstrate Hitler’s combined depravity and bellicosity.
The occupation of the Rhineland in March, 1936, was long overdue. It should have been returned to Germany years before Hitler took over power. His forceful occupation was pure bluff. Even a strong protest from France and Britain would probably have restrained him, and an order of mobilization by France would have produced an ignominious retreat. Moreover, the act had no serious results and at least a few advantages for Britain and France.
Historians bent on maintaining Hitler’s responsibility for the second World War and his grandiose plans for world conquest have based their indictment mainly on the Hossbach Memorandum, a record made of a meeting at the German Chancellery on November 5, 1937, by a German general staff liaison officer by the name of Hossbach. It was attended by Hitler, Goering, the chief army and navy officers, and the Foreign Minister.
What took place was a general consideration of the European situation, past, present, and future, and of possible German policies in relation to existing and potential developments—the type of discussion that was common, even routine, in the higher counsels of any great state. Those who were present gave little serious attention to what was said after the conference broke up, and a majority of them were out of office or command before the summer of 1939. The memorandum had been lost sight of until the Allies dug it up about a decade later and sprang it maliciously as a surprise on Goering at the Nuremberg Trial.
Taylor dismisses the Hossbach Memorandum with deserved contempt: “Hitler, it is claimed, decided on war and planned it in detail on 5 November, 1937. Yet the Hossbach Memorandum contains no plans of the kind . . . Hitler did not make plans for world conquest or for anything else . . . [His speculations] bear hardly any relation to the actual outbreak of war in 1939.”
Although the public at large knew little about the Hossbach Memorandum, world opinion was well aware of the occupation of Austria on March 12, 1938, the so-called Anschluss or union of Germany and Austria. The circumstances were quite different from what Hitler had planned and wished, and were forced on him by the stupidity and duplicity of Schuschnigg. Hitler had planned to take over control gradually by infiltration and political operations from within Austria. He was annoyed by being compelled to make a show of force and was humiliated by the spectacle that his ill-prepared army made in marching into Vienna.
The Anschluss itself had been recommended by most fair-minded and realistic observers of the post-war situation, and it was greeted with enthusiasm by the majority of the people of Austria. But for the short-sighted opposition of Britain and France it would have been accomplished during the era of the Weimar Republic and might have helped to bolster the fortunes of both the Weimar regime and Austria, even to the point of saving both from National Socialism.
FEW EPISODES or events in the history of civilized mankind have been more vehemently attacked and viciously pilloried than the Munich Conference of September 29-30, 1938. It has been depicted and denounced as a veritable incarnation of the cowardly betrayal of all principle and public ethics in international dealings. It gave rise to the most widely used political smear term of the present generation—“appeasement”—which is actually the procedure whereby most normal diplomacy had been carried on for centuries, namely, by rational and peaceful negotiations. Munich has also been especially portrayed as the most ignominious and irresponsible defeat Britain ever met in her entire diplomatic experience and the main cause of the second World War. Professor Taylor, on the contrary, finds that Munich “was a triumph for all that was best and most enlightened in British life.”
That Munich did not work out as had been hoped at the time was due more to British action and policy on the heels of Munich than to any deeds of Hitler. Chamberlain did not, and perhaps could not, stand up effectively against the myopic and bitter criticisms of Munich by both the British Conservatives and Laborites. Halifax was already in the process of betraying the peace efforts at Munich and taking over the leadership of the war party in the cabinet. Churchill proclaimed that Germany was getting too strong to be tolerated and must be smashed, if necessary by force of arms. Duff Cooper contended that the balance of power on the Continent of Europe must be preserved at all costs. Taylor fails to mention the fact that Clement Attlee also attacked Munich with as great vehemence and bitterness as any Conservative.
Instead of defending his Munich policy on the high level of statecraft and public morality to which Taylor has ascribed his motives, Chamberlain, in the face of criticism by the British war party, fell back on the lame and dishonest excuse that Britain surrendered at Munich because it had been too weak to fight rather than negotiate; hence, it now had to rearm speedily and thoroughly. “In this way, Chamberlain did more than anybody else to destroy the case for his own policy.”
The usual explanation that Munich failed to preserve peace because Hitler violated his pledge not to make further territorial demands in Europe after the Sudetenland transfer cannot be maintained on a factual basis. He actually made this pledge at a Sportpalast speech in Berlin on September 26, 1938, three days before Munich. Hitler made no demand for Czechoslovakian territory after the Munich Conference and the cession of the Sudetenland, and his demands for the return of the German city of Danzig, on which Poland had no valid claims, and for the railroad and motor road across the Corridor, could hardly be regarded as any literal, or even moral, violation of this pledge. Czechoslovakia inevitably fell apart in the natural course of the political disintegration which had been set in motion by the return of the Sudeten territory to Germany. Taylor emphasizes this fact at length.
Of all the silly and preposterous allegations made against Hitler, surely the outstanding was that his occupation of Prague proved his determination on world conquest. Although Chamberlain, Halifax, and the British war party made this charge to beguile the British public, they knew better than to base their case in diplomatic channels on this travesty. Rather, in collusion with the Rumanian minister in London, they concocted a transparent fraud, immediately repudiated by the Rumanian Foreign Minister, charging that Hitler had just made demands on Rumania which threatened her sovereignty and forecast an attempt at wholesale penetration of the Balkans.
Aside from inadequate emphasis on the extent and manner in which Lord Halifax and Sir Howard Kennard, the British ambassador at Warsaw, encouraged Poland not to negotiate a peaceful settlement with Hitler in August, 1939, Professor Taylor’s account of the German-Polish crisis of October, 1938, to September, 1939, accords with his general thesis that Hitler did not want war. He makes it clear that Hitler wished a permanent and peaceful settlement with Poland rather than war.
THE TERMS Hitler suggested to Poland, beginning on October 24, 1938, were extremely reasonable—indeed, the most moderate of any in his whole revisionist procedure from 1933 to 1939 and were far less drastic than many British leaders had suggested between the two World Wars. Even Churchill, at about the very time Hitler came to power, had declared in the House of Commons on April 13, 1933, that the question of the Polish Corridor was a leading issue that had to be adjusted if European peace were to be preserved.
Hitler only asked for the return of Danzig and a railroad and motor road across the Corridor. Indeed, he proposed much more in return than he requested; he offered to guarantee the Polish boundaries as settled at Versailles after the first World War, something the Weimar Republic would never even remotely consider. Britain has been invariably presented in the traditional story of 1939 as the moral custodian of Europe, even willing to risk war to protect the integrity of Poland, which Hitler was seeking to gobble up. The facts are precisely the reverse.
There is conclusive evidence that the Polish leaders believed that Hitler’s terms of 1938-1939 were sincere, and were not merely the first step in a sinister program to absorb Poland later on by military force or political intrigue. But Josef Beck, the Polish Foreign Minister, refused to accept Hitler’s generous terms, and on March 26, 1939, broke off negotiations with Germany. They were never again resumed down to the time war broke out on September 1, 1939.
The stubborn refusal of Poland even to negotiate with Germany during the crisis of August, 1939, is fully revealed by Taylor, although he does not bring out the extent to which Beck was encouraged in this intransigence by Halifax and Kennard, especially the latter. Taylor does, however, make it crystal clear that the Poles were far more willing to envisage war than was Hitler. Right down to the final crisis Hitler had hoped for peaceful revision. Even during the last hours of peace he only increased his demands to include a plebiscite in the northern tip of the Corridor. It would have taken a year of peaceful negotiations to complete the arrangements under this plan, and the important Polish port of Gdynia was explicitly excluded from the proposed plebiscite area.
Those who refuse to be convinced by Taylor’s demonstration that Hitler’s operations in revising the Treaty of Versailles prove that he did not desire to provoke war, fall back on the allegation that his whole economic policy had been to gear German industry to warlike plans, that he had spent enormous sums of money to create a great military machine, sufficient for and ready to start a war of world conquest, and that he had converted Germany into a great military camp.
Taylor refutes all this very effectively. Hitler had not spent more money for armament, relatively, than either France or Britain, and he was in no way prepared for even a European war, to say nothing of a war of world conquest. He was only ready for a short Blitzkrieg of a couple of months, such as he waged in Poland. Of the hundred divisions he put into the war in Poland, only three were mechanized and not one completely motorized. The combined military forces of Britain and France were far more than equal to those of Germany in 1939.
The final line of defense of those who reject the facts of both diplomatic history and economic history from 1933 to 1939 is that the real proof of Hitler’s plan to conquer the world is to be found in his Mein Kampf, written in 1924, and his alleged “Second Book,” putatively composed in 1928, not in what he actually did from 1933 to 1939. This implies that Hitler was the only prominent public figure in 1939 who had never changed his mind over the years despite revolutionary alterations in surrounding circumstances. Yet, these same critics of both Hitler and sound history have been the very ones who have contended for three decades that if there was one invariable characteristic of Hitler it was his explosive nature, his undependability, his instability, vacillation and fickleness, and his general irresponsibility. They cannot very well have it both ways.
Mein Kampf furnishes little or no clue as to what was going on in Hitler’s mind in 1939, any more than Churchill’s violent attacks on Russia in 1918-1920 provide a true reflection of his attitude towards Russia at the Teheran or Yalta Conferences, or his assurance to the House of Commons after his return from Yalta that he knew of no country which honored its public promises with greater fidelity than Soviet Russia. The Mein Kampf subterfuge is like seeking the motives and policies of President Roosevelt on the eve of Pearl Harbor in his isolationist and pacifist speeches during the campaign of 1936—only five years earlier.
In his final conclusion as to the coming of war in September, 1939, Professor Taylor rejects the verdict which has been accepted for more than two decades, namely, that it was the inevitable product of a long premeditated and wicked plot on the part of a maniacal Nazi dictator.
He contends, to the contrary, that it was a calamitous mistake, not premeditated by either side, and was primarily the product of diplomatic and political blunders on both sides: “This is a story without heroes; and perhaps even without any villains . . . The war of 1939, far from being welcome, was less wanted by nearly everybody than almost any war in history . . . The war of 1939, far from being premeditated, was a mistake, the result on both sides of diplomatic blunders . . . Such were the origins of the second World War, or rather the war between the three Western Powers over the settlement of Versailles; a war which had been implicit since the moment the first war ended.”
PROFESSOR TAYLOR is quite correct in stating that, in so far as the general publics were concerned, the second World War was one of the most unwanted wars in history, but it was not unwanted by Halifax, Kennard, and the British war party in the summer of 1939. Chamberlain was rather wavering and schizoid on the matter, but in the end he joined with Halifax and Kennard and stood out against Sir Nevile Henderson, the British ambassador at Berlin, who resolutely opposed the war to the last moment.
As Foreign Secretary, Halifax was the responsible leader of the war group. He had taken over control of British foreign policy within a week after the Munich Conference. He carried through the war program in a ruthless and undeviating manner and with consummate skill, craftiness, duplicity, and determination, from mid-October, 1938, to the sending of the final ultimatum to Germany on September 3, 1939. If there was any “villain” in 1939 it was Lord Halifax, far more so than Churchill. The latter had little to do with British diplomacy at the time, and actually did not know much about what was going on at the end of August when Halifax was craftily, skillfully, and relentlessly piloting England and Europe into war.
While affecting a personal piety almost akin to that of Thomas à Kempis, Halifax planned, engineered and gratuitously let loose on the world the most cruel and devastating war in history, the ultimate result of which may be the extermination of the human race, with no more justification than the perpetuation of an obsolete British political tradition—the balance of power on the European continent—which had been fashioned in the sixteenth century by Cardinal Wolsey.
As to the motives of the group which backed up Halifax, they were both varied and numerous. Some were chronic German haters. Others were alarmed by Germany’s economic recovery and the methods whereby this had been accomplished. Some may have honestly feared that Hitler did have a program of extensive military conquest, although surely none of them believed that this would be directed against Britain. Some, like Churchill, believed that they could improve their political status in the event of war. Laborites and other Leftwing groups hated conservative totalitarianism.
Certainly, the British blank check to Poland, either when made in March or when confirmed on August 25th, was a hypocritical fraud which did not offer any honest guarantee or comprehensive protection to Poland, and was not intended to do so. It was purely a provocative war stratagem. It merely encouraged Poland to stand firm against reasonable German demands and thus make inevitable a war against Germany. It was Hitler who offered the genuine guarantee to Poland.
When, in the autumn of 1939, Russia brazenly occupied eastern Poland, the question was raised in the House of Commons as to whether the British guarantee of Poland covered aggression against her by Russia. Richard A. (Rab) Butler, who answered for the government, had to admit that it did not. It was only a guarantee against Germany, which at the outset did not contemplate annexing any Polish territory. Rather, Germany offered to guarantee the Versailles boundaries of Poland.
It is well established that no responsible leaders in Germany, France, or Italy wished war in 1939. President Roosevelt apparently desired to have the European war break out as soon as possible, pressed Chamberlain to go ahead, and encouraged Polish arrogance and stubbornness. But Roosevelt was in no position to exert any directly decisive influence on European decisions in 1939, and Halifax did not need any encouragement from Roosevelt.
It is unlikely, however, that Britain would have dared to adopt the policy she did in 1939 in regard to Poland and Germany if Roosevelt had not already promised British leaders, notably through Anthony Eden and George VI, all possible American aid in the event of war and had agreed to make every conceivable effort to bring the United States into war on the side of Britain if one broke out. This is well brought out in the so-called “Kent Documents,” the nearly two thousand secret messages that were exchanged between Roosevelt and Churchill in American code and embodied, as Churchill had admitted, most of the vital Anglo-American diplomatic commitments and arrangements, beginning even before Churchill became Prime Minister.
To summarize realistically the matter of war responsibility in 1939, one may quite safely say that Professor Taylor is entirely correct in holding that the broad general responsibility, running over two decades, was divided among all the parties and was the outcome of blunders by all of them.
In regard to the direct and immediate responsibility for the outbreak of hostilities in September, 1939, the blame for the German-Polish War was divided between Poland, Britain and Germany, with the so-called guilt ranking in this order.
The primary and direct responsibility for the European War, which grew into the second World War, was almost solely that of Great Britain and the British war group, made up of both Conservatives and Laborites. If Britain had not gratuitously given Poland a blank check, which was not needed in the slightest to assure British security, Poland might have risked a war with Germany. Nevertheless, even in this case there would still have been no justification for British intervention in such a war or for the provocation of a European war.
This sole immediate British responsibility for the outbreak of the European War in September, 1939, stands out in contrast to the direct responsibility for starting a European war in August, 1914, which was divided between Russia, France and Serbia, in the order given. If Alexander Izvolski, the Russian ambassador to France in 1914, was more responsible than any other individual for war in 1914, so was Lord Halifax more to be blamed than any other person for the coming of war in 1939.
ALREADY THERE has arisen a line of criticism designed to discredit the significance of Professor Taylor’s book, even granting its accuracy as to the general responsibility for war in 1939. It is held that, although Hitler and the Nazis may not have started the war in 1939, or even wished to start it, the brutal outrages of which they were guilty after the war got started proved them such degenerate gangsters that Halifax and his associates were justified in resorting to any degree of plotting and duplicity required to produce a war to smash and annihilate them, and that President Roosevelt performed a great moral service in “lying the United States into the war” to make it certain that this salutary and needed act of extermination would be accomplished.
Any such argument is even more fallacious and deplorable than the ex post facto jurisprudence on which the Nuremberg Trials were founded. Further, there is no reason whatever to believe that the brutal wartime actions which have been alleged against Germany would have taken place if peace had been preserved. Finally, as Milton Mayer, Victor Gollancz, and others, have already suggested, it seems likely that the whole question of the wartime crimes of Germany will ultimately be submitted to as drastic a type of revisionism as the conventional views about the responsibility for the second World War have been subjected to by Taylor. Many thousands were executed after war-crime trials in Germany and Iron Curtain countries—trials which are still going on today—and far over 100,000 were executed or massacred in France and Italy during the “Liberation.”
Two great wrongs do not make a right but even a casual survey of Allied atrocities, which does not even include those in the Asiatic area, aside from the atom bombings, makes it amply clear that there is no validity to the argument that the second World War simply had to be waged to rid the world of a totally unique gang of German scoundrels—unique both as to moral depravity and deeds of brutal violence.
Hitler’s evil deeds have been told and retold, beginning long before 1939. After the Cold War started, the Western World began to learn something about the monstrous and nefarious doings of Stalin—that “man of massive outstanding personality, and deep and cool wisdom,” as Churchill described him—which far exceeded those of Hitler. But we have heard little of the horrors which were due to the acts and policies of Churchill and Roosevelt, as, for example, the saturation bombing of civilians, the incendiary bombings of German cities such as Hamburg and of Tokyo, the bombing and destruction of the beautiful city of Dresden which had no military significance whatever and in which more lives were lost than in the bombings of Hiroshima and Nagasaki, the atom bombings of the Japanese cities (planned by Roosevelt), the expulsion of about fifteen million Germans from their former homes and the death of four to six millions in the process as a result of massacre, starvation and exposure, the brutalities practised on German SS prisoners of war, the cruel and barbarous treatment of Germany from 1945 to 1948, and the return of around five million Russian refugees in Germany to Stalin to be butchered or enslaved. The greatest horror that could be fairly traced to their doings is still held in reserve for us—the nuclear extermination of mankind.
In short, there is no unique or special case against Nazi barbarism and horrors unless one assumes that it is far more wicked to exterminate Jews than to massacre Gentiles. While this latter value judgment appears to have become rather generally accepted in the Western world since 1945, I am personally still quaint enough to hold it to be reprehensible to massacre either Jews or Gentiles.
Professor Taylor, logically and wisely, deals only slightly and incidentally with the domestic policy of Nazi Germany, although he does hint correctly several times that this probably did more to produce the war than Hitler’s foreign policy. Of all of Hitler’s domestic policies, the one which brought upon him the greatest opprobrium and hatred and the one which played the most important public role in encouraging war on Germany, was his treatment of the German Jews, a piece of folly which I have condemned for nearly thirty years in numerous articles, books and lectures. Indeed, the famous American Rabbi, Stephen S. Wise, reprinted a series of articles I wrote for the Scripps-Howard newspapers criticizing Hitler’s anti-Semitism and distributed tens of thousands of copies.
There could, however, be no greater paradox in history than a war in behalf of Poland on the basis of the Jewish issue. There were in Poland, in 1933, six times as many Jews as in Germany, and they were surely treated as badly as were the German Jews under Hitler. Moreover, by 1939, Hitler’s anti-Jewish program had moderated and more than half the German Jews had left Germany, usually with many of their possessions, whereas the Polish Jewish population had declined relatively slightly and their treatment had not improved to any notable extent.
In the 1930’s, when I was actively engaged in journalism, I received much praise from Jewish readers for my columns and editorials criticizing Hitler’s treatment of the Jews, but this was interspersed with frequent and insistent suggestions that I should not overlook the far more extensive plight of the Jews in Poland. Several of my more responsible correspondents charged that the Polish government was laying plans to exterminate the Polish Jews as communist revolutionaries. This was several years before it is alleged that Hitler even planned any extermination project. Nor should Russia be overlooked. Writing in October, 1938, Walter Duranty observed that “Stalin has shot more Jews in two years of purges than were ever killed in Germany.”
IT IS WORTHWHILE here to indicate briefly the significance of the book by Professor Taylor for citizens of the United States. So far as revisionist scholarship is concerned, this is greatly strengthened and its basic contentions are confirmed. It will now be easier to treat the causes of the second World War realistically and honestly without being accused of mental defect or moral depravity.
The awe and reverence with which English historians are customarily regarded by the American historical guild will make it the more difficult and embarrassing for the latter to laugh off Professor Taylor’s confirmation of the basic tenets of American revisionist historical scholarship. The frenetic reviews of the American edition have already revealed their schizoid reaction—a sort of intellectual “twist” dance.
The Taylor book underlines the accuracy of American anti-interventionism, which had been supported by revisionist historical writings in this country. The interventionists based their policy on the fantastic assumption, actually voiced by such able historians as Samuel Flagg Bemis, top commentators like Walter Lippmann, and superb journalists of the type of Walter Millis, that the United States was in mortal danger of infiltration and attack by Nazi Germany. Professor Taylor’s book further emphasizes the grotesque fallacy of this contention. Hitler did not even wish to attack England or France, to say nothing of proceeding westward across the Atlantic. Nor was it necessary for the United States to enter the war to protect Britain or France. Hitler sought peace after the Polish War and again after Dunkirk and the fall of France.
In the light of the facts brought forward by Professor Taylor, which are not at all new to American revisionist historians and had previously been well stated by Tansill, Beard, and others, President Roosevelt’s allegation that Hitler planned to invade the United States by way of Dakar, Rio de Janeiro and Panama—his notorious timetable for the Nazi occupation of Iowa—is shown to be as fantastic and untenable as his statement that he was “surprised” by the Japanese attack in December, 1941.
Professor Taylor’s book should serve as a warning that a third world war will not be prevented by an avalanche of stale Germanophobia, or by merely mouthing arrogant platitudes and benign homilies about the virtues and superiorities of democracy and the “Free World.” These semantic gestures must be supplemented and implemented by all the wisdom, precaution, foresight and statecraft that can be drawn from the disastrous experience with two world wars and their ominous aftermaths. Failing this, we shall not have another opportunity.
We are not likely to succeed so long as we resolutely reject searching self-examination but continue to seek a scapegoat on whom we may lay the blame for all international tragedies. The effort to make a scapegoat out of the Kaiser and Germany after the first World War produced the Versailles Treaty and, in time, the second World War. The same process was continued on a more fantastic scale after the second World War, and it has already led us to the brink of nuclear war several times. Professor Taylor has made clear the folly in seeking to make Hitler’s foreign policy the cause of all the miseries and anguish of the world since 1939—or even 1933.
We can get no valid comfort from the illusion that nuclear warfare will be withheld in the third World War, as poison gas was in the second. As F. J. P. Veale pointed out so well in his Advance to Barbarism, the Nuremberg Trials took care of that. These showed that the rule in the future will be that defeated leaders, military and civilian, will be executed. Hence, no leader in wartime will spare any available and effective horrors which may avert defeat. Field Marshal Bernard Law Montgomery got this point when he stated in Paris in June, 1948: “The Nuremberg Trials have made the waging of unsuccessful war a crime: the generals on the defeated side are tried and then hanged.” He should have added chiefs of state, prime ministers, foreign ministers, and even secretaries of welfare.
While it is easy to demonstrate that the second World War and American entry into it constituted the outstanding public calamity in human history, and perhaps the last—surely, the next to the last—of such magnitude, the question is always asked as to what should have been done.
There is no space here to write a treatise on world history or to combine prophecy with hindsight. But a reasonable answer can be suggested.
Britain should not have started the second World War. The British leaders knew that Hitler was no threat to them. Next to assuring German strength, he was mainly interested in bolstering the British Empire. Even after Dunkirk he offered to put the German Wehrmacht and Luftwaffe at the service of Britain if she would make peace.
Germany and Russia had made a pact in August, 1939, and both were interested in turning east and south. If they remained friendly they could have developed and civilized these great untamed areas. If they quarrelled and fought, they would thereby have reduced the two great totalitarian systems to impotence through military attrition. Once the war started and Germany had invaded Russia, the United States should have remained aloof and allowed these totalitarian rivals to bleed themselves white and thereby end their menace to the Western World.
The wisdom of such procedure was recognized by public leaders in both major political parties, such as ex-President Herbert Hoover, Senator Robert A. Taft, and Senator Harry S. Truman. Communism would not now dominate a vast portion of the planet or have over a billion adherents. Nor would we be faced with a war of nuclear extermination.
But the combined power of Roosevelt’s lust for the glamor of a war presidency, the communist line about “collective security,” so successfully propounded by Litvinov at Geneva and adopted by American liberals as the ideological basis of their interventionism, and Churchill’s gargantuan vanity and vast enjoyment of his prestige as wartime leader, was far too great to be overcome by either factual information or political logic. The dolorous results of the folly of American intervention and Roosevelt’s concessions to Stalinite Communism dominate the material in every daily newspaper and every political journal of our time.
The New Conservatism
AS SOME recent articles in the New Individualist Review and other magazines have indicated, the intellectual Right in the United States is divided into at least two large factions. Each of the factions has its own firmly-held ideology, its own history, its own roster of heroes and demons. And some members, at least, of each faction are not at all sure that large numbers of their fellow-Rightists are not more profoundly in error and more dangerous to the Republic than are even the infernal legions of the Left.
In the interests of harmony and good-fellowship, many conservatives have lately suggested that such discussions be played down and that the Right return to its principal business: exposing the foibles and inanities of the American Left. Were the differences minor ones, then the airing of them in public would do little or no good for the advancement of the principles held by those commonly referred to as “conservatives.” However, when such differences are radical, when the only area of agreement is anti-communism, then to call for harmony in the interest of a united “anti-communist” and “anti-socialist” front is as reprehensible now as the actions of those who called for a Popular Front in the closing years of the Thirties “to oppose Fascism.”
These articles, especially those by Mr. Hamowy and Mr. Facey in this magazine,1 indicate that the differences are radical, and that the older philosophies of libertarianism, laissez-faire economics, and constitutionalism have little in common with what has been called the “new conservatism.” It is the aim of this article to point out these differences and to show why they are incompatible with the older philosophy, which might be called, as it had been before the name was pirated by the statists and interventionists of the Left, “liberalism.” The task is complicated, for, as Professor F. A. Hayek points out: “Since it [conservatism] distrusts both abstract theories and general principles, it neither understands those spontaneous forces on which a policy of freedom relies nor possesses a basis for formulating principles of policy.”2 An analysis of this new conservatism must begin, then, with an investigation of the ideas put forth by its proponents in their writings.
Those who read the works of the new conservatives are struck first of all by the contempt in which reason is held. Russell Kirk, so it seems, cannot write a book without sneering at “defecated rationality” or the “puny private stocks of reason” possessed by individuals. Mr. Kirk prefers to remain an “intellectual dwarf perched on the shoulders of a giant—Christian, Western tradition.” But the errors of reasoning made by those Professor Hayek calls the “rationalist liberals” in no way invalidate the tool used: reason. As a method for combatting the errors of the planners and interventionists, reason is far superior to appeals to tradition. Indeed, Professor Ludwig von Mises asks, in Human Action, if the traditional doctrines so constructed are in agreement with the actual beliefs held by the ancestors so venerated. Tradition and custom possess no validity per se; their rightness or wrongness depends solely on their agreement with those principles, discoverable by reason, which regulate human action.
To the new conservative, such ideas are “the murkiest Bentham.” These traditionalists make much of the fact that many libertarian authors use a utilitarian cause-and-effect approach in their writings on economics. Utilitarianism is “materialistic,” they claim, ignoring the fact that it studies, especially the modern agathistic utilitarianism, not only material pleasures, but all human desires, for one purpose: to discover the correct method of fulfilling such desires. Others would condemn it for ignoring the irrational, the unusual in life. Such a censure is foolish, for it ignores the fact that economics limits itself to the study of the analyzable; it does not attempt to comment on the goals and desires of an acting individual. As Prof. Mises puts it:
The teachings of economics and praxeology are valid for every human action without regard to its underlying motives, causes and goals. The ultimate judgments of value and the ultimate ends of human action are given (that is, undefined in the logical sense) for any kind of scientific inquiry; they are not open to any further analysis.3
More often than not, the new conservative will content himself with simply sneering at utilitarian ideas. They are “immoral,” they are “relativistic” or, to use a phrase common to conservative polemicists, “the ideas of Bentham, ‘the great subversive,’ find their reductio ad absurdum in Marxism.” Such criticism is meaningless, for modern utilitarianism is neutral with respect to final choices; charges of “immorality” or “relativism” when applied to it are absurd. As for its alleged connection with Marxism, one could, with more justice, establish an ideological relationship between conservatism and fascism. It is but a few steps from Burke’s veneration of the “oaks of the English aristocracy” to Maistre’s veneration of “throne and altar,” Metternich’s censorship, the racism of a Carlyle or a Gobineau, the nationalism of a Barrès, until we reach our reductio—the fascism of a Maurras. Even those supposedly in the Burkean tradition were eager, at times, for a man on horseback—Irving Babbitt, founder of the New Humanism and intellectual mentor of new conservative Russell Kirk, declared, in his passion for order, that there would be a time when “we may esteem ourselves fortunate if we get the American equivalent of a Mussolini; he may be needed to save us from the equivalent of a Lenin.”4 That Millian socialism and its bastard brother Marxism are, in fact, perversions of Benthamite utilitarianism seems to escape the new conservatives; the fault lies not in utilitarianism itself, but in the minds of those misinterpreting it.
It is this rejection of reason for tradition and custom that has brought out what Mr. Hamowy called “the whips, thumbscrews and firing squads” in his article;5 it is obvious that, if the only defense of the new conservatism is tradition, then that tradition must be maintained, no matter what the cost to liberty. It is the love of custom that brings forth the shibboleths which the new conservatives commonly apply against libertarian-liberalism—that the system considers man only as an individual, that it is “rational” and “atomistic.” To the new conservative, “community” is all. Indeed, Mr. Kirk makes merry over a group of people who wished to found a society of individualists; in considering their choice of intellectual mentors, he declares:
These same gentlemen (who profess to be individualists, but are really conservatives in their impulses) cried up a pantheon of philosophers after their taste: Lao-Tse, Zeno, Milton, Locke, Adam Smith, Tom Paine, Jefferson, Thoreau, John Stuart Mill, and Spencer. No thinking conservative would be much inclined to pull these chestnuts out of the fire for the sake of the commonwealth. I suggested that if they were to substitute Moses or St. Paul for Lao-Tse, Aristotle or Cicero for Zeno, Dante for Milton, Falkland for Locke, Samuel Johnson for Adam Smith [!], Burke for Paine, Orestes Brownson for Ralph Waldo Emerson, Hawthorne for Thoreau, Disraeli for Mill, and Ruskin [—yes, Ruskin!] or Newman for Spencer, then indeed they might make the dry bones speak, and kindle the imagination of the rising generation.6
This alone would be enough to validate the thesis of this article: new conservatism has nothing to do with the individualism of American libertarian-liberalism; the inclusion of a socialist and a few rabid monarchists in the renovated pantheon indicates that it is as hostile as it ever was to individualism. And why not? The individual, especially the innovator and the dissenter, is hostile to the ideas of “order” and “tradition”; he prefers to cut his own way. In so doing, he may increase the good of all, but this idea never occurs to our tradition-minded gentlemen.
NOR ARE conservatives content only to celebrate the existence of such anti-individualists; they are ready and willing to proceed to further impositions on individual liberty. Willmoore Kendall is unalterably opposed to the open society; Revilo Oliver sees in an established church—preferably high-church Episcopal or Roman Catholic—the salvation of America; all join in supporting the House Committee on Un-American Activities, despite its questionable status in a nation dedicated to a Rule of Law. Yet such people assure us that they stand for “liberty” over equality. In contrast to these opinions, the ideas of H. L. Mencken, who was not only an anti-communist but an anti-democrat, bear repeating:
“I believe in only one thing and that thing is human liberty. If ever a man is to achieve anything like dignity, it can happen only if superior men are given absolute freedom to think what they want to think and say what they want to say. I am against any man and any organization which seeks to limit or deny that freedom.” Mr. Mencken was speaking to Hamilton Owens at the time, and when Owens asked if Mencken would limit freedom to superior men, Mencken replied that “the superior man can be sure of freedom only if it is given to all men.”7
Together with his strong dislike of individualism, the new conservative is contemptuous of the system it produced: the laissez-faire economy. Indeed, one can find almost any other kind of system outside of socialist collectivism praised in their writings—despite the fact that such systems are unworkable. Some, like Mr. Kirk, would restore “community” to economics. What is needed is a higher morality, a “humanity,” among businessmen and workers. The new system will not be socialism, for private property will be preserved, not interventionism, for there will be no need for government intervention in a “moral economy,” nor capitalism, for the profit motive will be supplanted by conscience. While such a system is, apparently, non-coercive, it makes an assumption which is almost as unjustifiable as that which suggests that all men can be made reasonable: that all men can be made moral. If it seeks to bring such a moral millennium about by force, it will fail. We need only consider Prohibition to understand the futility of attempts to enforce a morality above and beyond the normal laws needed for cooperation in society.
Plans to restore “community and morality” overlook one fact: such things can only come about, if they ever existed, spontaneously; they cannot be forced.
AN ECONOMIC system more pleasing to new conservatism is corporatism. Rejecting even the idea of a “moralized” capitalism, our new conservatives seek the solution of the “problem of community” in the guild system of the Middle Ages. Whether the system advocated is “guild socialism” or corporatism, the idea is easy enough to grasp. Each branch of business forms a monopoly which is fully autonomous; the only purpose of the state is to settle quarrels between different bodies. Unfortunately, such a system ignores the fact that the market cannot be divided in such a manner. It serves to protect inefficiency and prevent the diversion of capital to other uses where it would be more productive. Under such a system, the worker qua worker might enjoy a feeling of security and community; as a consumer, however, he would suffer.
In the times it has been tried, either it has failed miserably, as in the case of the American NRA experiment, or has resulted in continued bureaucratic control, as in the case of Italian Fascism. We are offered a new collectivism, a collectivism of the Right, to save us from both the “inhumanity” of capitalism, with its “rootless individualism,” and the collectivism of the Left. In offering this as a substitute for the market economy, the new conservatives, for all their wrath against rationalist planners, become planners themselves and triflers with individuals.
It will be objected that my notion of freedom is dangerous in that it ignores values. Such an objection indicates a basic misunderstanding of the concept of freedom. It does not posit the right of every man to act as he pleases; such is a definition of license rather than freedom. Freedom is the right of an individual to think and act as he pleases, so long as he bears full responsibility for his actions, and refrains from using coercive or aggressive force against the life, liberty or property of any other. Such a definition implies the recognition of a set of absolute values binding on all men which govern interpersonal relations; it implies the existence of personal values in each individual and it implies that these values serve as a standard of right and wrong, which serves to fix responsibility for actions. It is the new conservative who is more of a relativist, for he, in his search for order, would destroy such standards. Why else the desire for an “American Mussolini”? Why the hatred of capitalism and the support for socialistic measures which have, at times, marked conservatism? It is a desire for freedom for an elite, rather than a desire for freedom for the individual, that provides conservatism with a relativistic outlook.
This, then, is the new conservatism: a doctrine which is not only anti-rationalistic, in that it opposes the wild dreams of the planners, but anti-rational, in opposition to all reason; it holds to a creed of anti-individualism and anti-capitalism; in its search for “order,” it embraces a relativism of its own. It has its sources in neither the libertarianism of an Albert Jay Nock nor in the constitutionalism of a Liberty League; if we seek its sources, we find them in the ludicrous union of the New Humanists, Eliot, More and Babbitt, on the one hand, and southern agrarians, men such as Robert Penn Warren, on the other, well salted with mediaevalists, Distributists and followers of the socio-ethical theories of Carlyle and Ruskin. In its extremes, it either drifts over to a fascism or, in its attempts to reject capitalism, to a mild socialism. Indeed, so foreign are its principles to those of libertarianism that it can be hailed by the Left as a good. Mr. Kirk quotes, with some pride, the words of one interventionist on the new conservatism.
. . . Mr. Arthur Schlesinger, Jr., writing in the quarterly journal Confluence, remarks that “the aim of the New Conservatives is to transform conservatism from a negative philosophy of niggling and self-seeking into an affirmative movement of healing and revival, based on a living sense of human relatedness and on a dedication to public as against class interests, all to be comprehended in a serious and permanent philosophy of social and national responsibility.”8
In short, the new conservatism is not what most people would call “conservative” at all; it favors, not freedom, but an exchange of power, from the present bureaucrats to an “aristocratic elite.” In calling the attention of individualists to its beliefs and dogmas, I am not trying to attack the Right, or cause a “schism”; I am trying to point out that the doctrines of individualism are being misrepresented and that those who are misrepresenting these ideas are doing more harm than good, and should be repudiated.
WHAT YOU CAN DO TO HELP NIR . . .
During the past year, the circulation and staff of NEW INDIVIDUALIST REVIEW have been expanding rapidly. This journal is now being sold at many local newsstands and at over 40 colleges and universities. Despite a few dissenting notes, the general reaction of libertarian and conservative leaders has been favorable. The author of “The Conservative Mind,” Prof. Russell Kirk, for instance, has said that NEW INDIVIDUALIST REVIEW is a work of “genuine intellectual power,” and the editor of “National Review,” William F. Buckley, Jr., has called it “by far the best student magazine on our side of the fence.” If you agree that this is a useful magazine which ought to be read by more people, there are four things that you can do to further the growth of libertarian-conservative ideas.
(1) You can urge your college library or your local public library to subscribe. A library subscription makes an excellent donation since it may introduce the magazine to dozens of people.
(2) You can urge your friends to subscribe or to donate subscriptions to students.
(3) If you are a college student, you can volunteer to act as our representative on your campus.
(4) Our student subscription price ($1.00 a year) does not cover the cost involved; this price is purposely kept low to encourage as wide a readership as possible among undergraduates. Our deficit is made up by voluntary contributions from individuals. Any donation which you might be able to afford at this time would be gratefully received. None of our staff, by the way, receives any remuneration of any kind.
Individual Freedom and Economic Security
THE POPULAR slogan “freedom from want,” which links freedom and economic security, is either an anachronism or a semantic illusion which helps to becloud the socio-economic problems which face the free world.
There was a time when individual freedom and economic security went hand-in-hand. In a world of general insecurity—out at the fringe of the Frontier—the freedom of the individual to make the fullest possible use of his physical and mental resources was the only assurance he possessed that he would survive in a hostile world. Similarly, at the time of John Locke, the late 17th century English defender of the rising middle class, freedom from government oppression and repression enabled the individual to use his faculties in order to acquire property and thus gain economic security and social status. The economic well-being of the middle class was dependent upon freedom of enterprise, freedom of trade and protection of private property, all three of which had been previously hampered by absolutism and mercantilism. It was for that reason that Locke argued that “life, liberty and property,” are inherent rights of the individual which no government could, or should take from him.
This, however, is not the type of personal freedom and economic security which the advocates of the modern “freedom from want” slogan have in mind but, rather, a policy under which the individual surrenders much of his personal freedom as a producer and consumer in return for economic security provided by a more or less powerful and benevolent state.
This is obviously not a new idea. The same trend of thought regarding freedom and security prevailed during the troubled decades of the declining Roman Empire. The entrepreneurial middle class was increasingly subjected to rigid government controls; the acquisition of personal wealth was pictured by writers of the time, both Christian and pagan, as a useless, if not immoral occupation; and slaves and free peasants alike came under the “protection” of the large landowners. By the 5th century, the economic liberalism which had prevailed in Rome during its centuries of power had given way to an economic system characterized by those features which in the following centuries blossomed into feudalism and serfdom. No doubt the slaves and serfs of this period, if theirs was a good master, enjoyed a fairly high degree of economic security, considering the general low standard of living of the time. But good masters have a tendency to turn into tyrants, and who assures the people of today, who rely on the benevolence of the welfare state, that it will remain benevolent?
More than any other industry, American agriculture has been the beneficiary of government aid for many years. But instead of solving the problems of agriculture, government aid has deepened and prolonged them. Here is what the President of the American Farm Bureau Federation, one of the two large farm organizations in America, has to say about the effects of government interference: “America has been known as the land of opportunity, but opportunity depends upon freedom, and freedom means individual responsibility—not the rule of force by government. The government interventionist abandons freedom of choice because he is contemptuous of the ability of individuals to know what is best for them.” And Mahatma Gandhi, the great Indian philosopher and humanitarian, warned: “While apparently doing good by minimizing exploitation, the state does the greatest harm to mankind by destroying individuality which is the root of all progress. . . . The state is a soulless machine; it can never be weaned from violence to which it owes its very existence.”
The rapid economic growth during the 19th century, the rising standard of living and the unprecedented expansion of personal freedom coincided with, and to a large extent were due to, the rise of the entrepreneurial middle class which followed the overthrow of 17th and 18th century collectivism through the Glorious Revolution of 1688 in England, the American Revolution of 1776, and the French Revolution of 1789. Only after political and economic power had passed from the aristocracy and its proliferating bureaucracy to the middle class, whose vitality and strength sprang from personal freedom and private initiative, did the western world achieve its greatest economic, social and political advance.
Yet today we are turning away from the ideals of 19th century liberalism, which during the past 150 years have turned the western world from “underdeveloped countries” plagued by poverty and hunger into the “affluent societies” of today. What is causing the declining faith in personal initiative and freedom and the widespread demand by the people for “freedom from want” on the one hand, and on the other, the promise of the government to provide the country with a “great living program of human renewal?” Technological, institutional, social, political and cultural conditions have changed rapidly during the past 50 years; and rapid changes make for insecurity, which is undoubtedly one of the reasons for the craving of the millions for security provided by the state.
IN A WORLD of handicraft and small shops, of small towns and small farms, man was able to make a living without too much dependence upon the rest of society. He produced—or at least could produce—the food he needed. Most of his clothing was homemade, his house was not filled with gadgets which only an expert could keep in repair, home remedies were used in times of illness, and the aged found a spare room in their children’s home. Life was, of course, neither idyllic nor comfortable, if measured by modern standards. There was little variety in food, houses were hot in summer and cold in winter, sanitary facilities were primitive, the infant mortality rate was twice as high as it is today, and the chances of surviving a major illness were poor. But these material shortcomings of the “good old days” were offset by two great advantages: man could, and to a large extent did, preserve his economic independence, at least in the United States, and whatever man created by his hand or his mind was an expression of his own spirit and his own personality—and his personal property.
Science and technology have wrought a radical change. They have taken from man’s shoulders many of the burdens he carried for thousands of years. They have flooded the western world with consumer goods, have raised the quality and quantity of our food and raiments, have put dozens of intricate gadgets into our homes which replace human labor, have produced sanitation and medical knowledge which have doubled man’s lifespan; but they have also deprived man of his economic independence and his individuality as producer and consumer.
URBANIZATION, like mass production, produces interdependence. A subsistence farmer can survive a depression; an unemployed industrial worker, living in a metropolis far removed from the soil, cannot survive by his own resources. A strike of elevator operators or tugboat crews can paralyze New York. By destroying a power line, a windstorm can deprive hundreds of people many miles away of heat, water, light and their ability to cook a meal. Having thus lost the economic basis of his independence, his ability to survive without the coordinated effort of society, modern man naturally seeks security, and only a strong and well organized state seemingly can assure the necessary order to prevent economic chaos and give the individual the security which he lost when he traded his hand-tools for a place at the assembly line, and his subsistence farm for an apartment in the big city. As the dependence of the individual upon nature in an agrarian society gradually shifted to increasing reliance of the people on government in an industrial society, man’s attitude toward government intervention changed.
But the changes in modern man’s mode of living do not tell the full story. Man’s outlook on life is no doubt influenced by his physical make-up, his environment, and his religious beliefs; but neither the flow of glandular secretions, nor subconscious influences of a Freudian nature, nor theological doctrines, nor—contrary to Karl Marx—man’s physical surroundings can prevent man from choosing one course or another. Being free to make decisions distinguishes man as a moral being from an electronic computer. The same holds true of nations. At the end of the war, England was partly destroyed and generally impoverished; devastation and poverty were infinitely worse in Germany. Both nations were free to choose between welfare state socialism and free enterprise. England adopted the former, and under the rule of the Labor Government her economy skidded from one near-crisis to the next. Western Germany, on the other hand, having learned from fifteen years of bitter experience the consequences of statism and economic interventionism, turned to free enterprise, and within less than a decade the world began to talk of the “German miracle.” Just as the British and the Germans were not forced by circumstances to adopt the policies which they chose, industrialization and mass production do not compel the American people to turn to welfare state socialism.
THERE ARE other and deeper reasons, however, which help to explain the politico-economic shifts of the recent decades. During the past 50 years western civilization has experienced a radical change in its concept of the individual and of the nature of society. For two hundred years, since the days of the Enlightenment, man had been regarded essentially as a rational being, able to judge what was best for him. Modern psychology no longer subscribes to this assumption. Freudian man is impelled by subconscious drives; his actions are supposedly often irrational; and with the population growing at an extremely rapid rate during the past 100 years and with the spread of mass-education, the masses have swamped the intellectual and cultural elite. One writer speaks of the “vertical barbarian invasion” of western civilization by the masses, resulting in an overemphasis on material goods and a disregard of basic cultural values; the desire for security on the part of the Massenmensch has overwhelmed the demand for personal freedom of the superior individual.
“Our people in a body are wise,” wrote Thomas Jefferson to Joseph Priestley, “because they are under the unrestrained and unperverted operation of their own understanding.” “I believe, as I believe in nothing else, in the average integrity and the average intelligence of the American people,” wrote President Wilson more than a hundred years later.1 This was not campaign oratory; Jefferson and Wilson believed in the rationality and goodness of man.
The spirit has changed in the past 30 years. “Without results we know that democracy means nothing and ceases to be alive in the minds and hearts of men,” warned the President’s Committee on Administrative Management in 1937.2 Government had come to look upon the people as being not primarily interested in abstract ideals—“freedom buys no milk for the baby” was a popular phrase during the depression years—but in material results, in security. As Lord Keynes put it: “In the long run we are all dead.” Instead of assuming that man is able to judge the nature and long-range effects of a government policy, which is the basic assumption upon which democracy rests, we have come to “tax and tax, and spend and spend, and elect and elect,” because, in the words of Harry Hopkins, one of the best-known exponents of New Deal philosophy, “the people are too damned dumb.”
Just as the concept of man has changed during the past few decades, so has our social philosophy.
THERE ARE two basic concepts of society, the atomistic and the organic. We can postulate that society consists of a large number of small independent atoms, namely human beings, who represent ultimate reality, while the state is merely an agglomeration of individuals. This was the premise of 19th century economic and political liberalism. The state, having no independent existence of its own, was not expected to provide either welfare or economic security. In vetoing an appropriation of $25,000 to buy seed corn for drought-stricken farmers, President Cleveland wrote: “There is no warrant in the Constitution of the United States for taking the funds which are raised from taxes and giving them from one man to another . . . while the people support the government, the government does not support the people.”
On the other hand, we can picture society as an organism, somewhat like the human body. Human beings then become cells of the social organism, unable to exist by themselves. Society alone can provide the necessary security. The emphasis thus shifts from the individual to the group. The latter becomes ultimate reality and the individual and his wants are subordinated to the needs and wants of “society.” This was the philosophy which prevailed throughout the Middle Ages and down to the 18th century, and which was gradually replaced by the atomistic view. In our own days we seem to have returned to a large extent to the organic concept of society.
The growing dependence of the individual upon society, the declining faith in the rationality of man, and the growing emphasis upon society rather than the individual have resulted in the development of a new branch of economics, macro-economics, which today accounts for almost two-thirds of the typical college economics curriculum.
While nineteenth-century economics concerned itself primarily with the economic problems which confronted the individual and the firm, such as price and value and the distribution of income, twentieth-century economists are primarily interested in the problems which affect the nation as a whole: growth and stagnation, monetary and fiscal policies, welfare economics, and economic planning.
In the construction of their analytical models, nineteenth-century economists made two basic assumptions: (1) The mainsprings of economic activity were thought to be rational individuals who were intent on maximizing their income while minimizing their efforts; and (2) the economy was assumed to be governed by so-called “economic laws,” conceived as somewhat similar to the causal laws of Newtonian physics, which, if not interfered with by man, would produce an economic equilibrium which most economists assumed would provide the greatest good for the greatest number. Twentieth-century economics has largely dropped both assumptions. The rationality of man and the profit motive are regarded as premises which are neither helpful nor realistic; and the notion that automatic forces, if not interfered with, would automatically promote the greatest general welfare has been replaced by the modern postulate that the economy is constantly in need of state intervention. The “perfect machine,” our social organism, which the Enlightenment thought had been created for the benefit of man by an all-wise and all-kind God, appears to the twentieth century to be badly in need of repairs by economic planners.
While the term macro-economics is the creation of our time, and while the approach is new as far as the twentieth century is concerned, modern macro-economics offers many parallels to the mercantilism of the absolutist age. There is, however, one striking difference. While economic planning in the totalitarian countries is directed almost exclusively toward the strengthening of the state, at the expense of the freedom and well-being of the people, just as it was under mercantilism 300 years ago, economic planning in the western world is directed primarily toward the improvement of the economic status of those members of society who seem unable to achieve by their own efforts the standard of living which the government regards as appropriate.
THE MOST significant move in this direction has been the Employment Act of 1946. Its basic philosophy was outlined by President Roosevelt in a message to Congress in 1944 and restated in the Democratic Platform of the same year. It calls for the federal government to provide “full employment for the unemployed and guarantee . . . a job for every man released from the armed forces and the war industries at fair pay and working conditions.” A year later, Secretary of Commerce Henry Wallace restated the principle that “the essential idea is that the federal government is ultimately responsible for full employment,” and the Employment Act itself provided that “it is the policy and the responsibility of the Federal Government to use all practical means . . . to promote maximum employment.”
The Employment Act of 1946 did not immediately impair the personal freedom of the American people and, since cause-and-effect relationships are often not clear to the untrained observer, the eventual loss of personal and economic freedom, which may well be the result of the job security created by the Employment Act, has not yet dawned upon the great mass of the American people.
Employment depends upon demand, demand upon prices, and prices, in turn, upon the cost of production. As wages rise, the cost of production increases, and as prices rise, demand declines, which results in a drop in employment. Under the Employment Act, however, the Federal Government is required to prevent unemployment, and it can do so by creating additional demand through the expansion of credit and the creation of more money, which means inflation. While the labor unions need not worry any longer that an increase in wages might lead, via higher prices and declining demand, to unemployment, the price which the American people as a whole, including the workers, have paid for this type of job security has been high. The cost of living has increased steadily since the end of the war. The retail price index stood at less than 77 in 1945 and had almost reached 129 by the end of 1961. Since 1950, moreover, the United States has steadily suffered a deficit in her balance of payments. Since the “gold crisis” in the fall of 1960, the dollar has been dependent upon international support in order to prevent a run on the sharply reduced gold reserves of the nation. The weakening of America’s international monetary position has affected her political stature as well. While the totalitarian countries directed their economic planning toward the strengthening of the state, economic planning in the western nations has been aimed thus far primarily toward a higher standard of living for the people, even though, as we have come to realize today, this has resulted in a relative weakening of the American position in world affairs.
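The cited index figures imply a substantial cumulative rise in the price level. As a rough illustration of what those numbers mean (the index values of 77 and 129 are the article's; the arithmetic below is merely an illustrative check, not part of the original text):

```python
# Illustrative arithmetic on the article's retail price index figures:
# index ~77 in 1945, ~129 by the end of 1961 (16 years).
start, end = 77.0, 129.0
years = 16  # 1945 -> 1961

# Total rise in the index over the period.
total_rise = (end - start) / start          # about 0.675, i.e. ~67.5%

# Equivalent compound annual rate of increase.
annual_rate = (end / start) ** (1 / years) - 1  # about 3.3% per year

print(f"total rise: {total_rise:.1%}, annual rate: {annual_rate:.1%}")
```

On the article's own figures, then, the cost of living rose by roughly two-thirds over sixteen years, an average of a bit over three percent compounded annually.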
Obviously, this trend cannot continue indefinitely if we are to survive as a nation. The American people are confronted with a far-reaching choice. Will there be a turning away from the welfare state policies of the past decade and a return to greater reliance on private initiative, or shall we increase further the power of the state, as Rome did in its time of trouble?
When Diocletian came to power toward the end of the third century, after decades of civil wars, the weaknesses of the Roman Empire were far more pronounced than are the difficulties which face the United States today. Inflation had progressed much further and the Roman economy was stagnating, while the American economy of today is booming. But Rome had one advantage: while it had many border conflicts—the Koreas, Laoses and Congos of those days—it was not challenged by a major power comparable to modern Russia.
Diocletian’s efforts to restore the strength of the Roman Empire followed methods which are quite similar to those which economic and political leaders suggest today. In order to stop inflation, he “fixed” prices and wages by law, with the death penalty for those who violated the ceilings; made local trade associations responsible for price maintenance and in some instances for production quotas; and froze workers, especially farm laborers, in their jobs. His “emergency” measures designed to restore economic stability at the beginning of the 4th century affected the freedom of the people of Europe for 1500 years. The trade associations, which Diocletian entrusted with price maintenance, set the pattern for the medieval guilds which regulated prices, production, working conditions, and entry into the trade. While these guilds could well have developed without Diocletian’s reforms, the latter helped to set the pattern. Farm workers, whom Diocletian froze in their jobs to assure the necessary food supplies, eventually became the medieval serfs.
Lord Beveridge, the father of the full employment philosophy, recognized at the outset that a full employment policy can easily produce chronic inflation and suggested that the government, if necessary, freeze wages and prices and regulate production and employment, i.e., tell the entrepreneur what and how much to produce, and the worker where he can work and at what wages. The attempt of the 1940’s to provide job security may thus very well lead, in the 1960’s, to federal wage and price regulations, and, when these fail, to more rigid regulations of production and employment.
TWO OTHER Congressional efforts to assure economic security—and one can, of course, list many more—illustrate the danger that man can easily lose in freedom as he gains in security. The National Labor Relations Act of 1935 was intended to protect the worker against oppression and exploitation by the employer, but an integral part of the same legislation was the creation of powerful unions and of the union shop provision—or its various equivalents—under which a worker has to pay union dues in order to retain his job. The courts have even held that if a worker, during a national election, advocates a view opposed by his union, he can be expelled from the union even though this may automatically result in his discharge from his job and may prevent him from getting another job in the occupation for which he is trained.
The farm aid program is designed to provide the farmer with “parity” income by placing a floor under farm prices. Aside from the fact that the program has failed to provide a minimum of income security for the small farmer—the average subsidy for about 3.5 million small farmers amounts to less than $125 a year—the American farmer had to surrender a substantial portion of his personal and economic freedom in exchange for the security which the program was intended to provide.
In Wickard vs. Filburn3 and in a number of other decisions, the Supreme Court has ruled that Congress can not only regulate the flow of farm products in interstate commerce, but that it can also prescribe how much a farmer may raise on his own farm for his own consumption. Acreage allotments, production and marketing quotas—all of them restrictions on the freedom of the farmer to use his land and his ability to produce—are the logical concomitant of the attempt of the federal government to place a floor under farm prices, and thus provide “security” for the farmer.
IN THEIR search for “economic security” the American people are in danger of chasing a phantom. We live in a world of rapid technological and economic change which results in insecurity for the investor, the entrepreneur and the worker. Growth means change, and change means insecurity. Yet the same people who advocate maximum economic growth very often also clamor for maximum economic security.
Economic security is relative. Simpler societies of the past enjoyed a feeling of abundance and security, while we, despite our riches, are plagued by the frustrating sense of want and insecurity, because we have replaced the more or less stable requirements of necessities with spiraling desires for more goods and services. Nobody questions the need for “minimum security.” As St. Thomas wrote 700 years ago, “A minimum of comfort is necessary in life for the efficient practice of virtue.” The danger lies in the fact that the American people, and for that matter, western civilization as a whole, may succumb to the temptation of trading their individual freedom for the promise of economic security and an ever rising standard of living.
Mass production and urbanization, which impose upon us interdependence, make us security-minded, and the intellectual climate of our time seems to favor the same trend. We have gone a long way during the past 50 years toward trading away our personal freedom, and the 1960’s are not likely to bring a reversal of the trend. In fact, men like Professors Hansen and Galbraith assure us that henceforth the well-being of the American people as a whole will be advanced more by “social” than by “individual” consumption. Instead of permitting the individual to “fritter away” his income on tail-finned cars, martinis and hula-hoops, the government should channel, via increased taxation, an ever greater share of the national income into social consumption: schools and hospitals, recreational facilities and roads. After all, we are told, America spends almost three times as much on liquor, tobacco and cosmetics as on education; five times as much on dog food as on college textbooks. Perhaps our standard of values, as reflected in the buying pattern of the consumer, is basically unsound. But will it be changed for the better, if we transfer the responsibility for the allocation of the nation’s resources—together with the people’s freedom as producers and consumers—to government officials, politicians and “experts?” This is the fundamental question which faces the American people during the 1960’s.
New Individualist Review welcomes contributions for publication from its readers. Essays should not exceed 3,000 words, and should be type-written. All manuscripts will receive careful consideration.
Sin and the Criminal Law
The freedom we enjoy extends also to ordinary life; we are not suspicious of one another, and do not nag our neighbor if he chooses to go his own way.1
He who imagines that he can give laws for the public conduct of states, while he leaves the private lives of citizens wholly to take care of itself; who thinks that individuals may pass the day as they please, and that there is no necessity of order in all things; he, I say, who gives up the control of their private lives, and supposes that they will conform to law in their common and public life, is making a great mistake.2
WHILE ONE would expect the criminal arm of the New York City bureaucracy to be fully occupied with more momentous problems, it was able in February of this year to find time to hail a group of girls before the bar of justice for “glue sniffing,” allegedly for the narcotic effect. Though we think of the criminal law as designed to protect individuals against the acts of other individuals, these girls were charged with “impairing their own health and morals.” In a nation which allows persons to be punished on such grounds, those of us who are interested in the maximization of personal liberty should consider whether John Stuart Mill’s maxim that the state should not interfere in the private acts of individuals is a minimum condition for a free society.
The maxims are, first, that the individual is not accountable to society for his actions, in so far as these concern the interests of no person but himself. Advice, instruction, persuasion, and avoidance by other people if thought necessary by them for their own good, are the only measures by which society can justifiably express its dislike or disapprobation of his conduct.3
Most conservatives will be quick to agree with Mill’s refusal to let the law interfere in the private sphere when some new economic regulation is under consideration. Ambassador Galbraith’s vendetta against tailfins and his desire to protect the consumer against his imprudent free choice, Ambassador Stevenson’s indignation at the “myth of privacy,” and Social Security taxes designed to force workers to provide for themselves in their old age, are all greeted by the “right” as attempts to subvert our liberty. Yet, conservative “defenders of freedom” are often found shouting the loudest for coercion to enforce their particular moral concepts or standards of decency in situations directly affecting only those who are voluntary parties to an act. (And, to compound the paradox, it is often the interventionist liberal who most vigorously upholds personal freedom in these areas.)
At the risk of offending those who not only support these prohibitions but also feel they are too delicate to be discussed in a scholarly journal, I invite the reader to examine an area of law which is all too often ignored in analyses of the role of the state. If adherents to the libertarian-conservative philosophy of limited government are to deserve the intellectual respectability which they claim, they have a duty to develop a more consistent practical application of that philosophy.
LIKE MOST important intellectual controversies, the proper role of state intervention in the private affairs of citizens was a point of vigorous dispute among the early Greeks. Little is known of that Greek libertarian tradition which received one of its most eloquent formulations in the funeral oration of Pericles and seems to have been advocated by Democritus, Protagoras, and Lycophron.4 Lycophron is quoted by Aristotle as demanding that the state should be merely a “covenant by which men assure one another of justice” and a “co-operative association for the prevention of crime.” In reaction to this first known advocacy of an “open society” came the totalitarianism of Plato, who urged the complete regulation of every important aspect of the citizen’s moral life as a necessity for a “virtuous society.” Though seeing more value in freedom of action than Plato, Aristotle still saw “virtue” as the end of the polis, an end to which the secondary value of freedom of action readily gave way. While this Platonic-Aristotelian tradition triumphed over its libertarian alternative and was not successfully attacked until recent centuries, the early Christian and medieval period gave far more deference to individual freedom than is commonly supposed. St. Augustine viewed the state as completely unable to improve the moral constitution of its already corrupted citizens and consequently advocated severe limitations on state intervention. While St. Thomas, like Aristotle, saw virtue as the principal end of the state, he emphasized the importance of “individual autonomy” and the necessity of allowing free choice between right and wrong. The last few centuries have seen an erosion of this Platonic-Aristotelian tradition and a return in the Protestant countries to the social contract and policeman state concepts of Lycophron. 
Ironically, most Catholic countries do not proscribe many of those acts which are penalized in some supposedly more liberal Protestant nations: gambling, use of alcoholic beverages, homosexuality, adultery, and fornication.5
The early nineteenth century saw England, under Benthamite influence, gradually move toward the position that the state should not interfere in the private lives of its citizens. It was, however, in the United States that lovers of freedom placed their greatest hopes. Lord Acton was prompted to write:
Europe seemed incapable of becoming the home of free states. It was from America that the plain idea that men ought to mind their own business, and that the nation is responsible to Heaven for the acts of State, burst forth like a conqueror upon the world they were destined to transform, under the title of the Rights of Man.6
Unfortunately, the “home of free states” has had to endure periods of legal moralizing on such a hysterical scale that even Plato might have blanched to see it. Characteristic of the “yahooism” so vividly described by H. L. Mencken was the career of Anthony Comstock, who in the 1870’s led a nationwide campaign for legal proscription of conduct which he considered sinful. He succeeded, as leader of the New York Society for the Suppression of Vice and later as special agent for the Post Office, in preventing the circulation through the U. S. mails of birth control information and other literature considered by persons of his turn of mind to be obscene or immoral. This jehad against sin which characterized the Populism and Bryanism of the turn of the century had not even the mitigating value of being directed by a Platonic elite; rather it was a grassroots movement based largely on religious moralizing and a dislike of the “strange ways” and religious trends in the urban centers of the East.
The crowning glory of this radical democracy was the Prohibition Amendment, which stands as one of the most striking examples in the history of a free nation of an attempt by one group to impose its personal mores on others. Since the beginning of the New Deal and the repeal of Prohibition in all but a few states, government action against private immorality has tended to decline. The grassroots citizenry has since turned to the more lucrative pastime of regulating the economic lives rather than the moral lives of others. Though state regulation of private conduct is at present far less pervasive than economic intervention, it is still extensive enough to constitute a serious limitation on freedom.
A long and comprehensive list of the areas of personal morality regulated by law, from federal statute down to minute municipal ordinance, could easily be composed, but some of the more far-reaching are: (1) prohibitions against the sale of birth control devices or the dissemination of birth control information, (2) statutes forbidding the intermarriage or cohabitation of persons of different racial groups, (3) statutes classifying suicide or attempted suicide as a crime and classifying voluntary euthanasia (the killing of a suffering and hopelessly incurable person at his own request) or otherwise aiding a person in committing suicide as murder, (4) prohibition of fornication, prostitution, and homosexual and other unnatural sex acts between consenting adults, (5) prohibitions against the sale of narcotics and alcoholic beverages, (6) statutes prohibiting gambling, (7) censorship of allegedly “obscene” or “immoral” books or movies.7
Even should we accept the classical liberal maxim that a person should not be punished for his own “sins” as long as no non-consenting party is injured, other problems are raised by the laws mentioned above. If we hold that the law should not protect adults from themselves, we still must deal with the thorny problem of special protection for juveniles, the insane, and possibly the feeble-minded. Should the sale of certain narcotics or alcoholic beverages be likely to lead to crime, it has been argued that legal prohibition is needed to protect innocent parties. Moreover, are persons to be born entitled to any protection? Should measures be taken to protect their genetic stock from deterioration? In examining the extent of government coercion in these areas of private conduct, I will attempt to show that almost none of this legislation can be justified as preventing harm to juveniles or innocent third parties. I conclude with a theoretical proposal concerning the proper delineation of the private sphere from the sphere of allowable government intervention.
NOT ONLY IS the Comstock Act, which prohibits distribution of contraceptives or birth control information for other than medical purposes through the mail, still federal law, but twenty-two states also prohibit or limit the sale of contraceptives. In five states, Connecticut, Kansas, Massachusetts, Mississippi, and Nebraska, the statute makes no exceptions; even if pregnancy would result in death or serious injury to the wife, the sale of contraceptives or dissemination of information still can technically land the party in jail. In Connecticut the use of contraceptives is also a criminal offense. While the status of these laws is unclear in Mississippi, Kansas, and Nebraska, high courts in Connecticut and Massachusetts have held that the health or life of the wife does not prevail against legislative fiat.8 To any future widowers as a result of this statute, the highest Connecticut court offers this brutal condolence: the legislature left them free to practice the alternative of “abstention.” The United States Supreme Court has thus far avoided testing the constitutionality of these statutes, but it will soon be forced to reach a decision. On November 1st of last year a Planned Parenthood Center was opened in New Haven, Connecticut, to disseminate birth control information. Ten days later officers of the center were arrested. There seems to be no gimmick by which the Supreme Court can avoid deciding this case, now on appeal, on the constitutional issues involved.
These anti-birth control laws, though reduced in importance because of the reluctance of officials to enforce them, (except when their hand is forced, as in New Haven, by deliberate publicity) make a travesty of that “plain idea that men ought to mind their own business.” No injury is claimed to third parties, unless one is willing to argue that a potential person deprived of existence is somehow entitled to legal protection. Protection of juveniles seems to raise no special problem, as it might in the case of narcotics and alcoholic beverages. These laws seem to be a clearcut example of a partially successful attempt to impose a personal religious and moral code on dissenters.
THE SUPREME COURT has also avoided passing on the constitutionality of state laws which touch on one of the most personal decisions in an individual’s life: the choice of a spouse. Twenty-two states, six of them outside the South, still prohibit interracial marriages. While one-eighth Negro blood is enough to constitute a person a Negro in most of these states, in Georgia and Virginia an “ascertainable trace” is sufficient.9 The record of such prohibitions is sanctified with age; a Jamestown ordinance proclaimed that a white man should be publicly whipped “for abusing himself to the dishonour of God and the shame of Christians by defiling his body in lying with a Negro.” After passage of the Fourteenth Amendment, state courts generally upheld the constitutionality of these statutes. The Alabama Supreme Court summed up the usual rationale for these laws in an 1877 case:
The natural law, which forbids their intermarriage and that amalgamation which leads to a corruption of races, is as clearly divine as that which imparts to them different natures.10
The California Supreme Court is the only high court which has thus far struck down such a statute as violating the federal constitution.
Unlike some “moral legislation,” which is allowed to lie dormant on the statute books, these statutes are still often vigorously enforced, especially in the South. The depravity to which “protectors of the white race” will resort in enforcing these statutes is illustrated by the fate of Davis Knight, who at the age of twenty-three was sentenced to five years imprisonment in Mississippi for marrying a white girl. He was classified as a Negro because his great-grandmother was a Negro, a fact which neither he nor his parents knew at the time of his marriage.11
These fantastic laws fail to arouse the attention they deserve both because they affect only a small group of people and because they have for so long been a part of the judicial landscape. In a nation where social security or the banning of a Communist speaker brings forth cries of tyranny, one might expect more indignation at prohibitions in this area of private choice. If we require injury to innocent parties as the basis for criminal liability, these laws clearly cannot be justified on the grounds by which they are usually defended: that intermarriage is inherently sinful, contrary to the Bible, or opposed to natural law. Likewise, the notion that the “white race has a right to protect itself” is necessarily a collectivist and mystical sentiment which can hardly give justification for criminal sanctions if freedom is a serious goal. These laws are defended by the more sophisticated on a eugenic basis: intermarriage will allegedly lower the genetic quality of unborn generations; hence anti-miscegenation laws prevent injury to unborn persons. But even if this argument is admitted to have a sound scientific basis, few if any of the proponents of this legislation would admit the general principle that the state may regulate marriage whenever justified by “scientifically proven” eugenic principles. As a general principle, the choice of a spouse is one decision almost all of us want left to the individual, regardless of the genetic consequences; it is only when the additional prejudices and fears surrounding intermarriage are involved that we allow such restrictions.
PERHAPS THE ACID test of one’s belief in freedom of choice arises when we are asked: does a person have the legal right to terminate his own life? Further, may a person, without incurring criminal liability, assist another in terminating his own life? Though our Christian tradition has regarded suicide as a heinous crime, authorities have been understandably vexed in applying sanctions. By canon of King Edgar in 967, English suicides were denied burial rites; this tradition was later embellished by burial on a highway with a stake through the heart, coupled with forfeiture of the suicide’s possessions to the Crown. In a famous early case the court outlined the reasons for treating suicide as a crime:12 (1) it is against nature “because it is contrary to the rules of self-preservation, which is the principle of nature, for everything living does by instinct of nature defend itself from destruction, and then to destroy one’s self is contrary to nature and a thing most horrible,” (2) to kill one’s self is a breach of God’s command “thou shalt not kill,” (3) the king loses a subject, “he being the head has lost one of his mystical members.”
Classification of suicide as a felony is not an academic matter when applied to attempted suicides and accomplices to a suicide. Attempted suicide is a crime in England; from 1946 to 1955, 5,794 cases were tried by courts and 308 persons actually went to prison. In the United States neither suicide nor attempted suicide is a crime in most states, and in those states that do make attempted suicide a criminal offense, prosecutions are rare. However, assisting a person in committing suicide at his own request is generally punished as murder in both England and the United States.13
The legality of voluntary euthanasia, or “mercy killing,” provides a particularly controversial variation to this general question. Courts are frequently called upon to decide whether a doctor commits murder when he kills a patient who is in serious pain from an incurable disease, and requests that the doctor end his misery. Polls have indicated that about half of Americans and an even larger portion of Britons favor mercy killings;14 this is reflected in the frequent refusals of juries to find that a killing was committed, even when the defendant had admitted the killing.15 Though euthanasia societies in both Britain and the United States have pressed for a change in legislation,16 only Uruguay has enacted a law legalizing voluntary euthanasia. Since many of the objections to mercy killing may be overcome by proper legal draftsmanship—the danger of a hasty decision due to a temporary whim induced by suffering, and the danger that the doctor could get away with actual murder for selfish reasons—the argument against it rests largely on theological and ethical strictures concerning felo de se, a crime against one’s self.17 We are told that suicide, of which voluntary euthanasia is a form, is a sin which must be discouraged by the law. According to St. Thomas and the mainstream of Catholic thought, “suicide is the most fatal of sins, because it cannot be repented of.”18 Further, God reserved the right to take away life at the appointed time, an argument which would seem to prohibit the use of any medicine to prolong life beyond the time when the patient would otherwise die. Finally, the Biblical commandment against killing, as well as the nobility and desirability of suffering as part of a “divine plan,” is invoked. Advocates of personal liberty may or may not agree that suicide to prevent a slow and agonizing death is sinful. Undoubtedly, they would unanimously admire the person who would voluntarily stick it out to the bitter end on moral or religious grounds. 
And certainly they would not argue that such a person should be killed against his will; the compulsory euthanasia centers for the feeble-minded and the deformed set up by the Nazis have left a bad taste in the mouth of humanity. But even to those who accept this view of the sinfulness of suicide and who would live by this principle themselves and fervently urge it on others, it should appear as an act of barbarity to urge state coercion against those who choose relief from pain over the moral principle involved. Imposition of religious views here smacks not only of cruelty but of hypocrisy, when those who would let the “absolute value of human life” prevail against the voluntary choice of the subject refuse to take an unqualified stand against the involuntary taking of human life through capital punishment or war. Joseph Fletcher, a noted physician and writer, has commented on this strange double standard:
We are, by some strange habit of mind and heart, willing to impose death but unwilling to permit it: we will justify humanly contrived death when it violates the human integrity of its victims, but we condemn it when it is an intelligent voluntary decision. If death is not inevitable anyway, not desired by the subject, and not merciful, it is righteous! If it is happening anyway and is freely embraced and merciful, then it is wrong.19
Probably the most common ground for attacking voluntary euthanasia is the fallibility of doctors. Either the malady might be erroneously diagnosed as incurable or a cure might be discovered in time to save the patient. Since it is certainly possible that, even with the checks provided by legislation, some would choose euthanasia who might have been cured, we are driven back to the original question: should it be a criminal offense to aid a person in taking his own life, regardless of whether he is doomed to a painful death in the near future? If we hold strictly to the principle that the criminal law should protect only innocent parties and should not recognize “crimes against one’s self,” we could not sanction the use of law to prevent a person from seeking aid in his own destruction. (As a corollary we could not punish the person who gave the aid.) Unlike some of the other issues raised in this article, this would appear to be an unambiguous and clearcut application of this principle. However, this conclusion runs so directly counter to certain threads in our political and moral thought that it would be a bitter pill even for many strong advocates of freedom to swallow. The notion of the “absolute sanctity of human life” has been a vital part not only of our religious tradition but also of the classical liberal tradition from which our attachment to freedom largely derives. (Another related and possibly more difficult question is whether dueling should be permitted when both parties agree to the duel.) While sanctions against voluntary euthanasia might be liberalized or even abandoned in the future, emotions and countervailing philosophical attitudes are probably too strong to make a change in the law in regard to aiding a suicide likely in the near future.
WHILE CRIMINAL sanctions against voluntary euthanasia raise serious issues for a philosophy of freedom, our attempts to regulate the private sexual conduct of adults through state coercion have been more in the nature of a very bad joke. At present, the fantastic hodgepodge of statutes and ordinances of the various states prohibiting fornication, adultery,20 and those acts amorphously grouped as “unmentionable crimes against nature,” even when in private between consenting adults, cover such a wide range of sexual conduct that, according to Kinsey’s sampling, ninety-five percent of all males have committed a criminal sexual offense sometime in their lives.21 These statutes, if literally applied, would in most states not only result in criminal sanctions against all those engaged in sexual relations of any sort outside the marriage bond, but would, under the fantastic statutory definitions of “crimes against nature,” etc., result in criminal penalties against a large percentage, probably a majority, of married couples for their private relations. State interference in private sexual affairs was not extensive in the western world until rather recently; at present, statutory meddling in the Anglo-Saxon countries is probably more comprehensive and detailed than it has ever been.
Most of the advanced nations of the world attach no criminal penalty to fornication, though they probably do this out of recognition of the fact that the practice is well-nigh universal and literal enforcement would lead to a call to arms by the citizenry, rather than out of a feeling that this is none of the state’s business. Further, even adultery is normally not a statutory offense, though the injured spouse usually has civil redress. The Code Napoleon abolished such offenses between consenting adults, and France’s example has been followed throughout Western Europe. American crusaders against sin, however, have been able to hold the line in most state legislatures in an unsuccessful attempt to enforce their moral code on a population that generally rejects it. All but eleven states make fornication a criminal offense, penalties ranging from a three-year jail sentence in Arizona to fines alone in seven states. The federal government has plunged into the battle with the Mann White Slave Act, which prohibits the transporting of a woman across a state line for “immoral purposes.” The courts have held that a federal crime is committed should one have relations with one’s girl friend on an interstate drive.22 But the interventionists have had their reverses; the California courts recently struck down all municipal legislation prohibiting fornication and adultery (because state legislation, which contained no bar, had preempted the field), in the face of the warning of the Los Angeles Police Chief that this was a “Bill of Rights for prostitutes” and that “a hedonistic philosophy is filling the void created by the destruction of the Victorian culture.”23
Until rather recently, criminal law codes in western countries had paid little attention to homosexual acts between consenting adults, and the civil and ecclesiastical sanctions which did exist were justified in large part by the notion that God would visit the fate of Sodom and Gomorrah on those nations where such conduct was tolerated. Since we optimistically assume that this belief has lost wide acceptance, we might expect the laws to likewise disappear. All the nations of Western Europe except West Germany and Austria have repealed laws penalizing these “crimes against nature” except when juveniles are involved or when the crime occurs in a public place. But in Britain and the United States the statutes have been extended rather than repealed and now in most states cover such a wide variety of allegedly “unnatural acts” that husband and wife are no longer safe from legislative meddling. Practices widely recommended by most modern marriage manuals and in no conceivable way the legitimate concern of “organized society” are technically criminal even though probably practiced by a majority of married couples.24 While there is fantastic variety and confusion in the laws of the various states (partly due to the embarrassment of legislators who wanted to prohibit the act without mentioning its name or describing it), every state except Illinois makes some form of private homosexual conduct between consenting adults a prison offense. In England the maximum penalty for sodomy is life imprisonment. Many states impose a high minimum sentence; for example, Rhode Island sets seven years. Maximum sentence varies from life in Nevada and sixty years in North Carolina to three years in several states. Some states, such as Colorado, underline the social danger by depriving those convicted of the right to vote, serve on a jury, or hold public office.
Perusing the statutes and ordinances alone, we might assume that minute government regulation of almost every form of sexual activity has reached a level of intensity which would gratify Orwell’s Anti-Sex League. But the citizenry has been given a reprieve: Big Brother seldom enforces these laws. Only in Massachusetts is the adultery statute widely enforced, and prohibitions against fornication and the various “unnatural acts” between male and female in private are only sporadically enforced by exceptionally vigorous upholders of community mores. (Witness the zeal of the Atlanta ordinance which required that shades should be pulled when mannequins were being undressed.) Even homosexuals, though subject much more to public indignation, are rarely imprisoned for voluntary private acts between adults. In both Britain and the United States, most arrests involve either juveniles or “public indecency” of some sort. However, sporadic round-ups by the police, even when resulting in probation rather than a jail sentence, have been so blatantly unjustifiable as a necessary police measure, that respected jurists in both Britain and this country have argued vigorously for abolition of this offense. The authoritative American Law Institute, upon the recommendation of such noted jurists as Learned Hand, has urged that such laws be repealed. Thus far, their recommendation has been adopted only in Illinois, where the legislature this year removed all criminal penalties for private relations between consenting adults. In Britain, the Wolfenden report recommending that all such prohibitions be removed is still the subject of heated debate.
Jeremy Bentham classified prohibited sex acts in which there is neither violence, fraud, nor interference with the rights of others, as “imaginary offenses” in which no penalty is justified. Fornication (including prostitution) and homosexuality, when juveniles are not involved and when not occurring in a public place, obviously qualify under this criterion as “imaginary offenses”: no question of violence or fraud is involved (these are covered by separate statutes) and no third parties are directly injured except insofar as they are somehow offended or indignant that such things occur in their vicinity. The most important exception would be that fornication could be viewed as injuring resulting children due to social opprobrium placed upon illegitimacy. The usual motivation for such legislation, however, is to either protect the offender from himself or to give vent to community indignation and to enforce its notion of sin. This fact seems to be generally recognized by informed persons who have a high regard for individual freedom, as is indicated by the widespread support for the Wolfenden Report in Britain and by the attempts to liberalize the law in this country. In an otherwise statist era, this is probably one field in which the trend will be toward more individual freedom rather than less.
WHILE ALCOHOLIC beverages are barred in only a few states, a pervasive net of state and federal laws and international treaties provide almost universal prohibition or regulation of addicting narcotics. In addition to the desire to save the potential addict from his own misconduct, to prevent him from metaphorically “selling himself into slavery,” these prohibitions are motivated by two other important considerations. Addiction allegedly leads to crime either through its direct effect on the mind or due to the need for money to buy more of the drug; hence, innocent parties are protected by these laws. Further, it is argued that if drugs are sold freely, there is no practical way to prevent them from falling into the hands of juveniles, who it is felt should be protected from addiction. As to the first consideration, a recent joint American Bar Association-American Medical Association report has accepted the position of most modern authorities that drugs themselves reduce the propensity to commit crimes of violence. Addicts are forced to commit crimes mainly to pay the exorbitant prices of illegal narcotics which can be supplied only by the underworld.25 Narcotics, if traded on a free market, would be among the cheapest of commodities.
Anti-gambling statutes, as well as the traditional unwillingness of courts to enforce gambling contracts, are, like narcotics laws, attempts to protect individuals from their own sinfulness and imprudence. Like alcohol, gambling is regarded by many religious groups not only as a “vice of Babylon” but as one justifying state interference. Just as Galbraith would prevent the consumer from imprudently wasting his money on “unnecessary” car appliances, many anti-gambling zealots would mark off this area as one where consumer choice should not be allowed. The money could, in their estimation, be put to better use. While anti-gambling and anti-narcotics laws are both to an extent honestly motivated by a desire to prevent crimes of violence, these very laws have provided a new haven for organized crime, evicted from its most lucrative business by repeal of Prohibition, since it is now the only institution that can carry on the trade.
Censorship statutes and ordinances are too complex and present too many issues to be covered adequately here. Book-banning by local and state officials, as well as by the Post Office, to prevent “immoral” and “obscene” literature from passing into the hands of the citizenry has been one of the most comprehensive devices for imposing the tastes and values of a majority or of a vocal group upon the rest of the community. The Supreme Court itself applies a majoritarian test in determining whether a book may be constitutionally banned:
Whether to the average person, applying contemporary community standards, the dominant theme of the material taken as a whole appeals to prurient interest.26
However, many advocates of censorship are undoubtedly motivated not by a desire merely to impose their tastes and sentiments by force but by a genuine concern for the effect pornography will have on juveniles and on juvenile crime. While the effect of pornography on juveniles is still an open question, a comprehensive recent study concludes that there is no evidence that pornography has harmful effects on juveniles and promotes crime, and that there is some indication that attempts to keep pornography from juveniles may be harmful.27
The problems discussed above are only some of the more spectacular and controversial examples of moral legislation. A complete survey would deal with laws regulating or prohibiting commercial activity on the Lord’s day, artificial insemination, voluntary sterilization, polygamy, incest, and physical injury to a consenting subject. We have dealt exclusively with criminal laws; civil laws making various “immoral” contracts unenforceable or otherwise interjecting community moral values into the process of civil recovery raise similar issues. Instead of extending this survey of state intervention into the sphere of private morality, I propose to devote the remainder of this article to the more fundamental question: why should the state be prevented from legislating morality?
WHILE ALL GENUINE friends of freedom will be distressed at this panorama of official meddling, it is admittedly more difficult to translate this sentiment into a practical and consistent principle delineating a sphere of private moral conduct with which the state may not interfere. Mill would bar the state from interfering with a person’s acts which “concern the interests of no person but himself.” Unfortunately, this maxim can be stretched to sanction almost any conceivable state intervention, since it is difficult to conceive of any act which could not adversely affect others. Excessive drinking, excessive TV watching, or refusal to go to college may make a person less productive, thereby lessening his ability to support a family, to produce goods desired by others, and to pay taxes. One who regularly reads the Congressional Record might conceivably be driven insane and commit homicide. An improper diet could make one less attractive or pleasant. Birth control might lower the population growth, which could arguably hinder the defense effort. Even when practiced in an outpost isolated from public view, poker games, nudist colonies, rock ’n’ roll dancing, even theatre going may be offensive through their mere presence to some citizens.
While the primary purpose of this article is to present the problem rather than to propose an airtight theoretical answer, a few suggestions may be ventured. Juveniles and other persons given a similar status because of insanity or judicially declared incompetence will undoubtedly be given a special protective status under the criminal law; the limits of such protection present too complex a problem to be subsumed under a general principle. After side-stepping this problem, the following general maxim can be offered: the criminal law may punish no category of acts which are not directly injurious to persons who do not consent to the act.28
“Directly” in this context is an admittedly ambiguous term. I employ it in a sense analogous to the legal term “proximate cause.”29 Under this criterion I would exclude the following acts from the area of legitimate concern of criminal legislation: (1) Injury to a consenting party would not be grounds for punishing another party. (2) Mere outrage or indignation that such acts are going on would not constitute direct injury. (3) Acts which injure others only insofar as they make the actor a less able, virtuous, or pleasant person (by affecting his money-making capacity, etc.) or which deprive others of his contributions altogether (e.g., suicide) will not be sufficient grounds for government interference. (4) Acts which, when taken in the aggregate, might injure others in a remote and secondary way, as by tending to decrease the birth rate or to debase the genetic stock of the community, or by tending to lower the “moral tone” of the community by causing changes in attitudes, will not be grounds for punishment, especially when the alleged tendency is hypothetical and supported only by popular attitudes rather than by conclusive scientific evidence.
This position, held implicitly or explicitly by classical liberals for the last two centuries, has been the subject of intense criticism, some eloquent and penetrating, in recent years. The most frequent criticisms might be mentioned.
First is the “civil libertine” charge: those who would not punish vice by jail sentences thereby spend their nights practicing it. While classical liberals might incur their proportionate share of sin, this argument needs no reply. The personal habits of advocates of an idea can hardly affect the validity of the idea.
Second, those who would not penalize immorality either do not believe in absolute values or do not have any system of values at all. This assessment may be valid to the extent that one who rejects a system of fixed, absolute values is more likely to object to state enforcement of such alleged values. One is more likely to reject state enforcement of morals if he agrees with F. A. Hayek that:
. . . even what we regard as good or beautiful is changeable—if not in any recognizable manner that would entitle us to take a relativistic position, then in the sense that in many respects we do not know what will appear as good or beautiful to another generation.30
and rejects W. F. Buckley’s claim that, as to moral values:
. . . all that is finally important in human experience is behind us; that the crucial explorations have been undertaken, and that it is given to man to know what are the great truths that emerged from them.31
However, libertarians number among their ranks those who base their position on absolute moral values and who find freedom a necessity because it is a condition precedent to moral choice and because the state, by its intervention, will inevitably thwart morality.32 At any rate, to condemn a political position merely by linking it with a philosophical position completely begs the issue of its validity.
Third, by repealing laws prohibiting immoral acts, “society” somehow appears to condone and encourage them. This argument, like its twin brothers in the economic realm—if you vote against Kennedy’s farm program, you are against the farmer; if you are against forced desegregation of private dwellings, you are against the Negro—is based on a pernicious premise: the state condones and encourages those things which it does not condemn or act to prevent. Holding to such a doctrine presupposes a rejection of the concept of limited government, whereby most decisions in vital matters, both moral and economic, are left to individual choice. This area of unrestricted freedom is narrowed in principle as much when the state acts as moral conscience as when it acts as policeman. Such is true even when the law is not enforced, though the policy will be less effective.
It has been urged with more plausibility that it would be better to leave such statutes on the books and not enforce them or only enforce them sporadically, since repeal would be taken as a positive encouragement to engage in the prohibited act. If it were true that such statutes would actually remain dead and would not be gleefully “rediscovered” by some zealot, there would be little advantage in repeal other than the possibility that by not enforcing certain laws we breed disrespect for the remaining laws. However, playing Russian roulette through sporadic enforcement wreaks havoc with the rule of law. The selection of persons to be prosecuted becomes arbitrary and rests solely on the whims of the bureaucrat who happens to be charged with its enforcement. In any case, no evidence has been offered that persons will engage in acts simply because they think repeal of a criminal law showed that “society” or the government thereby encouraged and condoned it.
Fourth, one cannot separate law and morals, as those who would not punish private immorality allegedly would do. This criticism is at least in part due to the tendency of liberal legal philosophers, such as H. L. A. Hart,33 to emphasize the “separation of law and morals” and to couple this with a plea that the law not enforce morality. Actually, neither criminal nor civil law can be divorced from community moral concepts (a close reading of Hart shows he does not deny this), and this in no way undermines the position advocated generally by classical liberals. Generally speaking, an act must be considered in some way “immoral” or “evil” before a criminal sanction can be applied to it. If the criminal law is radically out of step with what Eugen Ehrlich dubbed the “living law,” i.e., if acts are punished which are not considered meriting punishment by most members of society, the statutory law will probably be changed to parallel this living law. Prohibition is the usual example given. Further, the severity of punishment will be determined in large measure by the degree of turpitude attached to the crime. However, this “moral turpitude” is clearly at most a necessary, not a sufficient, condition for criminal sanction. No one would argue that all acts considered immoral—lying, indolence, not going to church—should be punished. The position taken here is that not only must the penalized act be condemned morally by members of society, but that also some non-consenting party must be directly injured by this category of act.
Fifth, coercive enforcement of a moral code is necessary to prevent society from “disintegrating.” Sir Patrick Devlin has eloquently argued, in an attack on the Wolfenden Report’s contention that there must be a realm of private morality which is “not the law’s business,” that the threat of such disintegration gives a justification to moral legislation which has no theoretical limit.34 “Society means a community of shared ideas,” of which moral and ethical ideas are a part. Without fundamental agreement on good and evil, “society will fail.” Since a recognized morality is necessary to society’s existence, “prima facie, society has the right to legislate against immorality as such,” just as it has a right to legislate against subversion. As “society” is used here in a vague sense and has, at first glance, a suggestion of mysticism, we need to pin down just what is meant by “disintegration of society.”
Two possibilities suggest themselves. The first is that the way people live and interact becomes changed in a manner short of the breakdown of law and order—i.e., violence is still controlled by law and commerce continues. Let us say that “society” consisted of shared ideas A, B, C, D. Then A disappears and is replaced by E. Since the original society, that is, shared ideas A, B, C, D, no longer “exists,” in this definitional sense, society has disintegrated. This is what Devlin seems to have had in mind when he asserts that since monogamy is part of the structure of our society, it “could not be removed without bringing it down.” No assertion is made that polygamy is not workable where practiced or that its adoption or partial acceptance would lead to any breakdown in law and order. The society in question would merely cease to exist because monogamy is claimed to be an essential ingredient of this society. In this sense the statement that “society has a right to protect itself” is a mystical guise for the assertion that certain persons have a right to prevent, by use of the policeman, the adoption by others of a practice which they do not like. Devlin’s “society,” “a community of shared ideas,” is an abstraction which, as such, is incapable of any action whatsoever.
Devlin may be interpreted to mean, however, that if an established morality is not enforced, anarchy and a breakdown of law and order will result, as allegedly occurred in Rome. This would be a more concrete and persuasive argument. One could object to this thesis that there seems to be no evidence that private immorality has been the cause rather than a symptom of any descent into chaos; and if an immoral trend was actually threatening chaos, laws would be ineffectual to stem it, due to enforcement difficulties. Actually, this “trend” would certainly be impossible to properly diagnose, and such an alleged danger could be used to justify almost any moral legislation. Devlin, however, gives little indication that he is concerned with this problem. He makes no offer of criteria to determine when immorality threatens chaos. Instead, he contends that the state may intervene in private affairs whenever an act is considered “a vice so abominable that its mere presence is an offense.” This is a mere thermometer of intolerance, and is unrelated to any analysis of the danger to law and order. I would conclude that Devlin and others who argue about “society’s right to defend itself” are, in company with those who propound “the white race’s right to defend itself,” merely urging that changes in voluntary relations between individuals which they strongly dislike be stopped. It is undoubtedly true that a certain consensus as to morality may be an absolute necessity to prevent chaos. If the majority of persons believed in the goodness of indiscriminate theft or murder, the strongest law enforcement would probably not be sufficient to uphold minimum order. But as to conduct not directly injuring third parties, there seems to be no indication that a divergence of practice will do any more than prove vexatious to an indignant majority.
Even if the usual attacks on our maxim are rejected, the basic query remains. Why should “society” permit individuals to act according to their own dictates, even if the course chosen is one considered markedly evil or immoral? The reasons are identical with those which lead us to urge that economic decisions be left to private individuals rather than government, even when, in our estimation, not enough books and education and too many large cars and cigarettes are being purchased. While we may regret the result of freedom in particular cases, we hold that on balance the good will outweigh the bad.
Virtually all readers of a conservative-libertarian journal accept, to some degree, that aspect of Western political theory which has made it unique—the emphasis on what is broadly described as the “innate dignity of the individual.” This notion can only be given content if these individuals are free to make their own mistakes as well as to make “correct” decisions. Freedom of choice becomes either an intrinsic good or a necessary prerequisite for truly moral decisions. The Platonic tradition of paternalism, in which obedience and uniformity replace freedom as ultimate goals, leaves no place for an autonomous individual to whom is accorded the dignity and responsibility of making by his own decisions a success or failure of his life. If we agree that the “innate dignity” of an individual implies his right to make his own mistakes, this principle would nowhere be more applicable than in those areas of allegedly sinful acts which injure no one but the consenting actors.
A more subtle but probably more forceful argument is that developed by those writers in the Anglo-Saxon Whig tradition, which has received its most recent formulation in F. A. Hayek’s Constitution of Liberty. Writers in this tradition, who include David Hume and Lord Acton, argue that civilization will advance more rapidly and more satisfactorily if decisions are left to the spontaneous interaction of individuals acting voluntarily rather than to the edict of a governmental body with a monopoly of coercion. Even the great conservative Edmund Burke, who derived most of his political philosophy from this tradition, argued that, contrary to the latent totalitarianism of the French Revolution (which he successfully predicted would burst forth into actual totalitarianism), the inalienable rights of Englishmen provide that, “whatever each man can separately do, without trespassing on others, he has a right to do for himself.”35
This position is based on a pessimistic view of man’s nature and his knowledge, rather than an optimistic one, as is sometimes charged. Man’s ignorance of the factors of his environment and his inability to take into account more than a few facts at a time make it difficult enough for him to plan out his own life, even with his knowledge of his own desires and of those factors of his immediate environment which most directly affect his existence. To place such decisions in the hands of a central legislature or executive merely compounds a thousandfold this ignorance. The Authority not only has no way of effectively collecting and assimilating data on the desires of all those it controls, but it cannot assimilate the fantastic amount of other relevant data that individuals would use in making their decisions.
There was also no assumption that men were by nature good. On the contrary, defects in human nature would be magnified when persons were given extensive power of political coercion over others; this was pithily summed up in Acton’s warning about absolute power corrupting absolutely. They concluded that balanced progress could best be obtained by allowing human institutions to adjust to changed conditions and changed desires through a gradual process of free, voluntary interaction of individuals. Through trial and error this evolutionary process would reject what was found to be unsuited. This theory is, consequently, not built on an optimistic but on a pessimistic view of man’s nature and knowledge. It is only more optimistic than its “conservative” counterparts in that it holds that when men are given as much freedom as is possible while still providing for protection of innocent persons from violence, civilization will descend into neither chaos nor a grand Saturnalian debauchery.
While this position is most commonly associated with freedom in the economic sphere, it applies to all aspects of our lives which might come under central regulation, including our very value structure. As Hayek has noted:
It would be an error to believe that, to achieve a higher civilization, we have merely to put into effect the ideas now guiding us. If we are to advance, we must leave room for a continuous revision of our present conceptions and ideals which will be necessitated by further experience. We are as little able to conceive what civilization will be, or can be, five hundred or even fifty years hence as our medieval forefathers or even our grandfathers were able to foresee our manner of life today.36
Consequently, even our patterns of accepted morality should be open to some experimentation and change. It, too, in large part has been adapted through a continuous process of experimentation and evolution to suit the more basic problems of the times. Social pressure and the odium attached to conduct considered “immoral” are sufficient to insure a considerable degree of uniformity in moral conduct. However, if “immoral” conduct is not punished by the government, a few individuals will risk social odium if the incentive is great enough; as a result, there is room for some experimentation and gradual change.
It will be argued that, granted the general argument for freedom, certain immoral acts have no redeeming features whatsoever, and punishment of these acts can in no conceivable way impede progress. However, even if we can only see advantages in an ad hoc piece of legislation, we are much better off sticking to a general principle which bars all prohibitions of private immorality:
The argument for liberty, in the last resort, is indeed an argument for principles and against expediency in collective action . . . Not only is liberty a system under which all government action is guided by principles, but it is an ideal that will not be preserved unless it is itself accepted as an overriding principle governing all particular acts of legislation. Where no such fundamental rule is stubbornly adhered to as an ultimate ideal about which there must be no compromise for the sake of material advantages—as an ideal which, even though it may have to be temporarily infringed during a passing emergency, must form the basis of all permanent arrangements—freedom is almost certain to be destroyed by piecemeal encroachments. For in each particular instance it will be possible to promise concrete and tangible advantages as the result of a curtailment of freedom, while the benefits sacrificed will in their nature always be unknown and uncertain.37
We cannot tell in advance whether freedom in a given area will bring results which we or a later generation will consider desirable. If we could tell exactly in which areas beneficial results would flow from freedom, the general case for freedom would disappear, since it is the unforeseeable results of free interaction of individuals on which the case for liberty largely rests. Consequently, we may expect to be forced to put up with immediate results which we do not like if we are to consistently apply our doctrine. It is in the belief that on balance men will be better able to solve their problems if they are left as free as the necessary protection of others will permit, as well as out of the conviction that only maximum freedom of choice is consistent with our belief in man as a creature of inherent dignity and with moral responsibility, that we conclude that the sphere of private morality and immorality is none of the law’s business.
The Shortcomings of Right-Wing Foreign Policy
IN A RECENT article commenting on “American Conservatives,” a perceptive European champion of liberty suggested as a problem for the consideration of American intellectuals the capacity of a democracy to have a sensible foreign policy.1 This is a problem which will have to be answered and analyzed by the American Right before it can assert its claim for power or effectively exercise national office. This is because the Right has not been immune to certain long-standing inadequacies which have characterized American thinking on foreign policy. These shortcomings appeared in the foreign policy of Robert A. Taft, and appear also in the drastically different foreign policy position of Barry Goldwater. But before discussing these two men in particular, let us look at the historical background of these weaknesses in our traditional posture toward the rest of the world.
American foreign policy has long been crippled by two ideas which have persisted throughout our history: moralism and belief in American invincibility. By moralism I mean the tendency to apply ethics and canons of perfection governing individual action to collective action by society or the state. Such naivete, in the name of righteousness, would call for the state to follow the dictates of the beatitudes and the counsels of individual spiritual perfection and would decry the use of power in influencing diplomacy or foreign affairs. This false moralism fails to appreciate that political action should be governed by moral imperatives derived from the nature and purposes of the state, which is to exercise its authority and force, if need be, to promote justice, freedom, security, the general welfare, and civil unity or peace.2 The other attitude, overconfidence in American invincibility, is the paradoxical belief that the “innocent, moral United States” will be strong enough by itself to defeat and punish any potential aggressor. In short, these attitudes unite so as, first, to prevent American involvement in any balance of power politics which would hinder the development of aggressive nations and, second, to cause us to rely exclusively on our own capacity to repel and defeat those aggressors whose development we refused to prevent.
These attitudes probably derive from the twofold uniqueness of the American experience: our isolation behind the Atlantic Ocean and our orthodoxy of constitutional liberalism. Lacking the continental strife between an old and a new order (in our case the old order was the new order; there had been no old order or establishment to be overthrown), and secure in our isolation, we lacked all empathy with the objectives of nineteenth century European statesmen. We refused to understand their attempts to preserve a balanced concert of nations which could meet and absorb the various nationalist and revolutionary movements and thereby insure the organic and peaceful development of political liberty. Our national policy was to keep our hands clean of the “immoral” power-balancing machinations of the European nations.
At the time, isolationism was probably the policy best suited for the national interest. But the unfortunate attitudes engendered by it were to result in disastrous consequences when the force of events and technological progress would involve the United States in world affairs. The American leaders would have to invoke the slogans of a moralistic crusade to gain popular support for our intervention. During the first world war, for example, the popular attitude was rapidly changed from moralistic isolation and detachment from the “corrupt, fratricidal strife” of the European “war lords” only by an appeal to a mixture of anti-German racism and a crusading zeal in behalf of the secular religion of democracy.
The Wilson Administration did not base its case for our entry into the war on the legitimate grounds that the impending German domination of the seas would be incompatible with our national interest. Rather, we remained aloof from these balance of power considerations and stressed American idealism and our determination to fight “to end all wars” by making the world “safe for democracy.”
The duty of the statesman should be to determine the policies required by the national interest, and then to educate and lead the nation in the acceptance and implementation of those policies. He should lead public opinion and not follow it. It is dangerous for statesmen to start to follow public opinion, or to cater to public misunderstanding of foreign policy by justifying measures with crusading slogans. By doing so they run the risk of unleashing the engines of mass enthusiasm on a course of action which will be difficult to control or restrain. These engines may well follow through the logic of the crusade’s slogans to ends quite opposed to the designs of those who issued the call for a crusade.3
This danger was borne out by our post-World War I experiences. A public which had been called upon to fight a war to end all wars could scarcely be expected to bear the burdens of keeping the peace and international order when these tasks required the very same instruments which European statesmen had used for a century and against which America had allegedly fought in the war for “democracy.” The American flight from international responsibility was all the more tragic, since the temporizing influence of American power could perhaps have dampened the nationalistic passion for “final solutions” which developed out of the disillusionment of the democratic masses who had sacrificed so much in the war. These final solutions ranged all the way from “hang the Kaiser,” through demands for security, Bolshevism, to the madness of Nazism.
The debate preceding American entry into the Second World War was based on the same mistaken premises of American invincibility and moralism. Isolationists insisted that America could remain secure behind the Atlantic ocean no matter what happened abroad, and the interventionists called for us to enter the fray, arm in arm with British Toryism, Bolshevism, and the Kuomintang, carrying forth the banners of “anti-fascism,” the United Nations, and a world-wide New Deal.4 Our confusion of purpose in entering the war is easily appreciated by observing the present world situation and recalling some of the objectives for which the West then fought: Polish Independence, the security of the Western position in the Far East, and Chinese National Independence. The former two are probably less near realization today than in 1939, and the latter has been secured, but with rather dire consequences for the West. In view of the debilitating attitudes which have continually marred American foreign policy, let us proceed to examine the role of the American Right-wing spokesmen as critics of our post-war foreign policy.
THE AMERICAN Right, out of office since 1932, was untiring in its criticism of the utopianism which had dictated our wartime collaboration with the Soviet Union, and no doubt much political capital was gained thereby. But the Democratic Administration had, to all intents and purposes, admitted these mistakes by changing its attitude towards international Communism and by shaking itself loose from the lingering proponents of the wartime policy, such as Henry Wallace. Hence, the post-World War II Right has to be judged by its criticism of the new policy which had been adopted by the Truman Administration in its attempt to check further Communist expansion: the containment policy.
Under it, aid was furnished to the anti-Communist forces in Greece and Turkey, European economic recovery was sought by such means as the Marshall Plan, an attempt was made to strengthen the economies and societies of the free world by Point Four aid, and NATO was formed. The primary shortcomings of this policy were its purely defensive nature and its tendency to regard the Soviet danger solely as a military threat. It relied primarily on military defenses and economic well-being to meet the expansion of Communism. This was a refusal to wage the Cold War in the ways which the Communists had initiated. No practical proposals were put forth for making inroads on the Communist empire, nor for meeting and countering the multi-faceted Communist challenge, particularly in the realm of political, diplomatic, and psychological warfare.
As a result, by refusing to exploit our atomic and military superiority we permitted the consolidation of the Communist gains of the Second World War, we failed to halt the expansion of Communism in the Far East, and, worst of all, we permitted ourselves to become embroiled in a drawn-out struggle on the enemy’s terms over territory we had originally written off. We had to settle for an unsatisfactory truce as exaggerated fears of provoking a general war inhibited us from utilizing our military and technical advantages in that action.
In Congress and out, the Right-wing opposition successfully exploited the public discontent with the Korean War and the concern over domestic Communism. However, the heritage of isolationism and the strong persistence of the traditional attitudes of moralism and American invincibility kept the Right from advancing any serious alternative proposals to the containment policy. At best, these limitations were overcome by a belated acceptance of some of the containment policies, but a continuing distrust of alliances and a basic lack of feeling for foreign affairs prevented the nationalist and anti-Communist enthusiasm of the American people from being channeled into a serious foreign policy alternative to containment. Thus, had it assumed office, the American Right would probably have offered just a poor imitation of the old containment thesis, pursued with reservations and with less competence than its original authors could have brought to it.
This probability is borne out by the record of the Eisenhower Administration. Eisenhower was a conservative who simply expanded his isolationism and pacific moralism so as to include a broader area than the Western Hemisphere. His foreign policy was, to a large extent, an imitation of Truman’s. The major difference was that Eisenhower tended to withdraw himself from considerations of power politics in his apparent belief that the simple expression of his good intentions would be enough to secure international peace.
Naturally, today’s articulate Right-wingers disown Eisenhower, but let us examine the foreign policy ideas of the man who would have been their choice in 1952, Senator Robert A. Taft.
THE TAFT foreign policy, as outlined in his book, A Foreign Policy for Americans, was in general a critical and hesitant acceptance of the goals and measures of the containment policy. His criticism of specific ventures of the containment plan was based on the old Whiggish grounds of their formidable expense, their tendency to undermine American diplomatic independence, and their disregard for the authority of the Congress in declaring war. He feared the possible over-extension of the U. S. in an attempt to defend the whole free world. Furthermore, since he only hesitatingly abandoned his faith in the United Nations as an instrument of international law and peace, he was reluctant to enter defense or military alliances which would arrogate to themselves the functions of the UN and which did not have a foundation in international law.
Taft’s attitude toward the Atlantic Pact revealed his basic premises on international affairs. He approved the policy of notifying Russia that an attack on Western Europe would involve her in a war with the United States, for such was to him simply “the extension of the Monroe Doctrine to Europe.”5 Despite this, he voted against the ratification of the Atlantic Pact, because he considered it to be “contrary to the whole theory of the United Nations Charter, which had not then been shown to be ineffective; . . . because . . ., at least by implication, it committed the United States to the policy of a land war in Europe.”6 He had considered the pact to be a violation of the spirit of the United Nations since N.A.T.O. would not harmonize its actions with, nor seek authorization from, the Security Council.
Taft, however, gradually became convinced of the UN’s inadequacies, as he realized that in practice it was not based on a system of international law or justice to which all the signatories would be bound. The use of the veto especially prevented this. As a result, deception and expediency became the rule for all sides in the organization. Taft criticized as expediency the Truman Administration’s attempts to cloak its independent anti-Communist activities in Europe and Korea under the mantle of the world organization. The disharmony between these legitimate anti-Communist measures and the United Nations, which was called upon to approve them, would eventually paralyze the implementation of the former.
Taft criticized Truman’s intervention in the Korean War under the UN mandate as an expedient use of the UN, because, in contrast to the organization’s rules, the mandate was not based on the consent of all the permanent members of the Security Council. He insisted that this expediency put us in a trap which would prevent us from getting a mandate to continue the war effectively once Russia returned from her “walkout.”
The Ohio Senator prophetically opposed our attempt to bypass the Security Council and the Russian veto by appealing to the General Assembly on certain issues. He pointed out that no nation had contracted to abide by any decision of the General Assembly. “Furthermore,” he remarked, “we would have only one vote among sixty, which sometime in the future, even in the very near future, may subject us to very arbitrary treatment.”7
Taft’s classic opposition to Truman’s use of American troops in the Korean War without Congressional consent reveals a determining factor in his foreign policy. As a strict constitutionalist he opposed any undermining of the authority of Congress in declaring war. Consequently, he opposed the committing of American troops without Congressional authorization to any spot where they were liable to come under attack or become involved in a war. Under this reasoning, the sending of troops to Europe, where they would serve as part of the N.A.T.O. defenses, was prohibited unless it received previous authorization from Congress. He likewise challenged the validity of committing troops simply under a UN mandate without Congressional approbation, for “on the same theory, he [the President] could send troops to Tibet to resist Communist aggression or to Indo-China or anywhere else in the world without the slightest voice of Congress in the matter.”8
Taft was, no doubt, an excellent theorist of the principles of international organization and law, and a perceptive critic of the expedient disharmony between our independent containment policy and our United Nations policy. Yet he lacked the creative imagination in foreign affairs for constructing serious alternatives to the administration’s containment policy or for appreciating the full nature of the Communist challenge. His tendency to rely almost exclusively on American air power for deterring Soviet expansion demonstrated an inflexibility in molding the necessary means for meeting the various facets of the Communist challenge. Furthermore, he devoted only three paragraphs in his book to a suggestion that we seek to promote anti-Communist movements behind the Iron Curtain, and his suggestions on political warfare are limited to proposals for the creation of a propaganda agency. He expressed no notion of exploiting diplomatically the internal situation of the Communist world, not even as a counter to their various threats and demands. In short, his criticism of the Truman Administration’s foreign policy was an attempt to modify its policy by the application of his strict Whiggish principles, as well as to note the imprudence of certain of its steps, rather than to postulate an alternative. Perhaps such was in accord with his ideas that the duty of the opposition party is to oppose rather than to present alternative policies.
THE AMERICAN Right never felt at home in the Eisenhower Administration, although large segments of the old Taft bloc played significant roles in that administration. Then in the twilight of the failing Eisenhower Administration, a charismatic new leader, Barry Goldwater, arose on the national scene and was assigned the mantle of Taft as the leader of American political conservatism.
In his foreign policy views Goldwater took a position quite unlike that of Taft, whose primary criticism of Truman had been of the arbitrary executive commitment of American forces abroad and who had stressed the defensive task against Communism. Goldwater, however, insisted that the national goal of “peace in which freedom and justice will prevail . . . is a peace in which Soviet power will no longer be in a position to threaten us and the rest of the world. A tolerable peace . . . must follow victory over Communism.”9
Where Taft had worried about over-extending and over-committing ourselves in the Cold War, Goldwater notes that even our “alliance system is not coextensive with the line that must be held if enemy expansion is to be prevented.”10 Sharing with Taft a realization of the inadequacy of American conventional ground forces to meet the Communist challenge in all corners of the globe, Goldwater called for the West to develop a nuclear capacity for limited war and “to learn to meet the enemy on his own grounds” of political warfare. Goldwater’s major criticism of the Western alliance system is of its completely defensive nature and outlook vis-a-vis the area of world Communism.
In subsequent foreign policy statements, Goldwater has insisted that the continuing expansion of Communist domination and political influence has resulted from the Western failure to deal with “the key problem of international relations,” namely, the uses a nation makes of power. He claims that for the policy makers of the United States the “effort to please world opinion . . . has become a matter of grand strategy, . . . the guiding principle of American policy.”11 He urges us to abandon this pre-occupation, and to fully commit ourselves to the Cold War and to the use of Western power to “defeat” international Communism. Specifically he would repudiate disarmament discussions, eliminate Castro, declare Africa a Western protectorate, and encourage, and prepare to assist, uprisings within Eastern Europe.12 More recently he has denounced “coalition governments” as a “tactic of the enemy,” and has asserted that all Communist regimes must be opposed whether in Yugoslavia, Moscow, or North Vietnam.13
The Goldwater foreign policy is immensely different from that of Taft. Goldwater basically advocates what might be called American Imperialism, or the extension of American power to more and more areas of the globe, whereas Taft postulated the traditional Whig principles of limited foreign involvement and legislative control over the commitment of the military power. Taft’s policy was basically defensive with a premium placed on limiting our commitments, our expenses, and the executive power. Few have called attention to this remarkable difference of views, a divergence as vast as the difference between Gladstone and Disraeli.
This disagreement is all the more startling when one considers that the bulk of the Goldwater support comes from the old Taft circles. This fact should evoke some reflection from those who may see Goldwater as a potential answer to their hopes for a policy for the West which would assume the diplomatic initiative against Communism, which would not hesitate to exercise Western power, and which would develop the capacity for meeting the Communists on all levels and indeed carrying the struggle to the other side of the Iron Curtain. Can a man be an effective proponent of such a policy who is, in reality, beholden to or dependent for support on the vast bulk of the old Taft forces? Aside from his own possible inadequacies in the sphere of intellectual power or of leadership skill, would Goldwater really be an effective fighter in the “protracted conflict” with Communism, or would he revert to the older traditions of his party and of the principal part of his supporters: isolationism, withdrawal, and pre-occupation with strict internal constitutionalism?
The hard core of intellectual conservatism in the United States sees Goldwater through the eyes of the National Review. And Goldwater, in his writings for, and communications with, National Review, echoes to a large extent the National Review line on the waging of the Cold War, the use of American power, and the carrying of the battle to the enemy’s territory. But the National Review, after all, despite its pre-eminent position in conservative intellectual circles and its close contact with the greater Western conservative tradition and the hard-line strategists and theoreticians of the world-wide anti-Communist movement, is a relatively minor force in the broader Goldwater movement.
Indeed, Goldwater’s mass appeal does rest partly on the heightening anti-Communist enthusiasm of the electorate and on a revival of basic American nationalist and patriotic sentiments. But these elementary emotions need guidance, and, I fear, a large part of the political leaders of the Goldwater movement who are benefitting from these emotions are pre-occupied with different concerns than those of the cold war strategists of the National Review, or, more emphatically, of the Foreign Policy Research Institute.14 These leaders can deliver much more important votes to Goldwater than can the National Review, with its subscribers occupying their minority positions in the electorate of the Eastern States.
The primary concerns of these local political leaders, who will be contributing so much to the Goldwater movement, are the traditional concerns of mid-western Republicanism: federal fiscal responsibility, states’ rights, and a friendly environment for their business activities. In foreign affairs they have a general determination to resist Communism, but it is doubtful if they have really adapted themselves to the exigencies of modern international relations. Basically, they believe in the traditional notion of American invincibility, and, as a result, pay little heed to the task of strengthening the non-Communist areas of the world, so as to enhance the use of Western power and halt Communism. For example, they have their doubts about our associating with the European Common Market, and enter alliances hesitantly and only after someone else has demonstrated their necessity. They show no inclination to participate in any ventures for strengthening and modernizing the underdeveloped nations both economically and politically. Their response to direct Soviet challenges is always a determined resolve to use force if necessary, and to not give in, but they are slow in developing the means for resisting the various short-of-general-war techniques of Soviet aggrandizement. They profess their sympathies with the captive nations of Eastern Europe, yet limit their suggestions for assuming a diplomatic offensive against the Communist world to a few useless condemnatory resolutions.
The basic failure of these elements of the Republican Party, as with Taft, is their lack of that imaginative understanding of foreign affairs which is essential for securing the free world and waging the intricate and difficult diplomatic maneuvers essential to making inroads on the Communist Empire, so as to make it no longer a threat to peace and freedom. An administration based on such forces might be startlingly like the Eisenhower Administration, which also relied heavily on “heartland” Republican support: an administration that would half-heartedly accept the frustrating containment policy and thereby not even achieve its own primary goals of fiscal responsibility and governmental decentralization.
J. B. Conant’s “Slums and Suburbs”
IF THERE IS one observation about Dr. James Bryant Conant which the esteemed scholar has, himself, made painfully apparent, it is that he is a firm supporter of the notion that the public school should assume a vast societal role which would virtually supplant the traditional functions of the home, the church, and private education. In Slums and Suburbs,1 Dr. Conant offers a provocative potpourri of self-assured assertions of opinion, as well as some extremely perceptive insights concerning the ills of contemporary urban education. The book is a hurried summation of several of the findings of the Carnegie Foundation’s “Study of the American High School,” an inquiry which is yet to be completed.
The initial premise of this study is “that to a considerable degree what a school should do and can do is determined by the status and ambitions of the families being served.” Thus, there is no ideal or single purpose or curriculum which should dominate secondary education: the function of the school is to be determined by the “socio-economic composition” of its students. Proceeding from this thought, Dr. Conant explores the relative needs of secondary schools in the well-to-do suburbs of our large central cities, where Johnny’s parents are determined that their son shall be enrolled in an Ivy League college, and in the cultural vacuums of America’s urban slums, neighborhoods which the New York State Department of Education would have us call “older, more overcrowded areas.”
Dr. Conant refuses to cloak his recommendations in the silly euphemisms of the social-worker set. He recognizes, forthrightly, that the American slum problem is largely a Negro problem, and that it cannot be remedied without keeping this fact in mind. As a result of decades of patent discrimination, the Negro in our cities rarely expects job opportunities commensurate with his abilities: hence, there is little desire in the Negro slum to progress academically, since the white establishment will not recognize any such progress.
One observation of Dr. Conant should provide a lesson for those who build new slums at public expense in the name of “urban development.” He emphasizes that there is a far greater correlation between desirable social attitudes and job opportunities than there is between such attitudes and housing conditions. Unfortunately, the Conant solution to the lack of slum opportunity is to use the school as the vocational training ground for future employment, a solution which would tend to freeze the urban Negro permanently in his present status as unskilled or semi-skilled worker. One would think it preferable to improve Negro education for the sake of instilling academic excellence among Negro youth, but Dr. Conant has concluded that “in a heavily urbanized and industrialized free society the educational experiences of youth should fit their subsequent employment.” Such “experiences” almost invariably turn out to be a vocationalized edition of life-adjustment education. The best that Dr. Conant can say about the fine “Higher Horizons” project of New York City (which is an eminent example of the use of the school to raise the cultural level and interests of slum youth) is that the project is “encouraging.” Vocational training, on the other hand, is “necessary,” and not just encouraging.
Not all of Dr. Conant’s recommendations are restatements of the vocational heresy, however. He notes that slum parents must be encouraged to support education, perhaps with an adult education program sponsored by the community high school: this would help to create a home environment more conducive to academic success. Dr. Conant also recognizes the futility of arbitrarily shifting Negro and white students to far-away schools as a method of improving the education of Negro slum children. The slum schools should be improved where they are, he concludes; inter-neighborhood integration only lowers the standards of the white schools without appreciably improving Negro education.
Perhaps the most significant finding made by Dr. Conant is the presence of what he calls “social dynamite” building up in our urban slum areas among unemployed, out-of-school youth. The unemployment problem here is directly traceable to the hiring policies of labor unions and management. Dr. Conant believes that the frustration engendered by this incapacity to secure employment may erupt into actual physical violence in slum neighborhoods if the situation is not soon corrected. He would entrust such social treatment to the high school, an institution which, it would seem, is not intended to perform this type of surgery.
Slums and Suburbs serves a valuable function in isolating several significant maladies afflicting urban high schools and their students. But the book suffers from Dr. Conant’s faith in the schoolroom as the central correction agency of American society. He would expand the functions of the public school in the slums to the point where its academic purposes would become at best peripheral. There may well be a need for social action to remedy the problems elucidated by this study, but, we submit, Dr. Conant has offered the wrong blueprint.
BE THE FIRST UNRECONSTRUCTED REVOLUTIONIST ON YOUR BLOCK TO READ THE NATION’S FASTEST GROWING STUDENT JOURNAL.
INSIGHT AND OUTLOOK
WILL BE MAILED TO ANYONE UPON REQUEST. GRATUITIES TO DEFRAY OUR EXPENSES WILL BE MOST HUMBLY ACCEPTED. WRITE OUR EDITORIAL OFFICE, 2545 UNIVERSITY AVE., MADISON, WISCONSIN.
F. J. Johnson’s “No Substitute for Victory”
IN No Substitute for Victory1 Frank J. Johnson, a former naval intelligence specialist on Soviet affairs, proposes what he believes to be the only strategy by which the United States can be certain of maintaining both peace and freedom. He asks us to deliberately adopt a policy of victory over Communism; only such a policy, he holds, can prevent an atomic holocaust.
By using the phrase “victory over Communism,” Mr. Johnson is not suggesting that we conquer the Soviet Union and make it a United States satellite; what he does propose is that we should pursue “creative initiatives” (to borrow a pet Liberal term). These initiatives will be aggressive in character and will be designed to convince the Soviet rulers that it is in their own interest to terminate the Cold War and to dismantle the world-wide Communist conspiracy. Mr. Johnson believes this can be accomplished if we recognize the fact that “peaceful co-existence,” as propounded by Khrushchev, is a fraud, that the Soviets are willing to negotiate and to trade territory back and forth in the “war zone” (the non-Communist world) but not in the “peace zone” (the Communist world). By using paramilitary warfare, subversion, terror, sabotage, strikes, guerilla techniques and other means short of nuclear war, the United States could carry the Cold War to the Soviet Union by invading the “peace zone” and endangering the security of the Communist rule in the mother country. Our object should be to encourage rebellion in the satellites and thus make Eastern Europe a liability and not an asset to the Russian government. The price for ending these pressures on the Communist bloc will be iron-clad guarantees from the Politburo that they, in turn, will abandon their adventurous foreign policy. Our hope will be that the peoples under Communist rule will eventually achieve their own freedom, either through peaceful means or by force.
The alternative to this policy—appeasement—must ultimately lead to a nuclear war since the Soviet Union will not attack the United States until the greater part of the world has been brought under Communist control by paramilitary means. An appeasement policy by the West will allow this timetable to be carried out; a firm counter-offensive, however, will prevent the Soviets from reaching the take-off stage of their plan for world conquest. Readers of this book will understand that we are not left with only two choices (“Red or dead”). There is a third course, one that will enable us to avoid both war and submission, and Mr. Johnson has outlined that course with clarity. It only remains for Mr. Kennedy to follow it.
No Substitute for Victory is aimed directly at the person who must ultimately be convinced in a democracy: the average voter. This is both a strength and a weakness. It is of necessity plainly written and so contains a number of over-simplifications. The emphasis on the essential un-Americanism of Communism—its European origins—could, for instance, have been easily omitted. Although it must be classed as an introduction to its subject2 it is a solid and generally reliable work; it contains almost none of the unfortunate vulgarizations perpetrated by so many of the “authorities” on Communism currently cropping up all over the lecture circuit. These half-educated, unthinking popularizers may be well-meaning, but their wild outcries have the effect of obscuring and even discrediting the serious proposals of reputable spokesmen for an anti-Communist strategy. For this and other reasons, they are doing far more harm than good. It is even more unfortunate, however, that most of the West’s intellectuals (who lack the excuse of ignorance) still believe that “peaceful co-existence,” as defined by Khrushchev, is possible with the present governments of the U.S.S.R. and Communist China.
Although the West is weakened in its struggle by the (let us hope) temporary neutrality of so many of its finest minds, we should probably not be too surprised by this phenomenon. After all, the greatest part of Europe’s intellectuals believed, at one time, in the flatness of the earth, just as they once fervently upheld chattel slavery, blood-letting and leeches, witchcraft and the theory that the sun revolves around the earth. History demonstrates, however, that even intellectuals can learn, so we can hope that hard experience will eventually triumph over wishful thinking and that the myth of “co-existence” will someday go the way of the Ptolemaic theory. Mr. Johnson’s book will do much to hasten this process.
NEW BOOKS AND ARTICLES
THE FOLLOWING IS A SELECT LIST OF BOOKS AND ARTICLES WHICH, IN THE OPINION OF THE EDITORS, MAY BE OF INTEREST TO OUR READERS.
Eugen von Boehm-Bawerk:
CAPITAL AND INTEREST (in three volumes), $25 the set
Special one volume edition, $15.00
Paperback Extracts from Volumes I and II:
The Exploitation Theory, 97 pages, $1.50
Value and Price, 160 pages, $2.00
SHORTER CLASSICS of Eugen von Boehm-Bawerk, 1962, 392 pages, $7.50
I. The Austrian Economists
II. Whether Legal Rights and Relationships Are Economic Goods
III. Control or Economic Law?
IV. Unresolved Contradiction in the Marxian Economic System
V. The Ultimate Standard of Value
Ludwig von Mises:
PLANNING FOR FREEDOM, 192 pages, paperback, $2.00
A collection of essays and addresses, new enlarged edition, 1962
366 East 166th Street, South Holland, Illinois
NEW INDIVIDUALIST REVIEW . . .
as the mail-clerk in Ida Noyes Hall can testify, is the beneficiary of a worldwide correspondence. Most NIR readers are more articulate than the average citizen; most of them are the kind of people who are known in the trade as “opinion-molders.” Here is what a few of these “influentials” (who care enough about NIR to pay the extra dollar required of foreign subscribers) have written to us lately.
“I was delighted with your last issue. I can testify from bitter experience that your Dr. Rothbard is entirely correct when he demonstrates that public ownership of lighthouses is the first step on the road to communism. Please prepare a Chinese edition of 600,000,000 copies; I would like to mail them (bulk rate, of course) to selected opinion-molders in the occupied provinces of my country.”
President Chiang Kai-shek
“I congratulate you for having published, and I congratulate myself for having read, the Winter issue of NIR. Although nothing in this universe is of real importance (excepting, of course, the life to come and the Honour of France) your magazine is rather relaxing. Your discussion of whether or not David Hume was a whig or a tory was charming but quite needlessly prolonged. He was a tory.”
President Charles de Gaulle
“What are your rates for original poetry?”
Chairman Mao Tze-tung
“Please send 20 copies air mail of your stimulating magazine. I am trying to convince the members of the Politburo that economic progress will come much faster if we adopt a freer trade policy. None of my colleagues is yet in favor of private ownership of the streets but I am working on them.”
Deputy Premier Anastas Mikoyan
“SOCIALISM is only an idea, not an historical necessity, and ideas are acquired by the human mind. We are not born with ideas, we learn them. If socialism has come to America because it was implanted in the minds of past generations, there is no reason for assuming that the contrary idea cannot be taught to a new generation. What the socialists have done can be undone, if there is a will for it. But, the undoing will not be accomplished by trying to destroy established socialistic institutions. It can be accomplished only by attacking minds, and not the minds of those already hardened by socialistic fixations. Individualism can be revived by implanting the idea in the minds of the coming generations. So then, if those who put a value on the dignity of the individual are up to the task, they have a most challenging opportunity in education before them. It is not an easy job. It requires the kind of industry, intelligence and patience that comes with devotion to an ideal.”
—Frank Chodorov, Founder and President, Intercollegiate Society of Individualists, Inc.
[* ] Harry Elmer Barnes is the author of numerous books and articles on twentieth century history, among which is The Genesis of the World War, which deals with the responsibility for the outbreak of World War I. He is co-author of Perpetual War for Perpetual Peace (1953), characterized by Raymond Moley as “the most solid of recent books published on foreign policy.”
[1 ] New York: Atheneum Publishers, 1962, 296 pp. $4.50.
[* ] James M. O’Connell is a Ph. D. candidate in mathematics at the University of Wisconsin. He is a contributing editor of Insight and Outlook and writes a column for the University of Wisconsin Daily Cardinal.
[1 ] Ronald Hamowy, “National Review: Criticism and Reply,” in New Individualist Review, November, 1961, pp. 3-11; Edward C. Facey, “Conservatives or Individualists: Which Are We?,” New Individualist Review, Summer, 1961, pp. 24-26.
[2 ] F. A. Hayek, The Constitution of Liberty (University of Chicago Press, Chicago, 1960), p. 401.
[3 ] Ludwig von Mises, Human Action, (Yale University Press, New Haven, 1949), p. 21.
[4 ] Irving Babbitt, Democracy and Leadership, (Houghton Mifflin, Boston, 1924), p. 312.
[5 ] Hamowy, op. cit., p. 7.
[6 ] Russell Kirk, A Program for Conservatives, (Regnery, Chicago, 1955), p. 49.
[7 ] Quoted in the Freeman, March, 1962, p. 64.
[8 ] Kirk, op. cit., p. 35.
[* ] G. C. Wiegand is Professor of Economics at Southern Illinois University and is the author of numerous articles on monetary theory and the history of economic thought.
[1 ] Woodrow Wilson, The New Freedom, (New York: Doubleday, 1913), pp. 64, 89.
[2 ] Report with Special Studies, (Washington, 1937).
[3 ] 317 U.S. 111 (1942).
[* ] Robert M. Hurt is an Associate Editor of New Individualist Review.
[1 ] Thucydides, The Peloponnesian War, (New York: Random House, 1951) p. 104.
[2 ] Laws, vi., 780.
[3 ] On Liberty, (Chicago: Gateway, 1952) p. 119.
[4 ] Cf. Popper, The Open Society and Its Enemies, (Princeton: Princeton University, 1950), pp. 112-14, 165 ff., and Hayek, The Constitution of Liberty (Chicago: University of Chicago, 1960), pp. 164-66.
[5 ] Catholic intolerance in areas such as birth control and censorship has tended to obscure the facts that Catholic scholars are by no means agreed on many areas of state intervention and that the most restrictive moral legislation is often due to pressure from Protestant groups. An excellent book which represents the more liberal Catholic tradition is St. John-Stevas, Life, Death and the Law, (Bloomington, Ind.: Ind. Univ. Press, 1961). He concludes that birth control should not be outlawed in countries with large non-Catholic populations.
[6 ] Acton, History of Freedom, p. 55, as cited in Hayek, op. cit., p. 176.
[7 ] I have not included certain laws which raise complicated issues involving possible injury to third parties which cannot be easily answered by any general principle as to the right of government to regulate private lives. For instance, abortion raises the issue as to whether an unborn child is an innocent party that should be protected against murder, and polygamy and artificial insemination raise the question as to whether a future child should be protected against the social stigma of “illegitimacy.”
[8 ] Tileston v. Ullman, 129 Conn 84 (1942), and Cw. v. Allison, 227 Mass 57 (1917).
[9 ] These ratios make a mockery of the notion that the statutes are designed to protect equally the integrity of the white and Negro races. A consistent application would require a person with less than one-half Negro blood to marry only a person of the white race.
[10 ] Green v. State, 58 Ala 190 (1877).
[11 ] Time, Dec. 27, 1948, p. 18. The police were informed by a relative who, as a result of an old family feud, dug up Knight’s genealogy. Mississippi’s zealousness also extends to nipping the threat to racial integrity in the bud. Section 2339 (1959) of the Mississippi Code reads: “Any person, firm, or corporation who shall be guilty of printing, publishing, or circulating printed, typewritten, or written matter urging or presenting for public acceptance or general information, arguments or suggestions in favor of social equality or intermarriage between whites and negroes shall be guilty of a misdemeanor and subject to a fine not exceeding five hundred dollars or imprisonment not exceeding six months, or both . . .”
[12 ] Hales v. Petit, 1 Plow 253 (C.B. 1563).
[13 ] Assisting a suicide is only manslaughter in New York, and in Texas no crime is committed if the accused merely encourages or assists without actually killing the suicide. Many European writers accept the position of the Italian positivist, Enrico Ferri, who argued that, since a man has the right to dispose of his own life as he chooses, he has the right to consent to his own destruction by another. But Ferri’s suggestion has been adopted only in the Swiss Criminal Code, which provides that whoever assists the suicide of another commits no crime unless he is actuated by “selfish motives.”
[14 ] Fifty-four percent of Americans favored mercy killings in an Institute of Public Opinion Poll in the New York Herald Tribune, Jan. 17, 1937, but 46 percent of Americans and 68 percent of Britons favored it in a Gallup Poll recorded in the New York Times Index (1939), p. 1414.
[15 ] A New Hampshire case, State v. Sander, received nation-wide publicity in 1950.
[16 ] The proposed legislation would allow a patient to petition a court for euthanasia, provided he is over twenty-one, of sound mind, and suffering from an incurable disease accompanied by severe pain. After an investigation to determine whether the patient fully understands what he is doing and whether his malady is incurable, euthanasia would be administered before court-appointed witnesses. While these proposals would avoid some objections to euthanasia—it could not be used as a result of a temporary whim, and a court would not have to rely on the doctor’s word that the patient desired death—this legal ritual might be viewed as governmental encouragement of euthanasia. Glanville Williams’ suggestion, that murder would be presumed and the physician would be required to show that the patient requested death upon mature reflection and was incurably ill, seems preferable.
[17 ] Additional objections are that killing violates the Hippocratic Oath, although the oath also requires a physician to relieve suffering, and that, as Chesterton has asserted, this might be extended to involuntary euthanasia in the future. The non-religious arguments against euthanasia are summed up in Kamisar, Minnesota Law Review, 42: 16 (May, 1958).
[18 ] Summa Theologica, ii-ii q. 64, art. 5.
[19 ] Fletcher, Morals and Medicine, (Princeton: Princeton Univ. Press, 1954) p. 181.
[20 ] Adultery presents a special problem because it is justified as protecting the spouse from injury. In almost every European nation, however, the civil remedy is considered adequate.
[21 ] Kinsey, Pomeroy, and Martin, Sexual Behavior in the Human Male, (Philadelphia: Saunders, 1948).
[22 ] Neff v. U. S., 105 F.2d 688 (1939).
[23 ] Newsweek, Jan. 8, 1962, p. 18.
[24 ] Ploscowe, Sex and the Law, (New York: Prentice Hall, 1951) p. 202.
[25 ] Drug Addiction: Crime or Disease, Final Report of the Joint Committee of the American Bar Association and American Medical Association on Narcotic Drugs (Bloomington, Ind.: Ind. Univ. Press, 1961) p. 165: “In terms of numbers afflicted, and in ill effects on others in the community, drug addiction is a problem of far less magnitude than alcoholism. Crimes of violence are rarely, and sexual crimes are almost never, committed by addicts. In most instances the addicts’ sins are those of omission rather than commission, they are ineffective people, individuals whose great desire is to withdraw from the world and its troubles into a land of dreams.”
[26 ] Roth v. United States, 354 U. S. 476 (1957). A letter by a Commander of the Catholic War Veterans illustrates the majoritarian and “low boiling point” attitude toward censorship: “The Illinois Department of Catholic War Veterans would like to go on record as opposing distribution of a book that shocks a judge the first time he reads it. Further, the Department would like to state for its members . . . that their standards of decency still do not accept the use of dirty words or the description of lewd and vulgar incidents into their homes. They have a right, therefore, to resent a book which brings these things into their community.”
[27 ] Cf. Kronhausen, Pornography and the Law, (New York: Ballantine, 1959).
[28 ] This, of course, does not imply that all acts which do directly injure innocent persons should be criminal. Vigorous commercial competition is desirable even though competitors are “injured” in one sense; many acts which result in injuries, such as hurting a person’s feelings, would not be made the subject of law by all but the most ardent statists. And most injuries will be redressed by civil damages rather than criminal sanctions.
[29 ] Black’s Law Dictionary gives this definition of proximate cause: “That which in a natural and continuous sequence, unbroken by any efficient intervening cause, produces the injury, and without which the result would not have occurred.”
[30 ] Hayek, op. cit., p. 35.
[31 ] Buckley, Up from Liberalism, (New York: Hillman, 1959) p. 172.
[32 ] The articles by M. Stanton Evans and Frank S. Meyer in the Fall, 1960, Modern Age and by Murray N. Rothbard in the Spring, 1961, Modern Age illustrate how libertarian political philosophies can be based on a belief in absolute moral values.
[33 ] “Positivism and the Separation of Law and Morals,” Harvard Law Review, 71:593 (Feb., 1958).
[34 ] The Enforcement of Morals, (London: Oxford Univ., 1959).
[35 ] Reflections on the Revolution in France, (New York: Liberal Arts, 1955) p. 67.
[36 ] Hayek, op. cit., p. 23.
[37 ] Ibid., p. 68.
[* ] John P. McCarthy is an Associate Editor of New Individualist Review.
[1 ] E. v. Kuehnelt-Leddihn, “American Conservatives: An Appraisal,” National Review, (March 13, 1962), p. 167.
[2 ] Vide John Courtney Murray, We Hold These Truths, (New York: Sheed and Ward, 1960), p. 286.
[3 ] Vide Walter Lippmann, The Public Philosophy, (New York: Mentor Books, 1955), pp. 15-24.
[4 ] There were spokesmen for both positions who reasoned from practical foreign policy considerations. For instance, Herbert Hoover hesitated about intervening wholeheartedly in a war in which the major outcome would be the strengthening of the position of international Communism.
[5 ] A Foreign Policy for Americans, (Doubleday, 1952), p. 88.
[6 ] Ibid., p. 89.
[7 ] Ibid., p. 43.
[8 ] Ibid., p. 33.
[9 ] Barry Goldwater, The Conscience of a Conservative, (New York: Hillman Books, 1960), p. 92.
[10 ] Ibid., pp. 94-5.
[11 ] Barry Goldwater, “A Foreign Policy for America,” National Review, (March 25, 1961), p. 179.
[12 ] Ibid., pp. 180-81.
[13 ] Barry Goldwater, “To Win the Cold War,” New Guard, (March, 1962), p. 37.
[14 ] The research institute at the University of Pennsylvania which sponsored the studies, Protracted Conflict and A Forward Strategy for America, by Robert Strausz-Hupé, Stefan Possony, et al.
[* ] Robert M. Schuchman received a B.A. in history from Queens College and an LL.B. from the Yale Law School, where he was an Earhart Fellow in economics. He is National Chairman of Young Americans for Freedom.
[1 ] Slums and Suburbs, by James Bryant Conant, (New York: McGraw-Hill, 1961), 147 pp.
[* ] Robert Schuettinger is an Associate Editor of New Individualist Review.
[1 ] No Substitute for Victory, by Frank J. Johnson, with introduction by Admiral Arleigh Burke, (Chicago: Regnery, 1962), 230 pp.
[2 ] Most readers of this review will be familiar with a more complete work, A Forward Strategy for America, by Strausz-Hupé, Kintner and Possony (New York: Harper, 1961).