Source: Chapter in Nisbet's The Present Age: Progress and Anarchy in Modern America (Indianapolis: Liberty Fund, 2003).
Of all faces of the present age in America, the military face would almost certainly prove the most astounding to any Framers of the Constitution, any Founders of the Republic who came back to inspect their creation on the occasion of the bicentennial. It is indeed an imposing face, the military. Well over three hundred billion dollars a year go into its maintenance; it is deployed in several dozen countries around the world. The returned Framers would not be surprised to learn that so vast a military has inexorable effects upon the economy, the structure of government, and even the culture of Americans; they had witnessed such effects in Europe from afar, and had not liked what they saw. What would doubtless astonish the Framers most, though, is that their precious republic has become an imperial power in the world, much like the Great Britain they had hated in the eighteenth century. Finally, the Framers would almost certainly swoon when they learned that America has been a participant in the Seventy-Five Years War that has gone on, rarely punctuated, since 1914. And all of this, the Framers would sorrowfully find, done under the selfsame structure of government they had themselves built.
Clearly, the American Constitution was designed for a people more interested in governing itself than in helping to govern the rest of the world. The new nation had the priceless advantage of two great oceans dividing it from the turbulences of Europe and Asia. Permanent or even frequent war was the last thing any thoughtful American of the time would think of as a serious threat. Power to declare war could be safely left to the Congress, and leadership of the military to a civilian commander in chief, the president. Let the president have nominal stewardship of foreign policy but let the Senate have the power to advise and consent, and the entire Congress the power of the purse over foreign policy and war.
It was ingenious, absolutely ideal for a nation clearly destined to peace and to the cultivation of the arts and sciences. Agriculture, commerce, and manufacture were the proper and highly probable direction of the American future. The states, to which an abundance of powers and rights were left by the Constitution, would be the true motors of American prosperity.
We did a very good job, on the whole, of avoiding the traps and entanglements of the world for the first hundred and twenty-five years, and even made bold to warn the Old World that its presence in the Western Hemisphere, however brief, would be regarded with suspicion. Then things changed.
The present age in American history begins with the Great War. When the guns of August opened fire in 1914, no one in America could have reasonably foreseen that within three years that foreign war not only would have drawn America into it but also would have, by the sheer magnitude of the changes it brought about on the American scene, set the nation on another course from which it has not deviated significantly since. The Great War was the setting of America’s entry into modernity—economic, political, social, and cultural. By 1920 the country had passed, within a mere three years, from the premodern to the distinctly and ineffaceably modern. Gone forever now the age of American innocence.
When the war broke out in Europe in 1914 America was still, remarkably, strikingly, pretty much the same country in moral, social, and cultural respects that it had been for a century. We were still, in 1914, a people rooted largely in the mentality of the village and small town, still suspicious of large cities and the styles of living that went with these cities. The states were immensely important, just as the Founding Fathers and the Framers had intended them to be. It was hard to find a truly national culture, a national consciousness, in 1914. The Civil War had, of course, removed forever philosophical, as well as actively political, doubts of the reality of the Union as a sovereign state. But in terms of habits of mind, customs, traditions, folk literature, indeed written literature, speech accent, dress, and so forth, America could still be looked at as a miscellany of cultures held together, but not otherwise much influenced, by the federal government in Washington. For the vast majority of Americans, from east to west, north to south, the principal, if not sole, link with the national government was the postal system—and perhaps also the federal income tax, which was approved at long last by constitutional amendment in 1913.
The Great War changed all of this. By November 1918, after four years of war in Europe and nearly two years of it for America, the whole world was changed. Europe itself ceased in substantial degree to be a contained civilization, and the United States, after close to two years of what can only be called wrenching military nationalism under the charismatic Woodrow Wilson, was brought at last into the modern world of nations. State loyalties and appeals to states’ rights would not vanish overnight; they aren’t gone yet in constitutional law, and aren’t likely to be. But whereas prior to 1914 one still saw the gravamen of American development in the four dozen states, “provinces” in European terms, by 1920 it had shifted to the national culture, with the states becoming increasingly archaic.
The Great War, unwanted by any nation, even Germany, unexpected, really, until it burst devastatingly and irreversibly upon Europe, was at its height by far the largest, bloodiest, cruelest, indeed most savage in history. Churchill wrote:
All the horrors of all the ages were brought together, and not only armies but whole populations were thrust into the midst of them. . . . Neither peoples nor rulers drew the line at any deed which they thought would help them to win. Germany, having let Hell loose, kept well in the van of terror; but she was followed step by step by the desperate and ultimately avenging nations she had assailed. Every outrage against humanity or international law was repaid by reprisals—often of a greater scale and of longer duration. No truce or parley mitigated the strife of the armies. The wounded died between the lines: the dead mouldered in the soil. Merchant ships and neutral ships and hospital ships were sunk on the seas and all on board left to their fate or killed as they swam. Every effort was made to starve whole nations into submission without regard to age or sex. Cities and monuments were smashed by artillery. Bombs from the air were cast down indiscriminately. Poison gas in many forms stifled or seared the soldiers. Liquid fire was projected upon their bodies. Men fell from the air in flames, or were smothered, often slowly, in the dark recesses of the sea. The fighting strength of armies was limited only by the manhood of their countries. Europe and large parts of Asia and Africa became one vast battlefield on which after years of struggle not armies but nations broke and ran. When all was over, Torture and Cannibalism were the only two expedients that the civilized, scientific, Christian States had been able to deny themselves: and they were of doubtful utility.*
The greatest single yield of the First World War was, however, none of the above; it was the Second World War, which came a bare quarter of a century after the First, germinated and let loose by the appalling consequences of 1918, chief among them the spawning of the totalitarian state, first in Russia, then in Italy and, crucially in the end, in Germany under Hitler. World War II was fought, of course, on a much wider front, or set of fronts, than its predecessor. There was no part of the globe that was not touched in one way or other. From the Second World War, officially ended in late 1945, has come a rash of wars during the last forty years, chief among them the Cold War between the Soviet Union and the United States. But we should not overlook the dozens of other wars fought during this period, in Asia, Africa, the Middle East, the Far East, Oceania, and so on. Between the last shot fired in 1945 and the present moment, war, somewhere at some time, has been the rule, peace the exception.
There is every reason for referring to the “Seventy-Five Years War” of the twentieth century, for that is about the length of the period of wars that began in 1914 and, with only brief punctuations of peace, continues through this year, certainly the next, and to what final ending? In so referring to twentieth-century war, we are only following the precedent of what we read routinely in our textbooks of European history about the Hundred Years War at the end of the Middle Ages. That war also had its punctuations of peace, or at least absence of overt hostilities.
War is indeed hell in just about every one of its manifestations through history. But for human beings during the past several thousand years it has plainly had its attractions, and also its boons for humanity. The general who said it is “good that war is so hideous; otherwise we should become too fond of it” spoke knowingly of the mental “wealth” that inheres in most wars along with the mental and physical “illth.” So practical and pragmatic a mind as William James believed that we needed a “moral equivalent of war” as the means of attaining the good qualities of war without entailing the evil ones.
Without wars through the ages, and the contacts and intermixtures of peoples they—and for countless aeons they alone—instigated, humanity would quite possibly be mired in the torpor and sloth, the fruits of cultural and mental isolation, with which its history begins. Before trade and commerce broke down cultural barriers and yielded crossbreeding of ideas as well as genetic systems, wars were the sole agencies of such crossbreeding. Individualism, so vital to creativity, was born of mingling of peoples, with their contrasting cultural codes—the very diversity aiding in the release of individuals from prior localism and parochialism, always the price of cultural insularity.
War and change—political and economic foremost, but social and cultural not far behind—have been linked in America from the beginning. War was the necessary factor in the birth of the new American republic, as it has been in the birth of every political state known to us in history. War, chiefly the Civil War, in U.S. history has been a vital force in the rise of industrial capitalism, in the change of America from a dominantly agrarian and pastoral country to one chiefly manufacturing in nature. War, in focusing the mind of a country, stimulates inventions, discoveries, and fresh adaptations. Despite its manifest illth, war, by the simple fact of the intellectual and social changes it instigates, yields results which are tonic to advancement.
By all odds, the most important war in U.S. history, the war that released the greatest number and diversity of changes in American life, was the Great War, the war that began in Europe in August 1914 and engulfed the United States in April 1917. Great changes in America were immediate.
In large measure these changes reflected a release from the sense of isolation, insularity, and exceptionalism that had suffused so much of the American mind during the nineteenth century. The early Puritans had seen their new land as a “city upon a hill” with the eyes of the world on it. It was not proper for the New World to go to the Old for its edification; what was proper was for the Old World, grown feeble and hidebound, to come to America for inspiration. A great deal of that state of mind entered into what Tocqueville called the “American Religion,” a religion compounded of Puritanism and ecstatic nationalism.
What we think of today as modernity—in manners and morals as well as ideas and mechanical things—came into full-blown existence in Europe in the final part of the nineteenth century, its centers such cities as London, Paris, and Vienna. In contrast America was a “closed” society, one steeped in conventionality and also in a struggle for identity. This was how many Europeans saw America and it was emphatically how certain somewhat more sophisticated and cosmopolitan Americans saw themselves. The grand tour was a veritable obligation of better-off, ambitious, and educated Americans—the tour being, of course, of Europe.
Possibly the passage of American values, ideas, and styles from “closed” to “open,” from the isolated to the cosmopolitan society, would have taken place, albeit more slowly, had there been no transatlantic war of 1914–1918. We can’t be sure. What we do know is that the war, and America’s entrance into it, gave dynamic impact to the processes of secularization, individualization, and other kinds of social-psychological change which so drastically changed this country from the America of the turn of the century to the America of the 1920s.
War, sufficiently large, encompassing, and persisting, is one of the most powerful media of social and cultural—and also material, physical, and mechanical—change known to man. It was in circumstances of war in primordial times that the political state arose, and at the expense of the kinship order that had from the beginning been the individual’s sole community. Ever since, war has had a nourishing effect upon the state; it is “the health of the state,” Randolph Bourne observed darkly but accurately, when America went to war in 1917. Werner Sombart, historian of capitalism, devoted a volume to the tonic effects of war on the rise and development of capitalism. But no less true is Max Weber’s pronouncement of war and the barracks life of warriors as the true cause of communism. War communism precedes, indeed gives birth to, civil communism, Weber argued. The Communism of Soviet Russia has been based from the very beginning upon war preparation, upon the Red Army and its absolute power in the Soviet state.
War tends to stimulate intellectual and cultural ferment if only because of the mixture of ideas and values that is a by-product of combat, of victory and defeat, in any war. In both world wars, millions of Americans, men and women alike, knew the broadening and enriching effects of travel abroad, of stations in exotic places for the first time, as the result of enlistment or conscription. Granted that some were killed. Far more were not.
War tends to break up the cake of custom, the net of tradition. By so doing, especially in times of crisis, it allows the individual a better chance of being seen and heard in the interstices, in the crevasses opened by the cracking up of old customs, statuses, and conventionalities. This was remarkably true once the European war touched the millions of lives which had been for so long familiar with only the authorities and rhythms of an existence largely rural and pretty much limited to towns of the hinterland.
Lord Bryce, who loved America, was nevertheless forced to devote a chapter in his The American Commonwealth, published in the late nineteenth century, to what he called “the uniformity of American life.” He was struck by the sameness of the buildings, houses, streets, food, drink, and dress in town after town, village after village, as he crossed and recrossed the country by rail. Not even one great capital, one flourishing city, Bryce felt obliged to report in his classic. That, however, was before the Great War and its transformation of the United States. It brought the literature of “release” in the novels of Sinclair Lewis, Sherwood Anderson, Willa Cather, Ruth Suckow, and others, a literature constructed around the drama and sometimes agony of a protagonist’s escape from Main Street or Winesburg or Elmville or wherever, to the freedoms, chilling as these could be, of a Chicago or New York. In postwar New York, America at last got a true world capital. Much of the dreadful sameness began to crack under the force of the Great War. No wonder this war remains popular in American memory; even more popular than the War of Independence with Britain, which, truth to tell, was observed at the time by a majority hostile or at best lukewarm to it. Woodrow Wilson made the war his personal mission, his road to salvation for not only America but the world; and in the process, he made the war the single most vivid experience a large number of Americans had ever known. Even the casualties among American forces (not many compared to those of France, Britain, Russia, and Germany) didn’t dampen enthusiasm at home; nor did the passage of legislation which put in the president’s hands the most complete thought control ever exercised on Americans.
What the Great War did is what all major wars do for large numbers of people: relieve, if only briefly, the tedium, monotony, and sheer boredom which have accompanied so many millions of lives in all ages. In this respect war can compete with liquor, sex, drugs, and domestic violence as an anodyne. War, its tragedies and devastations understood here, breaks down social walls and by so doing stimulates a new individualism. Old traditions, conventions, dogmas, and taboos are opened under war conditions to a challenge, especially from the young, that is less likely in long periods of peace. The very uncertainty of life brought by war can seem a welcome liberation from the tyranny of the ever-predictable, from what a poet has called the “long littleness of life.” It is not the certainties but the uncertainties in life which excite and stimulate—if they do not catastrophically obliterate—the energies of men.
There is a very high correlation between wars in Western history and periods of invention and discovery. If necessity is the mother of invention, then military necessity is the Great Mother. Roger Burlingame was correct when he said that if war were ever to be permanently abolished on earth, then something would have to be found to replace it as the stimulus and the context of inventions—mechanical but also social and cultural inventions. (When Leonardo da Vinci wrote to the duke of Milan listing his greatest accomplishments as possible stimulus to patronage from the duke, more than half the list consisted of his mechanical inventions. He combined painting and sculpture into one item, the better to give prominence to the mechanical achievements, nearly all of which were military.) America between 1914 and 1918 was no exception. Inventions of preceding years like the telephone and electric light were brought to a higher degree of perfection; so were the automobile, the radio, and the prototypes of what would become cherished household appliances. The federal government, justified and encouraged by war pressure, was able to do what would have been impossible in time of peace: directly encourage and even help finance new, entrepreneurial ventures such as the airplane and radio, each to revolutionize American life after the war.
Advances in medicine rank high among the benefactions of war. The sheer number of the wounded and sick, the possibility—the necessity—of new and radical techniques of surgery, and the focus of effort that war inevitably brings all combine to make times of war periods of major medical advancement, with incalculable boons for posterity. The whole field of prosthetics, for example, opened up in World War I—to be enormously advanced in the Second War—and with it came widespread relief from the obvious disfigurements of war, so abundant and ubiquitous after the Civil War.
Revolution and reform are common accompaniments of modern national wars. America underwent no political revolution as the consequence of either world war, but in each the acceleration of social and economic reforms and the germination of still other reforms to be accomplished later were striking. Not many wartime reforms long survived the wars, but their pattern was indelibly impressed on the reform mind. Without doubt the long overdue enfranchisement of women, which took place immediately after the First War, as did Prohibition, each the subject of a constitutional amendment, was the fruit in large part of women’s conspicuous service during the war in a variety of roles, military and civil, in offices and in factories. The cause of the illiterate was stimulated by the appalling results of the mass literacy tests given recruits in the war; the cause of the unorganized worker was advanced by the special allowance for union labor during the war; the real stimulus to the work toward racial and ethnic equality that has been a prominent part of the social history of the last sixty or so years came from federal agencies in the First World War. It is a matter of frequent note by historians that almost everywhere war needs inspire, in the interest of equity and of social stability, more “socialist” reforms than does the ideology of socialism.
Sometimes, indeed, more than simple reform becomes entwined with war. Revolution takes place. This was one of Lenin’s insights. The German Socialists had made peace and pacifism almost as prominent as the revolutionary cause itself. Lenin broke utterly with this position, insisting that every national war should be supported in one way or other in the hope of converting war into revolution. America did not, of course, go into revolution as a result of the Great War, nor did England or France. But a good many of the countries engaged in that war, on both sides, did know very well, sometimes very painfully, the surge of revolution. What can be said of America in the war is that the people participated widely in a revolutionary upsurge of patriotism and of consecration to the improvement of the world in the very process of making “the world safe for democracy,” as the moralistic President Wilson put it.
Yet another by-product of modern wars, those beginning with the French Revolution at least, is the sense of national community that can come over a people and become a landmark for the future. In the kind of war Americans—and others too—knew in 1917 and again in 1941, there is a strong tendency for divisiveness to moderate and a spirit of unity to take over. This was particularly apparent in America in the First War. It is not often remembered or cited that economic and social tensions were becoming substantial by 1914. Very probably the nearest this country has ever come to a strong socialist movement was during President Wilson’s first term. A great deal was written in those years about “class struggle,” “class revolt,” and “class war” in America. Unemployment was severe in many industries, unions struggled for recognition, the homeless and hungry demonstrated, sometimes rioting, and strikes in all the great industries including mining, steel, and textiles were at times small revolutions. The entrance of the United States into the war in 1917 spelled the end of much of this tumultuous and often violent behavior. Two decades later the same thing would be true for World War II. A full decade of the deepest and most agonizing economic depression America had ever known lasted, the vaunted New Deal notwithstanding, down to our joining the war in Europe and Asia.
But economic prosperity, while vital, is not the same as the sense of community. War induces, especially in fighting forces, a sense of camaraderie and mutual recognition that becomes community. As Remarque wrote in his great World War I novels, the “Western Front” was a torture but so was “the Road Back” to civilian life at the end of the war. Even the trenches could instill a feeling of moral and social community—that was Remarque’s major point, as it was of a number of novelists, dramatists, and poets in the aftermath of the war. World War I, quite unlike its successor a quarter of a century later, was both a singing and a writing war, and in song and letters the theme of war’s spur to comradeship and the mordant sense too of the “spiritual peace that war brings,” to cite the British L. P. Jacks, are striking.
War is a tried and true specific when a people’s moral values become stale and flat. It can be a productive crucible for the remaking of key moral meanings and the strengthening of the sinews of society. This is not always the case, as the American scene during the Vietnam War made painfully clear. But that war is more nearly the exception than the rule. Even our divisive, sanguinary, radical Civil War produced a reseating of values, with the nation for the first time exceeding the regions and states in political importance.
Rarely has the sense of national community been stronger than it was in America during the Great War. True, that sense had to be artificially stimulated by a relentless flow of war propaganda from Washington and a few other pricks of conscience, but by the end of the war a stronger national consciousness and sense of cohesion were apparent. But, as we know in today’s retrospect, with these gains came distinct losses in constitutional birthright.
All wars of any appreciable length have a secularizing effect upon engaged societies, a diminution of the authority of old religious and moral values and a parallel elevation of new utilitarian, hedonistic, or pragmatic values. Wars, to be successfully fought, demand a reduction in the taboos regarding life, dignity, property, family, and religion; there must be nothing of merely moral nature left standing between the fighting forces and victory, not even, or especially, taboos on sexual encounters. Wars have an individualizing effect upon their involved societies, a loosening of the accustomed social bond in favor of a tightening of the military ethic. Military, or at least war-born, relationships among individuals tend to supersede relationships of family, parish, and ordinary walks of life. Ideas of chastity, modesty, decorum, respectability change quickly in wartime.
They did in Puritan-rooted America during World War I—changed radically in many cases, and irreversibly. Mars and Venus cavorted, as they always had in time of war, and not least in America. When the brave young doughboy in the AEF was about to go overseas, perhaps to his death, couldn’t his sweetheart, even the girl next door, both honor and thrill her swain? Of course she could—in life and voluminously in fiction. The relaxation not only of ancient rules and dogmas in the spheres of marriage and family, religion and morals, but also of styles of music, art, literature, and education, although concentrated in the cities, nevertheless permeated the entire nation.
So, above all, did the new spirit of materialistic hedonism, the spirit of “eat, drink, and be merry” with or without the “for tomorrow we die,” capture the American mind during the war. The combination of government-mandated scarcities in some areas, as in meat, sugar, and butter, and the vast amount of expendable money from wages and profits in the hands of Americans led to a new consumer syndrome, one that has only widened ever since World War I and has had inestimable impact upon the American economy. Manufacture of consumer goods directed at the individual rather than the family greatly increased, further emphasizing the new individualism and the new hedonism of American life.
The American Way of Life proved both during and after the Great War to be exportable, to peoples all over the world. These peoples may have had an inadequate grasp at the end of the 1920s of just where America was geographically and just what it was made of mentally and morally, but they had acquired during the decade a lively sense of Coca-Cola, the Hamburger, Hollywood Movies, Jazz, Flappers, Bootleg Gin, and Gangsters. The flapper came close to being America’s First Lady in the movie houses of India, China, Latin America, and other abodes of what today we call the Third World. On the evidence of tickets bought, they adored what they saw almost as much as did the American people. Despite Prohibition, drinking was in to a degree it had never achieved when legal—that is, among young people of both sexes but generally middle-class by the end of the twenties. The gangster and the cowboy both achieved a fame in that decade earlier denied their prototypes.
The 1920s was par excellence the Age of Heroes. The age had begun in April 1917 when soldiers, from Black Jack Pershing at the top down to Sergeant York, were given novel worship by Americans at home. The spell lasted through the twenties to include heroes of the industrial world like Ford and Rockefeller; of the aviation world like Lindbergh and Earhart; of the sports world like Babe Ruth, Red Grange, Knute Rockne; and of the movies like Chaplin, Fairbanks, Swanson, and Pickford. To this day names such as these are more likely to come off the American tongue than are those of any living heroes.
Almost everyone and everything became larger than life for Americans during the First World War. This began with the armed forces we sent over to Europe, a million and a half strong by the end of the war. Promotions were numerous and so were medals of one degree or other for valor, each with full publicity on the scene and back home. No military breast looked dressed unless rows of ribbons and galaxies of medals adorned it. Rife as decorations were, though, in World War I, these were almost as nothing compared with World War II. And the tendency has heightened immeasurably since that war. One illustration will suffice: In the recent, embarrassingly awkward invasion of tiny Grenada, when three American services, army, navy, and marines, were brought to combat six hundred expatriate Cuban construction workers, less than half of them armed, victory, if that be the word, was celebrated by the issuance of eight thousand decorations—there and back in Washington.
As is so often the case in history, what began in the military spread quickly to nonmilitary society during the First World War. Under George Creel, President Wilson’s czar of war propaganda, about whose activities I shall say something in the next chapter, the custom arose of Home Front awards, honors, and decorations. Farmer of the Week, Worker of the Month, Lawyer of the Year, Surgeon of the Decade—these and many, many other honors festooned once quiet, modest, and shy America. The custom lasted, growing spectacularly during the 1920s, slackening somewhat in the 1930s, but regaining speed during World War II and thereafter. Today American professions spend uncountable hours giving awards to themselves. The academic profession probably leads, with journalism a close second, but lawyers, bankers, and dry cleaners are not far behind either.
A possibly more important, more creative change that came with the Great War in America was in language, written as well as spoken. It became obviously bolder than it had ever been in American history—yet another boon, or casualty, of the Great War and its smashing of old icons of respectability and conventionality. In journalism the tabloid flourished, and a newspaper vernacular came close to driving out the statelier papers such as the Boston Transcript and the New York Sun. Just as newspaper reporters had at last found a prose that brought the realities of war a little closer to readers, so, in the 1920s, they found a prose for the retailing of sex, murder, scandal, and other of the seamier aspects of life that was far more vivid than anything before. One of the great accomplishments of the respected novelists, dramatists, and critics—Hemingway, Dos Passos, Fitzgerald, Anderson, O’Neill, Mencken, and others—in the twenties was a sharper, terser, more evocative language than had prospered in the Gilded Age.
All in all, the America that came out from a mere year and a half of the Great War was as transformed from its former self as any nation in history. The transformation extended to fashions and styles, to methods of teaching in the schools, to a gradual letting down of the barriers between men and women and between the races, to informalities of language as well as simple habits at home and in the workplace.
It is not often realized that among war’s occasional tonic attributes is that of distinct cultural renascences, brief periods of high fertility in the arts. Here too we are dealing with results of the shaking up of ideas and values that so frequently goes with war in history. To examine such a work as A. L. Kroeber’s Configurations of Culture Growth, a classic in comparative cultural history, is to see the unmistakable and unblinkable connections between wars and immediately subsequent years of creativity in literature, art, and kindred disciplines. The celebrated fifth century in Athens began with the Persian War and concluded with the Peloponnesian. Rome’s greatest period of cultural efflorescence, the first and second centuries, is inseparable from European and Asiatic wars. The Augustan Age emerged directly from the Civil Wars. In the more recent ages of Elizabeth I and of Louis XIV, and in the Enlightenment, we are dealing with distinct periods of cultural fertility which are inseparable from the wealth, power, and ferment of wars.
We don’t often think of the 1920s in America as one of the more impressive intellectual and artistic outbursts in history, but it was. In terms of literature, we are obliged to go back to the American renascence just prior to the Civil War: to the single decade, perhaps, of the 1850s when works came forth from Melville, Hawthorne, Whitman, Emerson, Thoreau, among others—a constellation of creative genius that can pretty well hold its own in world competition.
The 1920s may not quite match the 1850s, but we are nevertheless in the company of novelists of the stature of Faulkner, Cozzens, Hemingway, Fitzgerald, Dreiser, Glasgow, Lewis, and others; the poets Eliot, Pound, Frost, Robinson; and intellectual czars—a new breed—who had H. L. Mencken at their head. The war figured prominently in the early works of some, though not all, of the novelists: Dos Passos, Faulkner, Hemingway in some degree, Fitzgerald in less, and the psychological atmosphere of war in these novels was unfailingly one of disenchantment and repudiation. The literature of disenchantment with war was much more abundant in England and on the Continent than it was in America; and well it might be, given the four long, bloody, shell-shocking, and mind-numbing years in the trenches that the Europeans, unlike the American soldiers, had had to endure.
Even more historic and world-influencing than our literature of the twenties, however, was our music of that decade: first and foremost jazz in all its glories, ranging from blues to early swing; very probably nothing else of a cultural nature is as distinctly and ineffaceably tied to the American matrix as is jazz, in composition and in voices and instrumental performances. But in the musical theater of Kern, Rodgers, and Hart in the twenties America took a lead in the world that would continue for close to fifty years. These names, and also of course those of Gershwin, Berlin, and Porter, were as lustrous in the cities of Europe and Asia as in the United States.
Hollywood perhaps became the American name of greatest reach in the world. Well on its way before the Great War, it was helped immeasurably by the war; when the federal government took over the movies for propaganda uses, an assured supply of funding made possible a great many technical as well as plot and character experiments which might have been slower in coming had there been no war. And of course the opportunity to cover the actual war in Europe, its details of action, death, and devastation, provided a marvelous opportunity for further experimentation. There were excellent movies made in the 1920s in America—movies fully the equal of those in Germany and France—on war, its carnage and tragedy, romance and heroism. In any event, it is unlikely that the phenomenon of Hollywood—its tales of actors and actresses, producers and directors as well as the remarkable quantity and quality of its films—would have burst forth as it did in the 1920s had it not been for the heady experience of the war. In art as in literature and philosophy, war can bring forth forces of creative intensity.
There was of course the myth of the Lost Generation to occupy memoirists, critics, and romantics through the 1920s and after. I shall say more about this myth in the final chapter. It will suffice here to emphasize that apart only from the appalling loss of a whole generation of brilliant minds in the trenches, there really wasn’t any such thing—only the literary rumor thereof.
In sum, in culture, as in politics, economics, social behavior, and the psychological recesses of America, the Great War was the occasion of the birth of modernity in the United States. It is no wonder that so many historians have adopted the stereotype of the Age of Innocence for what preceded this war in American history.
Another national legacy of the Great War is what I can think of only as the Great American Myth. This is the myth—it sprang into immediate existence with the armistice in 1918—that America, almost single-handedly, won the war. Such was American prowess in war, derived from clean living and good hearts, that it did in a matter of months what the British and French had been at unsuccessfully for more than two years: that is, lick the Hun. In such popular magazines as American, Everybody’s, The Literary Digest, and The Saturday Evening Post, and in local newspapers everywhere, staple pieces would appear beginning “The reason the American doughboy won the war for the Allies was . . .”. There would follow reasons ranging from the Puritan ethic all the way to the “fact” that Our Boys all came from farms where they had plenty of milk and butter, learned to shoot squirrels with deadly efficacy, and could fix anything that broke with a hairpin.
But whatever the reason was, it is doubtful that any American failed to believe, in the twenties, that American soldiers had a genius for war; could, like Cincinnatus of early Rome, take their hands from the plow one day and fight valorously for country the next. In some degree the myth is a corollary of what Lord Bryce called “the fatalism of the multitude” in America: a belief, nay, a compulsion exerted by belief that America had a special destiny of its own—one that from its beginning as a “city upon a hill” in Puritan Massachusetts, through Colonial days, the Revolutionary War, the winning of the American continent in the nineteenth century, would carry America, perhaps alone among all nations, to an ever more glorious fulfillment of birthright. Such was the exceptional fate under which America lived, that she didn’t have to be concerned about all the cares and worries, the forethought, prudence, and preparation for the worst that other nations did.
The Myth would be a commonplace, no more than a charming conceit of a kind found perhaps in every people, were it not for the fact that it was and is taken sufficiently seriously by many Americans as to become a utopian block to the military preparation and industrial skill that any nation must have, even if composed of supermen. The Great Myth was operating in full force when the Second World War broke out and it operates today in the form of tolerance of a Pentagon bureaucracy that chokes off initiative and perseverance.
The stark, agonizing truth is, we Americans have not been good at war, and particularly conventional war fought on land. We won our independence from Britain all right, but it’s best for the patriot not to dig too deeply into the reasons, which include key help from abroad, halfheartedness on the part of Britain, and quite astounding luck, benign accident. We were a ragtag lot, and most of the time the Continental Congress acted as if it were more afraid of a bona fide American army coming out of the war than of a British victory.
Our first war as a new nation, the War of 1812, was rashly declared by Congress, and it proved to be a mixed bag indeed for the United States. At Bladensburg our militia was routed without serious struggle, and the diminutive President Madison, seeking to demonstrate that he was the troops’ commander in chief, was very nearly captured by the British. There followed the burning of Washington, including the White House, or much of it, and the torching of dozens of settlements on Chesapeake Bay. We were no better on the Canadian border. True, we saved Baltimore and just after the war was ended, Andy Jackson was able to become a hero at New Orleans. Not much else.
In the nineteenth century we were good at beating the Mexicans, but less good at handling the American Indians in pitched battle. From the remarkable Tecumseh and his Red Stick Confederacy in 1809 to Sitting Bull at Little Bighorn in 1876, white Americans were ragged indeed. The West was won more by the momentum of westward expansion than by crucial battles with the Indians, whom we eventually “defeated” almost genocidally through malnutrition, disease, and alcohol. No Federal leader in the Indian wars equaled Tecumseh and Sitting Bull. Custer’s inglorious end at Little Bighorn is not a bad symbol of the whole of the Indian wars.
The Civil War produced, after a year or two of battles, at least two first-rate generals and some superb troops. Unfortunately these were not part of the United States forces; they belonged to the Confederate States of America. This is no place to play the game of “what if,” as in, what if the South had won at Gettysburg? But the very existence of the question attests to the nearness of at least temporary Confederate victory. The United States won in the end—after the unhappy Mr. Lincoln finally got rid of timid or inept generals—through the crude but effective bludgeonings by Grant’s mass army and the organized terror waged in Georgia by General Sherman.
Over the Spanish-American War, a decent curtain will be lowered here.
The American Expeditionary Force of 1917 arrived in France almost three full years after the trench slaughter there and on the Eastern Front had begun. The Allies were indeed glad to welcome the American soldiers, who did well; not brilliantly, but well, all things considered. We had our requisite heroes—Sergeant York; dashing, brilliant Doug MacArthur; Black Jack Pershing, whom a grateful Congress elevated overnight to the rank of George Washington himself; and others—to hear about for years and years in the thousands of little towns in America. In all truth, it is quite possible that had the war lasted a couple of years beyond 1918, had more American divisions actually been blooded in battle, and had it been given, in short, the time and seasoning necessary, the AEF might have become a sterling fighting force. But we were Over There for a pitifully short time, from the military point of view.
The American public, however, and, sad to say, the professional military in America, saw it differently. Our Boys had the strength of ten, and after the imperialist-minded, materialistically motivated British and French had stumbled and bumbled for two and a half years, Our Boys cleaned up the job. The Great American Myth gave birth to other myths: Can Do, Know How, and No Fault, myths which abide to this minute in America and yield up such disasters as Korea, Vietnam, Iran, Lebanon, and Grenada.
Under the spell of the myth, Americans begin anticipating victory and peace at about the time war is declared. In World War I and World War II, spurred on by editors and broadcasters, they were chattering within months about getting The Boys home for Christmas.
Our civilian recruits in World War II had hardly been in training six weeks when an eager citizenry proudly declared them “combat-ready right now.” Sadly, some of our military leaders exhibited the same impetuous innocence. When Churchill was taken by General Marshall and other officers to witness for himself the “readiness for combat” of trainees at a South Carolina camp, Churchill bruised some feelings, we learn, by declaring that “it takes at least two years to make a soldier.” So it does. But the Great American Myth says otherwise, and it is seemingly indestructible.
A notorious and potentially devastating instance of the myth was the American shrilling for a Second Front Now in 1942—a shrilling, alas, joined in by Roosevelt and, nominally at least, Marshall and the other Joint Chiefs. They were unimpressed by the nearly fatal experience of the British at Dunkirk in 1940; and they would remain unimpressed by the utter failure in August 1942 of the largely British-Canadian Dieppe assault in France, in which thoroughly trained, seasoned attack troops five thousand strong were repulsed easily, with 70 percent casualties, by German forces well emplaced and armed.
To be sure, Stalin, threatened by Hitler’s armies in the east, was noisily demanding such a second front, in the process calling Churchill and the British cowardly; but even without Stalin’s demand in 1942—instantly echoed, of course, in both England and the United States by Communist parties and their multitudinous sympathizers among liberals and progressives—the Great American Myth, the myth of Can Do, of effortless military strategy and valor, that is, American Know How, would have kept the cretinous pressure going for a storming of the cross-channel French coast, awesomely guarded by the Germans, in the fall of 1942 and early 1943.
As thoroughly mythified as anyone else, President Roosevelt himself developed a plan, as he called it, for such a blind, suicidal frontal assault by the British and Americans (in the very nature of things in 1942, overwhelmingly British) on the French coast directly across the channel. He wrote Churchill that such was the importance of his “plan” that he was sending it over by General Marshall and his aide Harry Hopkins, so that they might explain its merits personally to Churchill and his military chiefs. The decision to storm the French coast must be made “at once,” declared Roosevelt through his envoys. Since only five American divisions would be ready by the fall of 1942, “the chief burden” would necessarily fall on the British, the President charmingly explained. By September 15, America could supply only “half” of the divisions necessary, that is, five, and but seven hundred of the necessary five thousand combat aircraft. FDR’s plan foresaw a first wave of six divisions hitting “selected beaches” between Le Havre and Boulogne. These would be “nourished” at the rate of one hundred thousand men a week. The whole war-ending operation must begin in late 1942 and reach climax in 1943.
What the British, starting with Churchill, really thought of this incredible nonsense we don’t know. Keeping the Americans happy in their choice of Europe First, Japan Second, was of course vital, imperative diplomacy for the British. Thus while offering frequent overt reactions of the “magnificent in principle,” “superbly conceived,” and “boldly projected” kind, the British leaders made immediate plans, we may assume, for weaning the Americans from a 1942 channel assault to North Africa, eased by a pledge that the so-called Second Front would take place in 1943.
Today, looking back on what was required in June 1944, two years after Roosevelt’s plan was unveiled before the eyes of Churchill—required in the way of troops, landing craft, mobile harbors, planes, ships, materiel of every kind, and required too in the way of sheer luck—we can only shudder at the thought of a Normandy invasion beginning in the fall of 1942, less than a year after Germany brought the United States into the European war by its declaration of war on America.
Only the Great American Myth can possibly explain the rashness, the foolhardiness, of Roosevelt’s proposal and the at least ostensible endorsement of it by American generals. Powerful defenses manned by the highly efficient German army, the treacherous currents of the channel, the terrible weather problems, the enforced insufficiency of men and materiel—what are these as obstacles when the invading troops will be joined by Our Boys, fresh from the farms, hamlets, and towns of America the Beautiful, the spirit of Galahad in every soldierly breast?
The Great American Myth fell on its face, though, in North Africa when, following our first eager and confident efforts, there were serious and indeed embarrassing reverses to American troops, whose officers were disinclined even to receive, much less ask for, advice from the well-seasoned British. The Great American Myth, absorbed in basic training, at first stood between American officers and even recognition of the sad state of their strategy and tactics. The American bumblings began in Tunisia in late 1942 and were still only too apparent in the first months of 1943, nowhere more humiliatingly than at Kasserine Pass where in addition to inflicting heavy casualties on the Americans, the openly contemptuous Germans took away half of their strategic weapons. Relations between the Americans and the British were precarious indeed, requiring constant attention by both Churchill and FDR.
American courage was not in doubt; nor was potential once adequate time and opportunity for experience had been provided. Nevertheless, the embarrassing fact is, the Americans, including Marshall and Eisenhower, who had argued so strongly for a Second Front Now on the fearfully manned and armed Normandy coast, with all stops pulled out on Can Do, looked pathetic in the far easier circumstances of Tunisia. And matters weren’t different in the Pacific so far as land forces were involved. An infantry division trained for a year in the hot desert was sent, in desert clothing, for its first assignment to the bitterly cold and wet Aleutians, yielding a record toll of incapacitating frostbite. Hundreds of marines were slaughtered in the beachhead of Tarawa, largely as the result of command failure to use intelligence and readings of charts of coastal waters and island detail. Marines, it was trumpeted, Can Do and already have innate Know How. Presumably the hapless marines in Lebanon, over two hundred in number, were ascribed the same innate attributes when they were sent by Reagan in 1983 without arms, without vital intelligence, and without instructions—ending up, as we know, without lives.
The entrance of America in military array into Vietnam was begun by the Kennedy administration apparently for no other reason than the impulse to show the world academic Know How of the sort illustrated by McNamara, Bundy, Hilsman, and Arthur Schlesinger, Jr., among others. We lost Vietnam after an unprecedentedly long war, one hugely expensive in lives and dollars. Desert One, in Iran, was an incredible mishmash of sheer unpreparedness and incompetence of leaders. Tiny Grenada attracted three American services—bumbling, Abbott and Costello–led services, alas—to deal with not more than two hundred armed Cubans. Most recently we have had the Freedom Fighters, and an entry into the Persian Gulf, to convoy tankers, without minesweepers!
Before leaving the myth, it is worth noting that it operates, and perhaps nowhere else so fatefully, in every new president’s conception of himself and his command of foreign affairs. Since FDR it has become almost de rigueur for each president to make it plain to all that he will be his own secretary of state, his own master of foreign policy. The next step is usually that of creating the impression that he not only doesn’t need the State Department and congressional committees to help him, but also frankly finds their presence inimical to the new, suddenly revealed, foreign policy that he—and perhaps a Colonel House or Harry Hopkins or William Casey, but no one else—intends to broadcast to the world.
Churchill, the greatest leader yielded by the war and indeed the century, reported to his War Cabinet every day on his activities; he consulted his assembled chiefs of staff regularly; he reported periodically to Parliament; and he drew constantly on the permanent secretariat, the body of specialists that stayed through all changes of government. He would not sign the Atlantic Charter aboard the battleship off Newfoundland until its full text had been cabled to the War Cabinet and a reply received. He was still the leader.
Roosevelt saw fit to consult no one but Hopkins and Sumner Welles about the charter; the idea of getting the counsel even of officers of the State Department, much less of congressional committees, would have made him laugh. He knew what was needed and right; experts were unnecessary and actually obstructive. FDR had never met Stalin or any other high Soviet leader; he had never been to or even read particularly about the Soviet Union. But he obviously felt the full impulse of the Great American Myth when he wrote Churchill three months after entry into the war that he “could personally handle Stalin” and do so far more ably than either the British Foreign Office or the American State Department. What Churchill thought on reading this he never told the world, contenting himself with merely including the Roosevelt message in his war memoirs.
Just as each new president must show his spurs by deprecating State Department and congressional committees in foreign policy, so, it seems, must each new National Security Adviser to the president. He too, under the Great Myth, immediately knows more than Congress or the Departments of State and Defense about any given foreign or defense issue that arises. Watching Kissinger perform as National Security Adviser, to the confusion of the State Department and congressional committees, we might have foreseen a day when a National Security Adviser would appear in admiral’s uniform and define his role as that of excluding not only Congress and the Departments of State and Defense from knowledge of purportedly covert NSC operations but even his very boss, the president of the United States.
Add to what has thus far been said about the Great Myth and American Know How the attribute of No Fault, and we have the myth fairly well identified. Presidents, secretaries, and generals and admirals in America seemingly subscribe to the doctrine that no fault ever attaches to policy and operations. This No Fault conviction prevents them from taking too seriously such notorious foul-ups as Desert One, Grenada, Lebanon, and now the Persian Gulf.
The spirit of ingrained Know How is by no means limited to the American military and the national government. Corporate America and Wall Street both bring the Great American Myth to conspicuous presence regularly. Until Black Monday, October 1987, even such unprepossessing goings-on as insider trading, hostile takeovers, flaunting of junk bonds, and golden parachutes were widely regarded by brokers and economists alike as basically healthful, nicely attuned to economic growth and productivity.
We shall not soon forget the efflorescence of the Myth in Detroit for perhaps twenty years after World War II when a vast litter of unsafe, low quality, ugly, and expensive automobiles were the issue of the Know How, Can Do, and No Fault psychology of the auto industry. Not even Ralph Nader would have effected salutary change in Myth-beset Detroit had it not been for the ever-widening competition—and here at home where it hurt—from Japan, West Germany, and other countries.
The Great Myth provides a warm and lustrous ambiance for our towering national debt of close to three trillions, our annual budget deficits, now at two hundred billion, and our even more hazardous trade deficits. Only the intoxicating presence of the Great Myth can explain how otherwise sane and responsible people, including financial editors and professional economists, find not only no danger in such a mess of debts and deficits, but actual nutriment of economic equilibrium and growth. Historically, large and prolonged national budget deficits have been almost uniformly regarded by experts as potentially crippling to any society. So has lack of savings and of investment in business been generally regarded as putting an economy in jeopardy. Consumer hedonism with its vast consumption of the fragile and ephemeral has always been looked at with apprehension by statesmen. But during the years of Reagan and his all-time record-setting deficits and debt increases a new school of thought has emerged, one that declares debts, deficits, trade imbalances, and absent savings to be forces for the good, requiring only, if anything at all, substantial tax cuts. Needless to say, the rest of the world, starting with Japan, can only look wonderingly at the U.S. The God who looks out for fools and drunks is indeed needed for the Republic.
Fascination with the amateur steadily widens in America—amateur in the sense of unprepared or inexperienced. We scorn professionality in appointments of officials ranging from the Librarian of Congress to Secretary of State. A Martian might think experience in national and international affairs the first requirement of the Presidency. Not so, for we fairly regularly and confidently elect Coolidges, Kennedys, Carters, and Reagans to the White House as if there were a positive advantage in being ignorant or inexperienced in national and international politics. Both Carter and Reagan seemed to think this was the case when they ran for office. So, obviously, did a great many Americans. It’s an old passion. In the twenties there were millions who begged Henry Ford to step down from Dearborn to Washington and “get things straightened out.” In 1940 there was Wendell Willkie and then Thomas Dewey, the while a Robert Taft could only gaze from the sidelines. On the whole it seems the Republicans are more easily dazzled and motivated to go for amateurs than are the Democrats. But it’s a bipartisan failing, for the Great American Myth is everywhere. Naturally the first thing an amateur does when elected to the White House is appoint fellow-amateurs—not only to innocuous posts such as the Librarian of Congress but to State, Treasury, the CIA, Defense, and so on, not forgetting vital ambassadorships.
From McNamara to Weinberger we have seen, off and on, amateurs as Secretary of Defense. And from McNamara’s TFX and computerized body counts to current miseries with the Bradley, the Sergeant York, the MX, and the B-1 bomber there has been a steady roster of the failed and abortive. Ten years after the mesmerizing RDF small units were announced, the Pentagon is still struggling to put one such military unit into being and action. The Pentagon, alas, has penetrated large areas of our economy and also, much more important, our universities and their research laboratories. We have not been in a major war since 1945, except perhaps for Vietnam, which was selected by the Kennedy White House as a simple counterinsurgency operation—nothing, really, but small wars, calling for special military units.
Why, then, so immense a military? The immediate answer is invariably the Cold War with the Soviet Union. The answer is indeed worth respectful consideration. The record is plain that once Japan was defeated in late 1945, America commenced an immediate pell-mell, helter-skelter demobilization that might well have denuded the nation of a military in the same measure that it did after the First World War. This demobilization stopped for one reason alone: the voracious Russian appetite for land and power that could no longer be hidden once V-E Day came in Europe. In Poland, in the Baltic states, in the Balkans, in Iran, and in the Far East, Stalin either entered or else shored up and consolidated lands and nations he had already appropriated during the final months of the war. The roots of the Cold War are, of course, in these acts of aggrandizement, which became steadily more odious to Americans after the war, and also, by implication, threatening. But the Cold War began in full fact when Truman gave protection to Greece and Turkey, at Britain’s urgent request, and Stalin realized that the United States would no longer tolerate what Roosevelt had during his presidency, when his mind was stubbornly set on winning Stalin’s friendship and postwar favor.
But with all respect to the Cold War and to the highly militaristic, imperialistic nation that wages it on America, it alone is not enough to explain either the size or the type of military establishment we now have on our hands. The Cold War does not by itself come close to explaining the sheer size of the budget, well over three hundred billions a year, much less some of the specifications which are involved in military budgets. Surely a six-hundred-ship navy based upon aircraft carriers and battleships is not a requisite for any conceivable war with the Soviet Union, a war that would inevitably be land-based. The very real potential menace of the Soviets doesn’t require, surely, to make it believable to the American public, that we sweep into the American-Soviet maw every little brushfire war that breaks out in Africa, the Middle East, and Latin America. The confrontations of doves and hawks, both in government and among political and military intellectuals, do indeed involve the Soviets from time to time, chiefly in respect of the size and type of our nuclear force, but far more of such confrontations are pivoted upon incidents and outbreaks only dimly connected with the Soviet Union. The Soviets just won’t pass muster as the cause of everything—Korea, Vietnam, the Dominican Republic, South Africa, Iran, Lebanon, Grenada, Central America, the Persian Gulf, and so on—that we have on our post–World War II record.
There are two powerful, and by now almost inescapable, forces which operate to yield America an ever-larger military. By this late part of the century, after two world wars, a string of smaller ones, and forty years of the Cold War, these two forces would surely continue to operate even if the Soviet Union were miraculously transformed into a vast religious convent. Together the two forces, the one rationalistic, the other moralistic, conjoin irresistibly in our present society.
The first was noted profoundly by President Eisenhower in 1961 in his cogent farewell remarks. He warned Americans against what he called the “military-industrial complex” and also the “scientific-technological elite.” Taken in its entirety the Eisenhower farewell address is as notable as was that of George Washington. It deserves fully as much attention as the Washington address has received over the years.
Ike was struck by how, during the Cold War—a war he believed had to be waged, given the nature of the Soviet Union—the military and the whole armaments-defense private sector had become interlocked fatefully. Each grew stronger from the nutriment supplied by the other. He was also struck by the sheer internal, indigenous power of the scientific-technological elite in the United States and its attraction to war and the military as a vast, virtually free laboratory. Moreover, Ike added, our tendency since World War II has been to meet the threat of Soviet power through “emotional and transitory sacrifices of crisis” rather than through considered planning that would meet foreign dangers without ripping the fabric of American life, without incurring expenses so vast as to worry the most dedicated of patriots. There is, Eisenhower continued, “a recurring temptation to feel that some spectacular and costly action could become the miraculous solution of all current difficulties.” Could President Eisenhower have been thinking about our current Strategic Defense Initiative, or Star Wars, project, hailed in the beginning as a canopy over our heads that would forever shield us from nuclear weapons, and now estimated to cost a full trillion dollars to deploy in full—assuming it can ever be deployed at all?
The cost of alleged scientific miracles is probably less, though, than the total costs of what may from one point of view be called the militarization of intellectuals and from another point of view the intellectualization of the military. I am thinking of the fusion of the military and the university during the last half-century. Eisenhower offered this warning also in his farewell remarks: “The prospect of domination of the nation’s scholars by federal employment, project allocations, and the power of money is ever present—and is gravely to be regarded.” He cautioned too: “Partly because of the huge costs involved, a government contract becomes virtually a substitute for intellectual curiosity.”
Eisenhower was warning primarily of what may happen to the universities as a result of their compulsive willingness to adapt, readjust, and refashion in the interest of the huge sums ever ready to be granted by the military. But a moment’s thought suggests the reverse conclusion: The power of the university and the university culture in this country is such that by virtue of its union with the military, the whole nature and function of the military establishment could become altered, and not necessarily for the better. But whichever conclusion we choose to accept, the symbiotic relationship between the military and a large and increasing part of the university world is only too obvious. The university thus joins the corporation and the technological institute in becoming willy-nilly a complex, possibly deep pattern of culture. The economy has a vested interest in the prevalence of war; that is obvious. Does the university? That could have seismic effects in the academic world.
The military, or defense, intellectual is one of the flowers of the present age, and also one of the great drawbacks in our military establishment. Probably McNamara as Secretary of Defense under Kennedy has the dubious honor of being the first intellectual to demonstrate to the professional military exactly how wars should be fought. His punishment by the gods for hubris consisted of participation in the Bay of Pigs fiasco and then his appalling leadership in the American buildup of troops in Vietnam. But there were other military intellectuals under Kennedy: Bundy, Hilsman, Rostow, and the never-to-be-forgotten Daniel Ellsberg of Defense. Who will forget the saga, now firmly in our schoolbooks, of how our military intellectuals were “eyeball to eyeball” with Khrushchev and the Soviets over Soviet missiles being fixed in Cuba? It is only today, twenty-five years later, that the truth is coming forth from the aforementioned hawks, and we now learn that they were not intellectual hawks at all but doves dressed like hawks, eager to make conciliatory gifts to the Soviets and to adopt secret backup lines in the event Khrushchev became hard-nosed and stubborn.
It was in the Kennedy administration that the unlamented, embarrassing Project Camelot was conceived and shortly aborted. This was a covert operation based secretly at American University in Washington and manned largely by academics and free-lance intellectuals who were apparently enchanted by the image of Kennedy and his intellectuals at the top and made eager to earn a few spurs themselves as covert hawks. A couple of dozen professors from some of America’s better universities collaborated with the military to work out an intellectually—sociologically, anthropologically, and psychologically—sound covert operation by which America, with or without green berets, could spark counterinsurgency operations in countries where the resident government seemed perhaps unable to cope. Chile, of all places, was tagged secretly as the proving ground for the scheme. One of the academics became conscience-stricken, however, and blew the whistle on the absurd venture, thus arousing the ire of the Chilean government, the front-page attention of the Washington Star, and an investigation by a special committee of Congress. Although the very participation of a large gaggle of American academics attests once again to the Great Myth, it has to be said that under the fools’ luck codicil of the myth, all participants escaped with nothing more than temporary embarrassment.
There is no evidence that I know of that McNamara’s career as military intellectual—complete, it will be remembered, with computerized body counts and TFX monstrosities—has been bettered since by any of his by now multitudinous flock of followers. More and more centers, think tanks, and institutes in Washington are directed to war policy and war strategy, and to war intelligence. Hardly a night goes by without one or other military intellectual appearing on the television screen to clear up confusions about war and the military. Intellectuals come as “terror experts,” “strategy analysts,” “intelligence consultants,” and no one ever seems interested in where these ever-voluble experts acquired their credentials.
The liaison between scientist and technologist—connected inevitably with the liaison between the military and the corporate world—is especially productive of vast military establishments in present-day America. Eisenhower could have elaborated had he chosen to do so, even back when he said his farewell. But that day is as nothing compared to our own. It is almost as though the scientific-technological has become an immense creature with life and energy of its own. A perceptive article in Barron’s (August 17, 1987) presents a list of “current programs,” “new programs recently authorized,” and “programs emerging from development within 5 years.” Secret programs are not listed; those that are, run into the dozens and include both nuclear and conventional military technology.
Barron’s correctly features the astronomical costs of even the overt part of the weapons program, costs which when they reach a certain not easily specifiable point will be repudiated by the people and Congress, thus presenting one kind of defense crisis. Another kind of crisis we are perhaps already reaching is that between the seemingly infinite productivity of the strictly scientific-technological elites and the very finite capacity of fighting forces, right down to the individual soldier, for assimilating all the wonders of design, for adapting them to the harsh and unforeseeable realities of the battlefield. It is as though the scientific-technological community takes on a life of its own in the design and development of weapons, a life that becomes dangerously aloof to the needs of the soldier. Given this psychology, this urge to impress fellow scientists irrespective of cost or ultimate utility, it is scarcely remarkable that the defense budget skyrockets annually, and the list of unassimilable “problem” designs—such as the unlamented TFX under McNamara, the B-1 bomber, the M-1 tank, the Sergeant York, and the General Bradley troop carrier—keeps growing. In each of these, it would appear, the largely irresponsible imagination of technological designers has outstripped military practicality and basic need on the field.
Electronic and computerized equipment becomes more and more complicated as well as expensive. Soldiers dependent on such equipment are that much more vulnerable in war. When the frigate Stark was badly crippled in the Persian Gulf by an Iraqi missile launched from a plane, it was not the complex, exquisitely sensitive radar computer system but rather a sailor in the crow’s nest—the oldest form of seagoing intelligence, we may assume, in history—that alerted the ship’s commander and crew; too late, tragically, but that wasn’t the sailor’s fault.
When one reflects a moment on the failure of the computerized, electronic, mechanized system to do what it was supposed to do, warn of approaching planes and their missiles, and thinks also of the fact that in the end it was the human being using only his own eyesight who put up any kind of action whatever, one can’t help mulling over something like the Strategic Defense Initiative, Star Wars. When it, in its trillion-dollar splendor, is finally deployed in space, with the security of the United States officially declared dependent upon it, will it perhaps turn out that just as the computerized Stark in Persian waters required the sailor in the crow’s nest, the operation of SDI will require the eyes and ears of many thousands of civilians standing watch? Apparently we’ll never have a chance to know, for the first use of SDI in a holocaustic emergency will be final, one way or the other.
Even if there were no Soviet Union or its equivalent to justify our monstrous military establishment, there would be, in sum, the whole self-perpetuating military-industrial complex and the technological-scientific elite that Eisenhower warned against. These have attained by now a mass and an internal dynamic capable of being their own justification for continued military spending. That is how far the military—its substance and its mystique—has become fused with economic and intellectual life. Take away the Soviet Union as crucial justification, and, under Parkinson’s Law, content of some kind will expand relentlessly to fill the time and space left.
Giving help and assistance to Parkinson’s Law in the predictable prosperity of the military establishment in our time is what can only be called Wilson’s Law. That is, Woodrow Wilson, whose fundamental axiom “What America touches, she makes holy” was given wording by his great biographer, Lord Devlin. The single most powerful cause of the present size and the worldwide deployment of the military establishment is the moralization of foreign policy and military ventures that has been deeply ingrained, especially in the minds of presidents, for a long time. Although it was Woodrow Wilson who, by virtue of a charismatic presence and a boundless moral fervor, gave firm and lasting foundation to American moralism, it was not unknown earlier in our history. The staying power of the Puritan image of America as a “city upon a hill” was considerable throughout the eighteenth and nineteenth centuries. America the Redeemer Nation was very much a presence in the minds of a great many Americans. American “exceptionalism” began in the conviction that God had created one truly free and democratic nation on earth and that it was to the best interests of all other nations to study America and learn from her. Even the conservative and essentially noninterventionist President Taft, in 1912, sent a detachment of marines into Nicaragua with instructions to announce to the Nicaraguan government that “The United States has a moral mandate to exert its influence for the general peace in Central America which is seriously menaced. . . . America’s purpose is to foster true constitutional government and free elections.”
But Taft’s message was as nothing in the light of the kind of foreign policy and military ventures that began under Woodrow Wilson in the Great War—or, if it didn’t begin under him, it was enlarged, diffused, and effectively made permanent. Ever since Wilson, with only rarest exceptions, American foreign policy has been tuned not to national interest but to national morality. In some degree morality has crept into rationalization of war in European countries too, but some responsibility for that has to be borne first by Wilson, then by Franklin Roosevelt, each of whom tirelessly preached the American Creed to such Old World types as Lloyd George, Clemenceau, and then Churchill in the Second World War. Those three, and many others, had thought that each of the two world wars was fought for national reasons, specifically to protect against German aggressiveness and then destroy it. Not so, chorused Wilson and Roosevelt, the first of whom composed the Fourteen Points, the second the Four Freedoms and then as encore the Atlantic Charter; and much of America has been singing those notes ever since.
Woodrow Wilson is without question the key mind; Roosevelt was simply a Wilsonian without the charismatic will and absolute power of mind that Wilson had. One thinks here of Karl Marx when someone reminded him that Hegel had opined that history occasionally repeats its events and great personages. Yes, said Marx, the first time as tragedy, the second as farce. Wilson was pure tragedy, Roosevelt farce. Wilson sought to invoke all the powers of his Calvinist god and his beloved city upon a hill, the United States of America, in order to bring about a world assembly, the League of Nations, that would realize for the entire planet the sweetness and light of America. This he sought, preached, and died for. Roosevelt, with much the same dream, spent World War II in pursuit of Josef Stalin, convinced that he, FDR, could smooth out the wrinkles in Uncle Joe, spruce him up, and make a New York Democrat out of him. That was farce—one we haven’t escaped even yet.
Wilson above any other figure is the patriarch of American foreign policy moralism and interventionism. Churchill wrote, in his The World Crisis shortly after the Great War, that to Wilson alone had to go credit for America’s entry into that war; everything depended “upon the workings of this man’s mind and spirit to the exclusion of almost every other factor. . . . He played a part in the fate of nations incomparably more direct and personal than any other man.”
At first Wilson fought and bled for neutrality in the war, for an America “too proud to fight” in the nasty imperialist wars of the Old World. He believed, and said to his intimates, that England and France were basically as guilty as Germany of crimes against humanity. But sometime in 1916 Wilson began to brood over his neutrality policy and to wonder if it was, in the end, the best means of putting America on the world stage as the city upon a hill needing only the eyes of all peoples on it to reform the world. Reform was the iron essence of Wilson’s approach to the world. Born Calvinist, with a deep sense of sin and wickedness, and of the necessity of living by God’s grace, and the necessity too of preaching and ministering this grace to the multitude, Wilson gradually transferred the content, but not the fire, of his faith to the American republic. His book The State enables us to see how in his mind the true church for him had become not the historic church, the institutional church, but rather the state—provided, of course, that it was permeated by virtue, goodness, and redemptiveness.
The passion and wholeness of his desire to reform and to redeem can be seen first at Princeton where as president he put Princeton “in the nation’s service.” When he decided to reform the eating clubs, thus dividing university and trustees into bitter camps, he likened his work to that of the Redeemer in the cause of humanity; he did much the same thing when a little later he and Graduate Dean West were opposed as to where exactly to locate the new graduate school at Princeton. Virtually everything he touched became instantly transformed into an Armageddon. As president of Princeton, as governor of New Jersey for two years, and finally as president of the United States, Wilson burned and burned as moralist, seeing crises where others saw only problems, and endowing even his dispatch of American troops into Mexico, in retaliation for Mexican bandit crossings of the border, with a mighty purpose that would benefit all mankind.
World war was thus cut out for a mind of Wilson’s passionate moralism. What he and America did had to be eternally right, before mankind and God. He had been appointed by God to serve the blessed American republic and to determine what was right in the war. His final decision, which germinated all through 1916, the year of his reelection under the banner of “He kept us out of the war,” and came to thundering expression in early 1917, was that neutrality must be scrapped for intervention. He had been right in his policy of neutrality but the world and the war had changed; and now he must, with equal godliness and righteousness, do the very opposite—that is, plead with heart and soul for immediate American intervention.
Objectively prophets and fanatics change from time to time in their views of man and the world. Subjectively, however, they never change. Always the motivating principle in their lives is the same from year to year, decade to decade. It is only appearance, ever-deceptive appearance, that creates the illusion of change in the great man. Those close to Wilson learned within days of his conversion to intervention, often learned the hard way, never to speak to the President of anything that implied in the slightest that he had ever been other than a dedicated interventionist.
Actually, as Lord Devlin has stressed in his biography of Wilson, the President was in fact interventionist at heart from the very beginning; but he curbed his interventionism until the war and the international scene were just right. Devlin writes:
The Allies did not [Wilson believed] genuinely care about democracy and the right to self-government. He did; and he could proclaim his faith as they had not truly and sincerely done. In his mind it was then and not before, that the war to rid the world of tyranny and injustice really began. What America touched she made holy (emphasis added).
Thus the birth of twentieth-century moralism in foreign policy and war. From Wilson’s day to ours the embedded purpose—sometimes articulated in words, more often not—of American foreign policy, under Democrats and Republicans alike oftentimes, has boiled down to America-on-a-Permanent-Mission; a mission to make the rest of the world a little more like America the Beautiful. Plant a little “democracy” here and tomorrow a little “liberalism” there, not hesitating once in a while to add a pinch of American-style social democracy.
Even before Wilson’s earthshaking conversion from neutralism to intervention in early 1917, his moralism in foreign policy had been displayed to the world. Certain internal political troubles in Mexico attracted his mind and that of his populist-agrarian-pacifist secretary of state William Jennings Bryan. In 1913 the President and his secretary decided to move in. Wilson had the same dislike of professionals, diplomats, and lawyers that Roosevelt, Kennedy, Johnson, and Reagan would all have, each convinced that he by himself made the best and most enlightened foreign policy. Wilson recalled, for no given reason, his own ambassador to Mexico, immediately replacing him with a friend and former midwestern governor, John Lind. Before Lind left for Mexico, he was given a letter, written by the President himself to guide the new and inexperienced ambassador. Ambassador Lind was to make it clear from the start that the United States was not as other governments were. Never!
The letter informed Lind that the whole world expected America to act as Mexico’s nearest friend; America was to counsel Mexico for its own good; indeed America would feel itself discredited if it had any selfish and ulterior purpose. In conclusion Mr. Lind was to inquire whether Mexico could give the civilized world a satisfactory reason for rejecting our good offices. Not surprisingly, the Mexican government repudiated, flouted, Wilson’s great act of charity. Even when the United States, again not surprisingly, backed up its moral advice with offer of a loan, it too was rudely rejected. Wilson first adopted an air of patience, but that was soon followed by his demand that the president of Mexico step down from office. The United States, Wilson said, would “employ such means as may be necessary to secure this result.” Then, in words heard around the world, Woodrow the Redeemer said: “I am going to teach the South American republics to elect good men.”
There is no need to detail what happened thereafter, first at Tampico, then at Veracruz, citing American gospel all the way: pretending to deepest wounding of American dignity in a minuscule contretemps at Tampico, then sending in several thousand naval troops at Veracruz, who inevitably met some resistance and, under orders, responded with rifles and guns, causing about three hundred Mexican dead and wounded, with fewer than a hundred American casualties, then confusedly retiring from the scene and leaving a distraught President Wilson ready to collapse in the arms of any international mediating tribunal—which he did in May 1914.
He had been blooded, though, as it were, and it was probably ineluctable that after due waiting, he would advance moralistically once again in a year or two, this time on the world stage. What America touches she makes holy. This was Wilson’s adaptation of Christian blessedness to American foreign policy. He had to teach South American governments to elect good men. This earned the United States lasting impotence, save when force has been used, in all of Latin America. Next it became necessary to teach, through our intervention in the Great War, England, France, and the rest of Europe what true democracy and freedom were and how they were best seeded for posterity in all countries, great and small. Thus the birth of what shortly became known as Wilsonian idealism and became in oppressive fact American moralism abroad.
It is no wonder that Wilsonian moralism took hold of substantial segments of the American population. A whole generation of burgeoning political leaders, mostly in the East, was nurtured by Wilsonianism; they were in large part products of old wealth, of private schools and Ivy League universities, able to give ample time to international matters. Roosevelt was emphatically one of this generation, the more so perhaps in that he had served as assistant secretary of the navy under Wilson, had known him, had touched him, had had apostle’s propinquity.
When World War II broke out in Europe, Roosevelt followed almost compulsively, as it seemed, the Wilson model. First neutrality, but in bemused contemplation of America’s relation to the world. What America touched she made holy. It was vital therefore for her to proceed carefully. Roosevelt came to an earlier decision than Wilson had in his war; and that decision was, like Wilson’s, one of intervention as soon as Congress could be persuaded to declare war. But in the meantime there was much that could be done in the way of Lend-Lease and, most especially, vital speeches and conferences in which the war’s true purpose was given Rooseveltian eloquence. Thus the Four Freedoms speech before Congress in January 1941; then the Atlantic Charter conference with Churchill in August. Since the charter anticipated alliance with Stalin and the Soviet Union, which had only just been brought into the war against Hitler by virtue of the German invasion, the earlier Four Freedoms had to be cut to Two Freedoms in the charter. After all, Stalin’s Russia was deficient, embarrassingly so, in freedoms.
Roosevelt had one, and only one, serious reason for taking the United States into the European war, a feat made possible in the end solely by Germany’s declaration of war on the United States. That reason was the Wilson-derived mission of cleaning up the world after the war was won. Now comes the element of farce in Roosevelt that was lacking in Wilson. In Roosevelt’s mind Wilson had lacked a true partner, some nation altogether free of wicked imperialism that the United States could properly, morally, work with. Britain, France, and most of the rest of Western Europe were excluded. All had indulged in imperialism. There was, however, one country that by its very nature was free of imperialism. That was Stalin’s Communist Russia. He, Roosevelt, would work with Stalin during the war, acquiring his trust, perhaps civilizing him and thus Russia a little bit, and then forming a great American-Soviet partnership after the war to superintend the United Nations. All imperialism would be wiped out, all peoples, large or small, endowed with representative institutions, with human rights, and peace and democracy would be insured worldwide.
Roosevelt, like Wilson, lived just long enough to see the bitter fruits of his trust. The ink was hardly dry on the Yalta treaties and manifestoes when Stalin commenced flouting every one of the pieties and moralisms he had readily agreed to at Yalta. Yalta didn’t give him Eastern Europe; his armies had already done that. What it gave Stalin was a sanctimonious imprimatur on the “democracy” and “freedom” and “free elections” the Soviets were imposing upon the subjugated Balkan and Baltic Europeans, together with Poland. Tragedy? No, farce: Can anything in political history be more farcical than an American president putting his trust in a dictator whose hands were bloodied forever by the millions he had starved, tortured, shot, and frozen in Siberia? Whose sworn purpose, inherited from Lenin, was the propagation of Communist revolution throughout the world? Who was openly contemptuous of Roosevelt, actually seeming to enjoy the company of the out-and-out imperialist—and longtime Communist foe—Churchill? Who made no bones about reducing not only Eastern but Western Europe—Britain and France foremost—to Third World status? It was Wilsonian moralism, albeit somewhat debased, that drove Roosevelt to his mission respecting the Soviet Union. He believed as ardently as Wilson had that What America Touches She Makes Holy.
Today, forty years later, moralism continues to inflame American foreign policy, Ronald Reagan being the devoutest successor thus far to Wilsonianism as interpreted by Roosevelt. He too loves to divide the world into the Good and the Evil, and to define American foreign policy as relentless punishment of the Evil by the Good—led by America. He too sees every Nicaragua, every Lebanon, Iran, Persian Gulf, and Grenada as a little bit of Armageddon, with all means justified by purity of mind.
And conceivably bankrupt. If our foreign policy were one of protecting our national security and looking out for the preservation of our political nationhood and general well-being, from time to time doing what little good for others our capacities permitted, we would not require a six-hundred-ship navy, one bulging with supercarriers, battleships, and weaponry better suited to the now historic battles of Jutland in World War I and Midway in World War II than to defense of ourselves against Soviet aggression. General de Gaulle correctly referred to “America’s itch to intervene.”
When we intervene the act is almost compulsively cloaked, even as Wilson’s acts were, in rhetoric of pious universalism. We use our variants of Kant’s categorical imperative in international affairs. We must always explain that behind our intervention lies the imperative of moral goodness—nothing less. For so simple, practical, and obviously necessary a thing as our quick aid to Turkey and Greece immediately after World War II, at England’s request, a Kantian rhetoric had to be devised: that our action sprang from our resolute insistence that freedom will be supported everywhere in the world.
A few years later, in 1961, President Kennedy gladdened the hearts of all political moralists in America with his vow that we would “pay any price, bear any burden, meet any hardship . . . to assure the survival and the success of liberty.” And so we have. Less apocalyptically Jimmy Carter as president in the late 1970s declared that “a nation’s domestic and foreign policies should be derived from the same standards of ethics, honesty and morality which are characteristic of the individual citizens of the nation. . . . There is only one nation in the world which is capable of true leadership among the community of nations and that is the United States of America.”
Such language would surely arouse the mingled concern and amusement of the Framers. It was a constitution for one nation that they designed, not one for the prosecution in all parts of the world of the native values of the thirteen colonies. There is none of the world-saving rhetoric to be found in our constitution that would be found a decade later in the successive constitutions of the French Revolution. Treatment of the armed forces is spare indeed in the American constitution, and it is oriented austerely to “the common defence.” The purpose of the whole document is that of establishing “a more perfect union,” not that of spreading America’s sweetness and light to the needy world. Nor is there hint of worldwide soul-saving in The Federalist. The closest to a treatment of America and the world is Federalist No. 2 by John Jay, and it is directed solely to the necessity of protecting American riches from “Foreign Powers.”
George Kennan is the most notable of living Americans to understand the purpose of a foreign policy in our time. In 1948 he argued that we should stop putting ourselves in the position of “being our brothers’ keeper and refrain from offering moral and ideological advice.” More recently he has said that American interventions in the world can be justified only if the practices against which they are directed are “seriously injurious to our interest, rather than to our sensibilities.” Too many of our foreign interventions, Kennan declares, have served “not only the moral deficiencies of others” but “the positive morality of ourselves.” It is seen as the moral duty of the United States “to detect these lapses on the part of others, to denounce them before the world,” and even to assure “that they were corrected.” How often, Kennan also notes acerbically, the purported moral conscience of the United States turns out to be a set of moralisms limited in fact to a single faction or special interest group. That American foreign policy should be framed within the borders of morality, Kennan does not doubt. Americans have the right to see to it that the government never acts immorally abroad or at home. But it is a far cry from eschewing the immoral and locating the bounds of morality to the kind of assertions just cited from Wilson, Roosevelt, Kennedy, and Carter.
South African apartheid is indeed a repugnant system—as is the system of one or other kind found in a large number of contemporary governments on the planet. We should and do wish apartheid an early disappearance, as we do the repressive practices of the Soviets and their satellite governments. But on what basis does the United States attack apartheid? The gods must have been convulsed when, under the heavy pressure of black organizations and student bodies across America, our government was pressed into service for disinvestment and, if possible, sanctions and even a blockading of South African ports. The United States of America, Mrs. Grundy herself, overbearingly teaching another people how to be decent to blacks? America was among the very last civilized countries to abolish out-and-out black slavery—and this only by Lincoln’s agonizing change of mind on the subject and use of war powers—and then put the millions of freed blacks in a state of unspeakable segregation—a type of segregation more punishing in many respects than what exists in South Africa, a segregation that finally began to be broken only in the 1960s, in a crusade for civil rights that barely missed being a revolution, a full century after emancipation from legal slavery.
There is another form of blindness to reality that can and often does spring from minds beset by moralism and ideology. This is likely to be present more often in the political Right than the Left. It is well illustrated by the fever of “world Communism” that came over right-wing groups in this country in the late 1940s. Everything unpleasant that happened in the world, whether in Egypt, Kerala, or China, was believed to be part of a world conspiracy hatched by the Kremlin. When, in the late 1950s, there were unmistakable signs of a growing rift between Communist China and Communist Russia, the official position of the United States, a position largely initiated by the Right, was for some time that no rift existed, that Mao’s China was a Soviet pawn.
Those who knew their Chinese-Russian history were not at all inclined to doubt the existence of growing hostility between Mao and the Kremlin, for hostility between the two empires, Russian and Chinese, went back several centuries and had not infrequently broken out in fierce fighting. It was the Chinese who coined the name “Great Bear” for the Russian empire. Granted that Mao was a Communist as were Stalin and his successors. Only eyeless ideology could have prevented leading American figures in and out of government from recognizing that just as capitalist nations can engage in bitter warfare with one another, so, obviously, can and will Communist nations. We might have been alerted by the early disaffection after World War II between Russia and Yugoslavia—so confidently but ridiculously denoted as a Russian pawn by our moralist-ideologists in the beginning—and then Albania. Historical, geopolitical, and fundamental military-strategic considerations will always triumph over purely ideological alliances, unless of course one nation has been taken over by cretins, which has assuredly not been the case with either China or Russia.
Moralists from the Right, blinded by their private picture of “world Communism,” fail to see the undying persistence in the world of the nation-state, be it capitalist or communist. Nationalism has spawned more wars than religion—and Communism is a latter-day religion—ever has or ever will. All the while Stalin was bending, rending, torturing, and terrorizing, always shaping Russia into an aggressive military nation, with Marxism-Leninism its established religion, our right-wing moralistic ideologists in this country were seeing stereotypes, pictures in their heads, of the defunct Trotskyist dream of Russia as not a nation but a vast spiritual force leading all mankind to perdition.
This kind of moralism is still a menace to our foreign policy. It is the mentality that converts every incident in the world into an enormously shrewd, calculated operation by the KGB. To sweep every North-South happening into an East-West framework is the preoccupation of the Right—religious and secular. So, too, was it the preoccupation of the Right when for years, all evidence notwithstanding, it insisted that because Russia and China were both officially Communist, they had to be one in faith, hope, and destiny. Richard Nixon was and is no ideologue; neither is Henry Kissinger. The result? Our celebrated entry into China and what now appears to be a very genuine thawing of Communist orthodoxy.
Vigilance is a cardinal virtue in international affairs. But when it hardens into an unblinking stare off into the horizon, a great deal in the vital foreground is overlooked. The plainest trend in the world since the death of Stalin is the gradual, halting, often spastic, movement of the Soviet Union from its iron age to something that, while not yet entirely clear, is a long way removed from the Russia that under Stalin in 1945 very seriously contemplated a European sphere of interest that included Western as well as Eastern Europe. It is entirely likely that only the atom bomb, then in the exclusive possession of the United States, posed a threat serious enough to dissuade Stalin. After that came the Marshall Plan and then NATO, and the Stalinist dream of suzerainty over Western Europe collapsed along with the Stalinist reality of permanent terror over the entire Russian people.
The Soviet Union remains an enigma. It remains also a dangerous adversary in the world, the one other superpower. It bears American watching, and American military preparation is necessary for any of several possible threats. But to pretend that the Russia of Gorbachev is still, just under the skin, the Russia of Josef Stalin is as nonsensical as was the inflexible belief in some quarters back in the 1950s that Maoist China was a willing pawn of the Soviet Union—or the still earlier dogmatism that insisted long after the fact that Tito’s Yugoslavia was but a Stalinist plaything. I take some pleasure in citing some words I wrote more than a quarter of a century ago:
When I am told that Russia—or China—is dangerous to the United States and to the free world, I can understand this and agree. When it is suggested that the United States should suspend nuclear testing, as an example to the rest of the world, I can understand this and emphatically disagree. But when I am told that the real danger to the United States is something called “world Communism” and that our foreign policy must begin with a “true understanding” of the moral nature of Communism, and not rest until Communism has been stamped out everywhere, I am lost. Meaning has fled into a morass of irrelevancies, half-truths, and apocalyptic symbols.*
No nation in history has ever managed permanent war and a permanent military Leviathan at its heart and been able to maintain a truly representative character. The transformation of the Roman Republic into the dictatorial empire was accomplished solely through war and the military. Is the United States somehow the divinely created exception to this ubiquitous fact of world history? Not, assuredly, if instead of a foreign policy based upon national security and finite objectives associated with this security, we indulge ourselves in a foreign policy with an “itch to intervene,” and a purpose flowing out of the preposterous fantasy of a world recreated in the image and likeness of that city on a hill known as the United States of America. That way lies total confusion abroad and an ever more monolithic and absolute military bureaucracy at home.
* Commentary, September 1961, pp. 202–3.