Ask an American of even above-average intelligence what happened in 1964, and the predictable answer would be “Beatlemania” (although the politically sensitive conservative might cite the stirring defeat of Barry Goldwater in that year’s presidential race). But what if it could be shown that 1964 was actually the most important year in all of human existence, as measured in two very different ways: first, in terms of geologic time; and second, in terms of the history of Western Civilization?

I suggest that this can be shown.

Regarding the first measure, I refer to the article “Defining the Anthropocene,” by English geographers Simon L. Lewis and Mark A. Maslin, which appeared in the 12 March 2015 issue of Nature (regarded by many as the world’s preeminent scientific journal). For over a century, a few iconoclastic geologists have suggested that human activity has so altered the global environment as to constitute a new geologic age. Most scientists still see the earth and its biota (or living things, including us) existing in the Holocene Epoch (which started about 11,000 years ago) of the Quaternary Period (or the Great Ice Age, beginning about 2.6 million years ago). The renegades think they see an Anthropocene, or Human-Shaped, Age appearing in recent decades, one distinct from both the Quaternary and its subsidiary Holocene.

Defining a geological time unit requires that certain formal criteria be met. It must be global in extent; it must be measurably recorded in geological stratigraphic material, such as rock, glacial ice, or marine sediments; and it must involve fairly dramatic alterations in the earth’s atmosphere, oceans, and biota. Moreover, a new geologic time unit requires a “golden spike”: a “single physical manifestation” of change that marks or symbolizes the radically new epoch. The classic example is the iridium layer found in rock formations around the globe that are 66 million years old: the marker of a giant asteroid that struck the earth (in the present-day Gulf of Mexico), leaving all dinosaurs, except the birds, extinct and fixing the divide between the Cretaceous and Paleogene Periods.

Signs of sweeping global environmental change do seem to be mounting. For example, humans have accelerated the “Haber-Bosch” process, which converts atmospheric nitrogen into ammonia (useful as fertilizer), to an extent not seen for something like 2.5 billion years. Human actions have released 555 billion metric tons of carbon (formerly stored in coal, oil, and natural gas) into the atmosphere since 1750, raising atmospheric CO2 to a level not seen for at least 800,000 years, and perhaps much longer. Human behaviors have accelerated the extinction of other species by 100 to 1,000 times beyond the “background rate,” which may mark “the beginning of the sixth mass extinction in Earth’s history.” Meanwhile, humans have transported plant and animal organisms among the continents since 1492, “a global homogenization of Earth’s biota” without precedent.

But what of a “golden spike,” recorded in the earth’s surface and necessary to the dating of any Anthropocene Age? Lewis and Maslin offer two temporal possibilities:

  • 1610, when atmospheric CO2 levels, as found in Arctic ice core samples, reached a pre-industrial low of 271.8 parts per million and then began to rise; or
  • 1964, when Carbon 14 isotope levels reached an “unambiguously global” spike in soils and marine sediments. (Carbon 14 is a residue of atomic bombs, arguably the ultimate “tool,” the apex of what the authors call the “Great Acceleration” in human activity; hundreds of nuclear tests occurred in the atmosphere between 1945 and 1963, when a test-ban treaty went into effect.)

For dull and pedantic reasons, the two scientists somewhat favor the year 1610 as the dawn of the Anthropocene Age. I strongly favor the later date, for it perfectly conforms to the book that I have long planned to (but probably never will) write — 1964: The Year Western Civilization Died.

What evidence would I cite for this rash claim? My argument would go like this: Western Civilization — as a fused product of Hebraic, Greek, Roman, Christian, and Germanic influences — was still alive, and in modestly good health, in 1914. Then the empires of Europe entered into the fratricidal lunacy known as the Great War. This conflagration consumed the best and brightest men (in civilizational terms) of each combatant. When the erstwhile “winners” tried to put their civilization back together after 1918, they botched the task. Their failure unleashed the monstrous tides of Bolshevism, Fascism, and Nazism; the defeat of the latter two required a second and much more terrible war.

This time, though, the victors in Western Europe produced a more promising agenda for restoring Western Civilization. Called “Christian Democracy,” this movement held that the full flowering of the individual only came through his participation in social bodies such as church, local community, and — in particular — the family. As economist Wilhelm Roepke explained: “It is surely the mark of a sound society that the center of gravity of decision and responsibility lies midway between the two extremes of individual and state, within genuine and small communities, of which the most indispensable, primary, and natural is the family.”

With leaders such as Robert Schuman in France, Konrad Adenauer in West Germany, and Alcide De Gasperi in Italy, genuine Christian Democratic parties came to power. They worked to return control of education to parents, grant special protection to motherhood and childhood, and deliver “family wages” to fathers so that mothers might be empowered to remain home with their children.

For nearly two decades, this project succeeded. Families stabilized, birth rates rose, and moral order seemed to return to Western Europe. Even the initial moves toward European union were motivated by a genuine desire to create a 20th Century version of the Holy Roman Empire.

Meanwhile, in the United States, Canada, Australia, and New Zealand, similar developments produced similar results: marriage booms; baby booms; swelling church memberships; and other fresh expressions of ordered liberty.

Then came 1964. In his book The Silent Revolution, sociologist Ronald Inglehart sees in that very year a decisive turn among Europeans away from Christian-inspired political values, toward the libertine left. Demographer Ron Lesthaeghe likewise sees in these same years a strong shift away from values affirmed by Christian teaching (“responsibility, sacrifice, altruism, and sanctity of long-term commitments”) toward “secular individualism” focused on the desires of the self. A third figure, historian Emiel Lamberts, traces the emergence of “naked individualism and unbridled libertinism” in Europe to this same year.

In the United States, meanwhile, 1964 stands out as the first year of the infamous “Sixties,” when the moral order of the post-war era collapsed. The U.S. birth rate fell sharply that year, bringing an end to the Baby Boom, and serving as a statistical marker of a deeper transformation. The following twelve years witnessed open attacks on Christianity, the rapid spread of pornography, new demands for easy divorce, a swelling feminist rhetoric demanding androgyny and “reproductive rights,” steps toward easy abortion, and startling new efforts to manipulate early human life.

As the crude but insightful “Rape of the APE: The Official History of the Sex Revolution” by Allan Sherman (from the Playboy Press, 1973; APE meaning American Puritan Ethic) described this turn: “Everything got devalued. Not just the dollar, but everything in American life. The American Flag was devalued. Marriage was devalued. Virginity. Love. God. Motherhood. Mom’s Apple Pie. General Motors has less value now, and so does the Bill of Rights. War was devalued, and so was the air we breathe. The quality of men available to lead was devalued. Our technology was devalued; our institutions and our customs were devalued. The worth of an individual was devalued.”

In this way (I would argue in my book), and with a last gasp, Western Civilization came to an end. The next fifty years would be, by and large, a mopping up operation by the victorious moral and sexual revolutionaries. Remnants of the old civilization survive, albeit with fading influence and prospects.

Was this temporal congruence of two monumental changes in human history, measured in vastly different ways, merely a coincidence, a sort of cosmic joke? I think not. There is a common theme: simply put, the rejection of limits.

Relative to the emergence of the Anthropocene in 1964, the human obligations to be good stewards of the earth, to understand our place as part of the Creation, to respect other living things, and to meet our material needs in sustainable ways have been overwhelmed by greed and hubris. The nuclear bomb, our geologic signature on the earth’s surface, perfectly embodies these qualities. Relative to the collapse of Western Civilization in 1964, the human obligations to live within an inherited moral code and to honor both ancestors and posterity have been overwhelmed by nihilism, profligacy, and lust. The STD epidemic which the contemporary “West” has bequeathed to the rest of the world is a perfect symbol of this change.

Of course, the refusal to respect limits is the oldest of human failings. As Genesis records, the first human beings found it impossible to accept even a single restriction on their use and enjoyment of paradise. In this sense, perhaps, the human narrative has come full circle.



