#137: The Enlightenment Revolution and its Grandchildren

November 4, 2017


I’ve mentioned in a number of earlier posts that it is the inescapable destiny of popular, democratic revolutions to be successfully hijacked by tyrannical internal forces. The usual suspects come to mind, of course: the French Revolution, the Russian Revolution, the Chinese Revolution, the German Revolution. I don’t mention the American Revolution since, as I have previously argued, it was not, strictly speaking, a “revolution” at all, rather a case of colonial irredentism.

I’m returning to this theme because I’ve been reading Peter Gay’s The Enlightenment. The reading chair in my tiny home office is right next to a couple of bookcases holding the rather sad remainder of what was once a pleasantly large library. But still, being retired allows me the luxury of reading whatever strikes my fancy. Settling down in my reading chair to wallow in a trashy mystery, I noticed the Gay book, which I didn’t remember reading but which looked like an enticing read. Which it has turned out to be, complete with old notes, which I don’t remember making.

Gay is a superbly well-read enthusiast of the Enlightenment, and he attempts to be fair to the many criticisms that have been leveled against it during both the 19th and 20th centuries. Nonetheless, he fails to appreciate that the Enlightenment was, in fact, a single revolution taking place over the space of 200 years that culminated in the murderous paroxysm that began in 1789. This is all the more odd since he paints the philosophes as perceiving themselves as (drum roll) revolutionaries. More specifically, they saw themselves as being at war with religion (which they branded “superstition”) and the various cultures from which they themselves had emerged. So, how is it that we have someone like Gay showing himself such a fan of the philosophes when they are the precursors of the Terror?

Now, if the principle with which I began is true (and I have seen no counter-examples), then Gay should have been on the lookout for signs of the totalitarianism which seems the inevitable legacy of revolution. To be honest, I haven’t finished the book yet, so he might still get to that.


But there’s no mystery here. The 17th and 18th century Enlightenment was the incubator of modern totalitarian Socialism (aka “communism”). And what lies at the heart of all these totalitarian take-overs is a forced detachment from cultural/historical antecedents, the erasure of the ancien regime, a forced move from a despised culture to no culture at all.

Edmund Wilson, in his history of Socialism (“To the Finland Station”), traces Socialism back to the French Revolution, where the principle I’m proposing had already made itself felt in the Jacobins. But the theoretical underpinnings of that revolution did not emerge suddenly as from the head of Zeus, nor had they been churning unperceived beneath the European surface. The French Revolution’s theoretical underpinnings had been proposed, debated, evolved, and refined during the two preceding centuries. The Jacobins stood on the shoulders of the philosophes, who were the early generals and strategists of the Enlightenment Revolution.

The scientists and intellectual dilettantes of the French 18th century whom Gay so admires can easily be enjoyed for their wit, their creativity, even their eccentricities. We forgive them much, at least partly because we’ve been taught to do so. We smile at Voltaire’s excesses just as so many smile at Bill Clinton’s. They’re charming, glib bad boys, and just oh so clever. But we wouldn’t be so happy to give them a pass if we reviewed their antics within the shadow of the guillotine. Reading Gay helps make it clear why we don’t notice Robespierre’s face peering out from the background of Gay’s cheerful group photo of the philosophes. It is because the philosophes had not yet taken that fatal revolutionary step right out of history. Yes, they sought to detach themselves from their Christian heritage, but like so many moving into divorce, they had another lover waiting in the wings. The philosophes divorced Christianity, but they were able to do so because they had the writers of classical antiquity ready to take its place. The Jacobins, on the other hand, divorced all of their history and culture, and it is this that finally made the Terror possible. It is this which distinguishes the philosophes from the vicious and murderous ideologues of 1789 and which obscures the direct line of descent leading to them.

And yes, it can seem a small price to pay, that of having a bad boy, to have someone very clever at the helm of government. But we must ask ourselves what the consequences will be of increasingly staffing our universities, our courts, and our government institutions with such charming, glib bad boys. Bad boys who hate the culture which nurtured them, on which they feed, and from which they emerge.

The EU is one grandchild of the Enlightenment Revolution. How’s that been working?

The American Democratic Party is another grandchild of the Enlightenment Revolution. How’s that been working?

The Enlightenment Revolution was an irresponsible parent (what else would we expect?), so there are yet other grandchildren mucking about, all illegitimate and all still waiting for parental support which will never come. Revolutions lead to tyrannies, which lead to slavery, poverty, and starvation for their citizens.


But while bashing the Enlightenment may be a necessary antidote to its excesses, one can hardly argue against its one utterly overwhelming benefit: Science. And by “science,” I mean the “natural” sciences, not the jumped-up pseudo-disciplines with the borrowed authority of statistics, i.e., the “social sciences.” Lest someone doubt me here, let me indicate that a recent study by a social scientist claimed to demonstrate that around 75% of “published studies” in social psychology and cognitive psychology are not replicable. I wonder if his own study was replicable and whether he sensed some irony here. I tend to think the situation is likely worse for sociology and whatever other invented disciplines are littering the university scene.

Yes, science and math have made astonishing progress and are accelerating at an apparently exponential rate. This is fabulous and must be acknowledged and admired (which I do).

The problem arises only when, as is inevitable, the methods which have proven powerful and successful in one arena are aggressively applied in others where they simply do not work.

There is an enormous amount of money now involved in natural science, and natural science has carried tremendous prestige for a long time.

A situation such as this is like the smell of fresh meat to a scavenger.

The natural sciences have attracted wanna-be disciplines looking to cash in on the natural sciences’ successes. They want to create departments, to increase their staffing and their budgets. Hence they tart up their names by adding “science.” We now have political “science.” Really? Maybe we’ll eventually have philosophical “science,” as well. And we wind up with idiocies like departments of “theory.” Shouldn’t that have been “theoretical science”?

But, worse, the prestige of the natural sciences attracts political and social scavengers who actually burrow into the periphery of real natural science from which position they attempt to affect public policy. Their “scientific” camouflage is the use of computer models and simulations. That’s pretty “scientific,” right? How well has this “scientific” method worked in, say, economics? But we should accept its hysterical results in the Global Warming scam. Right. The social “sciences” per se could be thought of as harmless enough wastes of public money, and if they were satisfied to remain that, I suppose we could tolerate them on campuses. But sadly they have become the instruments of governments and other enormous interests to manipulate public opinion into supporting policies completely incompatible with common sense. We should be very wary of these pseudo-sciences.

This all means that once science became rich and successful, it became a target for parasites. Should come as no surprise. See what happens if you win $50 million.

Now, these are the dishonest attendants on wealth and success, but there is one other that seems to be built right into human nature.

It is that people are naturally inclined to extend a new method that has proven successful beyond its place of discovery and to think of it as the final solution for all human problems. Descartes, for example, invented analytic geometry, an extraordinary achievement. Since he was, among other things, one of the earliest physicists, he immediately thought he could use it to represent motion. Unfortunately, he was wrong in this; the representation of motion had to await the invention of calculus, independently, by Newton and Leibniz. This impulse to extend the method is also behind the attempt to approach non-quantitative empirical problems and questions with “science.” It was certainly present among the 18th century philosophes.

Those whose imaginations are totally in the grip of scientific method often feel compelled to despise the organically grown culture within which they live. This was largely true of the philosophes and it is largely true of many modern intellectuals, e.g. Richard Dawkins or Christopher Hitchens.

The really difficult thing to do is to accept the amazing progress and discoveries of natural science while accepting the universe of inherited myths and values within which we live.

A prudent first step in this is to reject all attempts at persuasion on moral matters based on “social science studies.”


#136: Feeling the Old Age Paradox

October 27, 2017

If one is at, say, one’s 40th birthday, then one is not old.

No one would say that one is. Nor would one be old the next day. Or the day after. The reason is that one’s age status doesn’t change on the basis of one day more or less. However, if one adds, say, 12,784 days, making one 75, then one is old. But 12,784 days are just that many individual days, no one of which is capable of changing one’s age status. The big group of days can’t have a property not owned by any of its constituents.

This means that at age 75 one has a property that one didn’t have when one was 40 and never acquired since then. How is this possible? It would seem that either we were old at 40 or we are not old at 75. We can’t have oldness at 75 without having acquired it at some point.

This is a paradox. Call it the “Old Age Paradox.” A paradox occurs when two equally compelling beliefs are incompatible. Most commonly, paradoxes involve an incompatibility between a belief of which one is intuitively certain and a belief which is the conclusion of some apparently sound argument. What makes it a paradox is that it seems that one must abandon one of the beliefs, but one is still equally committed to both. Psychologists call this state of mind “cognitive dissonance.” Students of philosophy will recognize the Old Age Paradox as an instance of the “Heap” or “Sorites” paradox.
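For anyone who likes to see the machinery, the day-by-day induction can be sketched in a few lines of Python. This is only a toy model: the day count and the cutoff value are illustrative figures chosen for the sketch, not anything in the argument itself.

```python
# Toy model of the Old Age Paradox (the Sorites, run out in days).
# Premise 1: on one's 40th birthday one is not old.
# Premise 2: one more day never changes one's age status.
DAYS_40_TO_75 = 12784  # roughly 35 years of days; the exact figure is illustrative

not_old = 0  # days elapsed since the 40th birthday, all of them spent "not old"
for _ in range(DAYS_40_TO_75):
    not_old += 1  # by Premise 2, still not old after one more day

# The induction concludes that one is "not old" at 75 -- plainly absurd.
print(not_old)  # 12784

# Conversely, any sharp cutoff contradicts Premise 2:
# exactly one day flips the status.
def is_old(age_in_days, cutoff=10000):  # cutoff is arbitrary, for illustration
    return age_in_days >= cutoff

assert not is_old(9999) and is_old(10000)
```

The two halves display the dissonance: accept a sharp cutoff and Premise 2 fails; accept Premise 2 and the induction proves no one ever gets old.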

I don’t think that this paradox is the best one for introducing paradoxes to young people; there are others in which the dissonance is much more obvious and immediate.

Yet, there is a feature to the Old Age Paradox that I don’t detect in others.

Logical paradoxes are mostly contrived; one doesn’t encounter them in the course of ordinary life. But the Old Age Paradox puzzles almost everyone who actually gets old, whether philosophically inclined or not.

Of course, most everyone who gets old bemoans that fact. But it’s not this that I’m after here; it’s that almost everyone is puzzled with respect to when it happened.

I remember my mother telling me that when she looked in a mirror (in her 70s), she couldn’t understand when it had happened. There was the usual emotional component, but there was also a distinct, identifiable cognitive component. “I was young,” she said, “and now I’m old. When did it happen? How did it happen?” She really couldn’t understand how it could have happened without her noticing that it had. Yet, it had. Her puzzlement was exactly the one found in the Old Age Paradox.

I couldn’t answer her question then, and now I find myself puzzled in exactly the same way. And I still can’t answer the question.

The Old Age Paradox may be unique in being a natural and inevitable experiential moment built right into the human condition.

#135: The Public Apology App (PAP)

October 20, 2017

As we all know, technological change often brings with it unanticipated cultural change and attendant challenges. Generally, those challenges are themselves met with novel technological advances. This is no less true for our current struggles with the new social media technology. While the social media revolution has presented new challenges, those challenges have often been overcome by new software in the form of … apps.

Like many beauty pageant contestants and political science majors, I want to make a difference. I also want to make the world a better place. In keeping with those objectives, I offer the following proposal for a clearly needed new app.

I call it the Public Apology Application (or “PAP”).

I have noticed that more and more people, particularly on Twitter, have been asked to make public apologies and recantations for their obviously heartfelt beliefs and sentiments. Almost universally, those people have complied. In older days, the public apology was restricted to large, powerful institutions, such as governments. Social media such as Twitter have thankfully democratized this much-needed societal instrument. What was once only available to large entities such as the Soviet Union under Stalin (and is still used by North Korea and China) is now finally available to the people. My app would go even further in making the task of public self-abasement easier. This is what the app would offer.

Both boiler-plate and custom apologies.

A boiler-plate apology would be something like this:

If any of my statements, actions, or habitual behaviors have unintentionally given pain or offense to anyone, I want to make it absolutely clear how very, very sorry I am that they have had to feel this way. My statements, actions, and habitual behaviors in no way represent my true beliefs or attitudes and have all been either misinterpreted, taken out of context, or were the result of the use of prescribed drugs. Or my struggles with deep personal problems or addictions. Or other matters beyond my control. Anyways, I’m really sorry. Really sorry. That they feel this way.

All apologies could, of course, be edited and the audience base adjusted. Thus, an apology could be restricted to some predefined message address database (email, messaging, Twitter accounts, etc.) or it could be addressed to the world at large.

The apologies could also be titrated for degree of personal humiliation: I am sorry, I am so sorry, I am so very sorry, my sorriness goeth beyond all belief, I am beneath contempt, OMG I am so awful, etc.
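In the spirit of the proposal, the titration feature might be sketched like so. Every name and level here is invented purely for illustration; nothing of the kind exists.

```python
# Hypothetical sketch of PAP's humiliation-titration feature.
# All identifiers and levels are invented for illustration.
HUMILIATION_LEVELS = [
    "I am sorry.",
    "I am so sorry.",
    "I am so very sorry.",
    "My sorriness goeth beyond all belief.",
    "I am beneath contempt.",
    "OMG I am so awful.",
]

def titrate_apology(level):
    """Return the apology for a given degree of self-abasement, clamped to range."""
    level = max(0, min(level, len(HUMILIATION_LEVELS) - 1))
    return HUMILIATION_LEVELS[level]

print(titrate_apology(2))   # "I am so very sorry."
print(titrate_apology(99))  # clamps to the maximum degree of self-abasement
```

The clamping is the one serious design point: no matter how extravagant the demand, the app can always supply an apology at the ceiling of contrition.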

My plan includes having the app available in both a free (with advertisements) version and a premium version (no advertisements). The premium version would also allow for larger apology scripts up to five typed pages in length along with an inventory of professionally done scripts written by ex-presidential speech writers.

I sincerely believe that this app would be a genuine contribution to the great goal of making the world a better place and am confident that if it were implemented, I would indeed have made a difference. And I apologize most sincerely if my efforts have unintentionally caused anyone any pain, discomfort, anxiety, nervous tic, digestive disorder, or unhappy thought. Sorry. Really.

#134: Kant, Lieutenant Commander Data, and the “What if…” Argument

September 12, 2017

Back in 2009 I wrote a post (#14) on the “What If Everyone Did That?” moral argument. I still stand by what I wrote then, but I want to add something important, something I clearly missed then.

The “What If” argument actually conceals some ancient and very false assumptions. These assumptions arguably go back to Plato, but achieved their modern force in the late Enlightenment through the moral writings of Immanuel Kant. Kant is a difficult read, but his foundational false belief is not difficult.

This belief is that human beings as we encounter them are actually composites of parts, only one of which is the authentically “human” part, the other parts being, in his term, heteronomous (“external” to the “true self”). The part which is the “true” us is, of course, the rational part. And the rational part, sadly, is frequently overridden by the heteronomous forces within us, e.g. lust, greed, pride, etc. Thus, when we sin, we are simply losing the battle against external forces.

Kant further argues that reason is independent of personal identity in the sense that any two people faced with the same circumstances and relying only on reason will reach identical conclusions. For Kant, all authentic people (people shorn of heteronomous influences) are in fact identical; their apparent differences reflect nothing of their true selves, but only the varying heteronomous forces at play in them.

Still further, Kant argues that moral action is nothing other than rational action.

Consequently, in an imaginary world in which heteronomous forces were not present, all men would behave morally simply by “being themselves.” There would actually be no such category as “moral” behavior in such a world, since there would be no such thing as “immoral” behavior. There would just be “behavior” (which, of course, we, from our vantage point in the real world, would judge to be “moral”).

In contemporary cultural terms, a Kantian “moral agent” would look like Lieutenant Commander Data of the Starship Enterprise. All other things being equal, Data acts rationally; this is how he has been programmed. He has no lust, no greed, no pride, no pleasure, no pain, no satisfaction, etc. If other androids of Data’s kind were manufactured, their behaviors would be indistinguishable from his.

And, according to Kant, Data automatically acts morally.

Now, if we want to simulate ordinary human activity in Data, imagine him infected by a computer virus which mimics human vices and frailties. In such a case, Data is faced with the challenge of dominating the virus’ effects in order to remain “himself” and, thereby, to act morally. That’s ostensibly the human condition according to Kant.

But even if one buys this nonsense, one is still left with the crucial question: What is it “to act rationally”? Kant answers that a rational being acts always and only in accordance with a “rule,” a generalization with regard to behavior. But which rule?

Now, one of the ways in which Kant expresses this “rule” governing the behavior of a moral agent is this:

Act only in such a way as you would have the generalization expressing your action be a law of nature.

This is often expressed as the dictum that when one acts, one acts “for all men,” and it is one version of the rule which guides the actions of any rational being. Kant calls this rule the “categorical (unconditional) imperative.”

A thought experiment capturing this notion is this: When you act, imagine yourself a God whose every act becomes a natural law such that all other men cannot help but act the way in which you did.

And there you have it! The “What If…?” argument with which we began.

Kant’s view is that a rational being cannot act in any way that is not generalizable, that is, in any way he would not have all men act. For him, this is not really a moral imperative, it is a logical imperative: acting according to the categorical imperative follows necessarily from being a rational being. So, when we ask ourselves “what if everyone acted this way,” we are asking ourselves whether a rational being would act this way, and thus whether we should act this way.

To be fair to Kant, let me stress that this is not intended to be a prudential argument, namely that “enlightened self-interest” dictates that we act only on universally generalizable rules. For him, we are not reasoning that we should not do X because if everyone did X, we might ourselves eventually be harmed by that. No; rather, the universalizability test of the categorical imperative is there to inform us as to how we would act if we were free. Being subject to heteronomous forces is being un-free. So, assuming that all men wish to be free, the universalizability test allows one to know how one would act if one were not subject to external constraints and forces. On the Kantian story, being moral is equivalent to being rational is equivalent to being free. Should we buy into this? No. Why not?

Because any close look at this Kantian story reveals it for what it is: a pathetic Enlightenment fantasy featuring the 18th century’s favorite celebrity: Reason!

Are authentically human beings only their “reasoning” parts? Is “being moral” the same as “being rational”? When we act morally, do we act in such a way as we would have all other men act?

Nonsense. Morality is a matter of individual preferences, themselves no more than the result of formative experiences and natural inborn inclinations. For some people who have been conditioned in the Enlightenment mode, a Kantian style “What if …” argument might work; but not because it is sound, merely because they were so conditioned.

Let’s do away with the “What if …” argument and turn our eyes from Kant to Hume, who really knew what he was talking about!

#133: Multiculturalism is the Publicly Acceptable Face of Revolution

July 20, 2017

Revolution is by strategy and by ideology essentially an enemy of culture, any culture, attacking it without any of the constraints taken for granted in the culture under attack. Revolution operates in a field devoid of all values save one: the destruction of the existing order. This was true of the French Revolution, the Russian Revolution, and the Chinese Revolution under Mao. It was also true, arguably, of the “bloodless” German revolution of 1918 from which the Left emerged in power and, through the 1930s, proceeded to effect the “revaluation of all values,” in Nietzsche’s phrase. If Weimar was nothing else, it was certainly an attack on the culture of pre-war Europe. The only counter-example to this thesis is that of the American Revolution, which belongs much more in the category of colonial irredentism than genuine revolution. Genuine revolution only takes place within an existing cultural/political entity.

The difference I’m appealing to here really demands a separate blog post, but I’ll just say this much here: irredentism is a defense of a culture, specifically a local culture, against an alien outside occupier; revolution is an attack on a specific local culture from within. This means that, properly speaking, colonial wars of self-determination are not really revolutionary wars; they are irredentist wars. The American Revolution tends to fall between the cracks, since the colony actually shared the culture of the “occupier,” thus not exactly fitting the irredentist mold. Yet, arguably, the colony had diverged far enough culturally from the occupier to be classed as irredentist.

There are multiple, converging reasons why revolution is and has to be an enemy of the culture within which it is hatched. For one thing, the revolutionaries believe themselves and others like them to be excluded from the culture under attack, and they hate it for that reason. Their usual expression for this is that the existing culture is “unfair.” But more important, perhaps, is that they rightly perceive that a culture is an impediment to their program of taking power. A culture unifies and strengthens a social/political whole, so it makes perfectly good sense to go about undermining and eroding the culture under attack.

And the revolutionaries have a point: it is far easier to attack a culture than to defend it, and, once the cultural norms and assumptions lie in tatters and chaos reigns, it is much easier to take the government by force. Witness the two revolutions in Russia of 1917: the first, a democratic one unable to organize an utterly failing state, and the second, taken by force by the Bolsheviks. Out of chaos, tyranny.

But how to undermine a culture? Well, there are some obvious steps. One, infiltrate the universities and turn them into platforms of indoctrination. Two, take over the news media and turn them into platforms of indoctrination. Three, take over the entertainment industry and turn it into a platform of indoctrination. Four, take over the courts and have them legislate from the bench.

A culture is cemented by a social/historical mythology which contains some truths and some outright lies. The fact that it is a mythology is not a fact against it; its function is to provide a social cement. To attack it “as history” is intentionally to miss the point for a political purpose, which, of course, revolutionaries do.

But, lest they be accused of being “nihilists,” as some of them in the 19th century were accused of being, they also bring forward proposals for a “new” set of values, a “new” culture to replace the old. This is a clever strategy, a Trojan horse which conceals a world without any values at all.

The values of this new world are “tolerance” and “acceptance” for any peoples “not like ourselves,” no matter what the implications. But what are the implications? 

The new culture is one of “Multiculturalism.”

This is an interesting move. The U.S. motto is “E pluribus unum” (“Out of many, one.”). The revolutionaries’ new motto is: Out of one, many.

Multiculturalism is an ideology with a strategic purpose: it is dis-integrative of the existing social whole.

And there you have it: Multiculturalism is the public ideological face of a revolutionary movement bent on the take-over of the state through the gradual erosion of its historically rooted unifying identity.

Sadly, it’s an easy sell. The progressive fragmentation of the social unity begins with the fragmentation of the whole into ethnic parts. But it has to be seen that this is just the beginning. These parts are still too large. Fragmentation must continue, so further cracks in the social cement must be created. In addition to ethnic divisions, there must also be sexual divisions. Thus, to start, there are the homosexuals as well as the heterosexuals. But this is also not enough; there are also the transsexuals, but there are more yet to come. Add to these the zoophiles, the cannibals, the sadists, the masochists, the coprophiles, the necrophiles, and, of course, the a-sexuals … Dare we add, the pedophiles? They all have identities which must be respected because it’s just so wrong to judge. Be tolerant! Have no values at all!

Yes, this is an easy sell. Its message goes right to the heart of lazy, drugged, depressed, angry, failed narcissism. Who is immune to that siren call? Certainly not the deranged, screaming, eyes-bulging women of the “Million Women March.” Certainly not the deranged, screaming, eyes-bulging Millennials breaking windows and burning cars at “peaceful protests.”

For the revolutionaries, the beauty of this strategy lies in the fact that a culture’s social and legal conventions and rules presuppose stable and persisting identities and roles. 

The revolutionary strategy has been this: to begin by fragmenting the whole into ethnic parts, to progress to ever smaller parts, with the ultimate goal of a population of shape-shifters whose part-identity is a matter of momentary choice.

At this point, the social whole is no longer a nation, no longer a socially cohesive unit, and it is ready to be merged with all the other colorless, dying progressive polities into a single gigantic mass ruled by bleak, faceless bureaucrats at the UN oligarchy.

Oh, brave new world that hath such people in it.

#132: Mortal Loneliness

June 6, 2017


Many have noticed the religious nature of modern Leftist movements. The acceptance of a priesthood, the proscribing of apostates, the fervor, the hysteria are all hallmarks of the religious. Duly noted.

But more importantly, the same Left which exhibits these characteristics fails to be aware of them. If it were, it would also see the inescapable inference: people desperately need and want religious experience (even if they get it from atheism). Science and Leftist indoctrination have stolen the God of Abraham, Isaac, and Jacob from the West, but the need for a God, some God, remains. One such god was Communism; we see this explicitly recognized by the Western ex-commies who contributed to the 1949 book The God that Failed. I actually knew communist devotees in the 1950s, both in high school and college, some adults and some young men. All of them knew the works of Marx, Engels, and Lenin like a catechism and would discuss them with each other by reciting relevant passages. Not unlike the Christian theologians of the Middle Ages and the students of the Torah in yeshivas for millennia.

Nowadays, the god is sometimes “mother earth” and sometimes “humanity” and sometimes people have no idea what it is they are worshiping. People just want to worship. They always have, and have all around the world. The need and the impulse are hard wired in them.

But why?

The Existentialists were on the track of this answer, I think, but fell short of the mark. They identified the problem in the “meaninglessness” of human life. We’re born, we live, and not long after we die. Some got closer to the mark when they emphasized that we die alone. I think that’s the heart of the matter.

I remember a deeply disturbing scene in a science-fiction movie in which a man in a space-suit became untethered and slowly drifted out into deep space, alive for only as long as his air held out.

God, religion, and worship are answers to the fear and horror of loneliness in life and in death. Not the social loneliness that can be cured through companionship, but the loneliness experienced simply by being an inaccessible mind, a kind of solipsistic loneliness we suffer in the face of death. I call it “Mortal” loneliness.


Nietzsche announced “the death of God” in 1882. I think he was both premature and late to the party. The slide in God’s health could arguably be said to have begun with Galileo’s work with the telescope (ca. 1610). Many people date the beginning of God’s health problems with those beginnings of modern science. His health certainly declined during the 18th century as more and more intellectuals questioned His very existence. By the late 19th, publishing His obituary was scarcely a big deal. But was God actually dead? Maybe Nietzsche’s surrogate Zarathustra was releasing “fake news” to the media.

Some forty years earlier, Kierkegaard had far more perceptive thoughts. He asked the question later brought to the movie public in the title song of the movie “Alfie”:

What’s it all about, Alfie? Is it just for the moment we live?

Kierkegaard thought that it’s all about the personal bond of love. Not a love between man and woman, but a love between the particular man and a particular, personal God. Hmm, perhaps God was still alive at that point.

This really meant that religion was losing the battle with science not so much, perhaps, because of science’s enormous explanatory power, but because religion had taken on over the centuries the characteristics that science exemplified much more effectively. Aristotelian Christian theology became increasingly abstract and intellectual through the centuries, losing the very element which had been the source of its original power: the loving presence of a personal God in one’s heart and mind. Not an abstract God, but a God who was actually a person, who knew you as a person, and who was there with and for you right inside yourself until the last fragment of your consciousness left this earth. When you and God loved each other, you did not die alone. Had religion stuck with that, it would have done much better; it should never have gone into competition with science.


It’s not enough, though, to identify the direct benefits of God’s love here. There are complementary benefits as well. Those who love a personal God within the context of a religion have not only God’s love to sustain them; they have a further antidote to their mortal loneliness in the community of God-lovers in which they live. There are thus two sources of ease for the mortally lonely: God’s love and what I call “huddling.” There is an important lesson in this. Huddling is a powerful loneliness analgesic, but it admittedly comes at a price: those who huddle together often work at strengthening their huddle by rejecting others.

History has in various ways made the acceptance of a loving, personal God more and more difficult. Some people are still able to “make the leap” (so Nietzsche was wrong and God is still “alive”), but huge masses of people no longer have this solution available to them. Yet “huddling” is still available to them, though one of its most important forms has been systematically attacked by the internationalist Left: the huddling of nationalism. And, admittedly, nationalist huddles are often aggressively hostile to other nationalist huddles.

The Left is not ignorant of the political utility of loneliness; it herds the masses toward the huddles it finds useful, while driving them away from the ones it sees as problematic. The Left drives the masses away from Western God-religion huddles and from Western nationalist huddles, but it encourages earth-worship and humanity-worship huddles. And like all huddles, these encourage rage and hatred toward other huddles. This is captured beautifully in Tom Lehrer’s sardonic introduction to his song “National Brotherhood Week.” He says:

“I’m sure we all agree that we ought to love one another, and I know there are people in the world that do not love their fellow human beings, and I hate people like that.”


The mortal loneliness I’m speaking of was identified some years ago in sociology by Émile Durkheim. He called it “anomie.” I think it can fairly be described as the condition of feeling oneself to be without an identity. Another way to put it is “looking for a huddle” of one’s own. Huddles have membership requirements: characteristics that all members have and expect others to have as well. These characteristics constitute “identities” or “roles.” Sartre is brilliant in his discussions of personal identity and the way in which people adopt “roles.” In particular, he is very good at identifying the discomfort people feel when they are suddenly bereft of a role. Anomie is the condition of being role-less.

A world of role-less masses presents political opportunists with troubled waters. The Left loves to fish in troubled waters; such waters present it with crises “too good to waste.” And Western waters are very troubled these days with role-less masses.

We see this particularly in the U.S., though also in different ways in Europe. The two largest anomie populations in the U.S. are women and blacks. In the case of the former, traditional roles have been under systematic attack for over a hundred years. Whether one likes or dislikes the roles that women had in America, no one can claim that women have actually formed a new, modern identity. We saw this in the media’s coverage of the Women’s March on Washington. Many of the women interviewed were beside themselves with rage, but unable to articulate their issues except in pre-packaged Leftist talking points and platitudes. And among the young ones, we heard, over and over again, “I wanna make a difference,” “I wanna make the world a better place,” and other pap. We hear similar sentiments from beauty contest finalists: “I wanna work to end world hunger.”

And we’ve seen American Blacks experiment with one posture after another since the 1960s, the latest one being the “Black Lives Matter” movement.


We live in a time of anomie, a time of failing solutions to the problem of mortal loneliness. Some of the solutions are failing on their own; others are failing because of relentless attack. While the old solutions soldier on in places, huge masses are living in frightening personal isolation, looking either for an old personal god, a new personal god, or a group within which to be loved.

And even “being in search of an identity” has now become a prevalent identity!

#131: Dear Mrs. Merkel, Here’s A Modest Proposal

May 30, 2017

Mrs. Merkel, Mutti as the Germans fondly call her, doesn’t like Mr. Trump. A surprise? I think not; she probably feels much more comfortable with Putin. She is, after all, an East Berliner, an unreconstructed commie finally showing her true colors under stress.

She doesn’t like Trump? Gee, Pech, too bad. Europeans have long felt superior to Americans, but have been only too happy to accept their money. Maybe the time has come when they have to get along on their own.

Europe has not only become a welfare state, it has been America’s own welfare dependent since 1945. It started with the Marshall Plan and it hasn’t stopped for a moment since, the US taxpayer paying for European defense and, by that token, money being fungible, subsidizing Europe’s generous Socialist benefits. Even when France bungled its intervention in Libya, it had to beg armaments from the United States. Well, now they’ll have to repurpose their cheese factories for the production of ammo, though, on second thought, they’ll probably prefer Russian occupation to giving up their cheese. Ha! They’ll have to get used to watery borscht with rotting potatoes. No more cheese for François!

Mutti is bitter now, doesn’t relish having to pay her own way. Sad, bitter Mutti.

Since Monarchism finally failed and left a governance vacuum in Europe (starting with the French Revolution, through the Russian one, the bloodless German one of 1918, and the Chinese one), Socialisms of one stripe or another have been battling for possession of the continent. Into the Weimar vacuum after WW I swept the Spartacists and the Nazis, both Socialists; they fought in the streets, the Spartacists with Moscow’s backing, the Nazis feeding off a fictional past. The Nazis won that fight, but lost the war they began soon after, leaving the field open in 1945 to their erstwhile losers.

The post-war Socialists were no Stalinists; they chose what they called the “third way”: “Social Democracy.” Make no mistake about it, it was still Socialism (the “Sozis,” as its adherents were called). This Socialism learned from the commies’ mistakes as well as from their insights: on the one hand, it eschewed their heavy-handed dictatorships; on the other, it saw the value of Hitler’s partnerships with giant corporations. It implemented and perfected Crony-Socialism. And it’s worth noting that the first revolution in Russia in 1917 was in fact a democratic one, and that it fell very quickly to Lenin and his tactics. Lenin, however, and Stalin after him, tried to be doctrinaire Marxists and thoroughly destroyed Russia’s productive capacity (along with millions and millions of lives). Fortunately, there was no Lenin in 1945 Europe, and Social Democracy seemed able to take over governance. To some extent this was an illusion, since European reconstruction was being paid for by the United States, which has continued paying and paying ever since.

While it is true that Mutti’s bunch has been doing very well, the rest of Europe has not, particularly the Southern countries. Greece, Italy, Spain, Portugal, and to some extent France have been flirting with bankruptcy. The Norwegians have North Sea oil, so they’re doing OK; the Swedes’ immigration policies are bringing their country into deep trouble. I confess that I am not bullish on Europe. If the continent has to start paying its own way, expect tires burning in the streets; that’s how political discontent is expressed in Europe. Unsubsidized Socialism does not do well, even when it has wealth under its feet: witness Venezuela.

And while the US was subsidizing Europe’s recovery, the scribbler positions which had gone empty over the bad years were rapidly being filled with the old commie intellectual hold-overs from the 1920s and ’30s. As well, the GI Bill and the reconstruction of Europe created huge numbers of new openings for scribblers. From their various platforms in the universities and the mass media, they worked away at boring from within: a Socialist/Communist fifth column within the Western world. Their goal was nothing less than the one which had animated Lenin and Stalin and Mao, namely world domination. They worked directly, by eroding and undermining the cultural distinctiveness of the Western nations (their values, conventions, and religions), and indirectly, by importing populations utterly unable and unwilling to share and support the Western cultures hosting them. They hoped to produce a population-wide anomie into which one-world-Socialism could easily slide.

Hussein Obama thought he had a chance to turn the US into a domesticated Socialist cash cow for an increasingly socialistically unified world: an analogue, in effect, of an imperial colony for Europe. Just as the French sucked North Africa dry without giving anything in return, Hussein thought Europe as a whole could enjoy its sophisticated civilization on the revenue from its wealthy, primitive colony, the United States. He overshot, as did the Europeans, and now they’re just sad, bitter, and frustrated. Poor Mutti. Hussein has indeed managed to increase an anomie that was already present, which is why we’re seeing deranged screaming and violence in the streets. But his grasp of arithmetic was perhaps not up to the task of predicting an election outcome: there just weren’t enough of the crazed available (yet) to turn the trick. Poor sad, bitter Democrats; bitter, sad Europeans.

But really, now, they should be more understanding. They should just reflect on how annoying they themselves find the Greeks’ unwillingness to pay their own way. The Greeks expect the rest of Europe, particularly Mutti and her people, to subsidize their leisurely lives. The Germans are not enthused about this; why should the Americans be any more so? But this would require a level of perspective perhaps too demanding of a commie-manquée like Mutti.

So, Mrs. Merkel, here’s a modest proposal given in the spirit of communal contribution with which you are so familiar: Jeder nach seinen Fähigkeiten, jedem nach seinen Bedürfnissen. (“From each according to his ability, to each according to his needs.” — Karl Marx). The EU has become more than a mere economic union; it has evolved into an entity which speaks as if it were the unified voice of the continent on all matters social, moral, and political. I suggest, therefore, that the EU also speak with a single, unified purse on the continent’s financial obligations. Let the EU pay the 2% (or more) annual dues to NATO, rather than having individual nations of differing means struggle to meet their commitments, eh? How about them apples, Mutti?

Now, I know that this might hit Germany a bit hard since it is currently a deadbeat, but give the plan a chance. A quick look at the stats (defense spending as a share of GDP) shows where Germany falls:

United States: 3.61%
Greece: 2.38%
Britain: 2.21%
Estonia: 2.16%
Poland: 2.00%
France: 1.78%
Turkey: 1.56%
Norway: 1.54%
Lithuania: 1.49%
Romania: 1.48%
Latvia: 1.45%
Portugal: 1.38%
Bulgaria: 1.35%
Croatia: 1.23%
Albania: 1.21%
Germany: 1.19%
Denmark: 1.17%
Netherlands: 1.17%
Slovakia: 1.16%
Italy: 1.11%
Czech Republic: 1.04%
Hungary: 1.01%
Canada: 0.99%
Slovenia: 0.94%
Spain: 0.91%
Belgium: 0.85%
Luxembourg: 0.44%

Gee, Mutti, aren’t you just a little embarrassed that Greece of all nations should be paying its bills better than you? But for goodness’ sake, Mutti, even Albania is cleaning your clock, making you look bad!

So consider my solution. Sure, you’re a deadbeat nation, but if you follow my suggestion, that can be nicely obscured. You can bury your non-compliance within the single payment made on the EU’s behalf. Not only that, but you can probably bully some of your more pathetic fellow members into upping their contributions so that Germany can pay even less! Maybe you can intimidate languishing France into coughing up 2.5% or more. Remind them of the last two wars, eh? Think it over, Mutti, I’ve got your back.