Nightcap

  1. Plato and teaching foreign policy Luke Perez, Duck of Minerva
  2. Suicidal elites Joel Kotkin, Quillette
  3. Debating the far right Chris Dillow, Stumbling & Mumbling
  4. Buddhist Hell park Laetitia Barbier, Atlas Obscura

Social noble lies

In the Republic, Socrates introduced the “noble lie”: government officials may, on occasion, be justified in propagating lies to their constituents in order to advance a more just society. Dishonesty is one tool of the political class (or even pre-political — the planning class) for securing order. This idea is at the center of the debate about transparency in government.

Then, in the 20th century, when academic Marxism was in its prime, the French Marxist philosopher Louis Althusser became concerned with the issue of social reproduction. How does a society survive from one generation to the next, with most of its mores, morals and economic model still in place? This question was of particular interest to the Orthodox Marxists: their conflict theory of history doesn’t illuminate how a society is held together, since competing groups are always struggling for power. Althusser came up with “Ideological State Apparatuses”: institutions, coercive or purely ideological, that reinforce societal beliefs across generations. This necessarily includes the intelligence agencies, like the CIA and FBI, and state thugs, like the Gestapo and NKVD, but it also includes the family unit (authorized by a marriage contract), public education and the political party system. “ISAs” also include traditions in the private sector, since for Althusser, the state exists primarily to protect these interests.

It’s rarely as simple as pointing to a country and saying, “This is the dominant ideology.” However, and here the Marxists are right, it can be useful to observe the material trends of citizens: what sorts of interests people (of any class) save up money for, teach their children to admire, etc. In the United States, there is a conditional diversity of philosophies: many different strains abound, but most fit within the small notecard of acceptable opinion. Someone like Althusser might say there is a single philosophy in effect — liberal capitalism — getting reproduced across apparatuses; a political careerist might recognize antagonists across the board vying for their own particular interests. In any case, the theory of ISAs is one answer to conflict theory’s deficiencies.

There is no reason, at any time, to think that most of the ideas spreading through a given society are true. Plenty of people could point to a lesson taught in a fifth grade classroom and find something they disagree with, and not just because elementary school lessons are often simplified to the point of distortion. Although ideas often spread naturally, they can also be thrust upon a people, like agitprop or Uncle Sam, and their influence can be more or less deleterious.

Those outlooks thrust upon a people might take the form of a noble lie. I can give qualified support for noble lies, but not when they come from the government. (The idea that noble lies are a right of government implies some sort of unique power for government actors.) There are currently two social lies which carry a lot of weight in the States. The first one comes from the political right, and it says: anyone can work their way to financial security. Anyone can come from the bottom and make a name for themselves. Sentiment like this is typically derided as telling people to pull themselves up by their bootstraps, and in the 21st century we find this narrative is losing force.

The second lie comes from the left, and it says: the system is rigged for xyz privileged classes, and it’s necessarily easier for members of these groups to succeed than it is for non-members. White people, specifically white men, all possess better opportunities in society than others. This theory, on the other hand, is increasingly popular, and continues to spawn vicious spinoffs.

Of the two, neither is true. That said, it’s clear which is the more “socially useful” lie. A lie which encourages personal responsibility is clearly healthier than one which blames all of one’s ills on society and others. If you tell someone long enough that their position is out of their hands because the game is rigged, they will grow frustrated and hateful, lose touch with their own creative power, and opt to seek rent instead. One lie, in short, promotes individualism; the other, tribalism.

Althusser wrote before the good old-fashioned class struggle of Marxism died out, before the postmodernists splintered the left into undialectical identity politics. God knows what he would think of intersectionality, the ninth circle of progressivism’s Inferno. These ideas are being spread regardless of what anyone does, are incorporated into “apparatuses” of some sort, and are both false. If we had to choose one lie to tell, though, the preferable one is obvious to me: the one which doesn’t imply collectivism in politics and tribalism in culture.

The death of reason

“In so far as their only recourse to that world is through what they see and do, we may want to say that after a revolution scientists are responding to a different world.”

Thomas Kuhn, The Structure of Scientific Revolutions, p. 111

I can remember arguing with my cousin right after Michael Brown was shot. “It’s still unclear what happened,” I said, “based solely on testimony” — at that point, we were still waiting on the federal autopsy report by the Department of Justice. He said that in the video, you can clearly see Brown, back to the officer and with his hands up, as he is shot up to eight times.

My cousin doesn’t like police. I’m more ambivalent, but I’ve studied criminal justice for a few years now, and I thought that if both of us watched this video (no such video actually existed), it was probably I who would have the more nuanced grasp of what happened. So I said: “Well, I will look up this video, try and get a less biased take and get back to you.” He replied, sarcastically, “You can’t watch it without bias. We all have biases.”

And that seems to be the sentiment of the times: bias encompasses the human experience; it subsumes all judgments and perceptions. Biases are so rampant, in fact, that no objective analysis is possible. These biases may be cognitive, like confirmation bias, emotional fallacies or the phenomenon of constructive memory; or inductive, like selectivity or ignoring base rates; or, as has been common to think, ingrained into experience itself.

The thing about biases is that they are open to psychological evaluation. There are precedents for eliminating them. For instance, one common explanation of racism is that familiarity breeds acceptance, and unfamiliarity breeds intolerance (as Reason points out, people further from fracking sites have more negative opinions on the practice than people closer). So to curb racism (a sort of bias), children should interact with people outside their own ethnic group. More clinical methodology seeks to transform automatic mental functions into controlled ones, thereby introducing reflective measures into perception and reducing bias. Apart from these, there is that ancient Greek practice of reasoning, wherein patterns and evidence are used to generate logical conclusions.

If it were true that human bias is all-encompassing, and essentially insurmountable, the whole concept of critical thinking goes out the window. Not only do we lose the critical-rationalist, Popperian mode of discovery, but also Socratic dialectic, as “higher truths” disappear from the human lexicon.

The belief that biases are intrinsic to human judgment ignores psychological and philosophical methods to counter prejudice, because it posits that objectivity itself is impossible. This viewpoint has been associated with “postmodern” schools of philosophy, such as those Dr. Rosi commented on (e.g., those of Derrida, Lacan, Foucault, Butler), although it’s worth pointing out that the analytic tradition, with its origins in Frege, Russell and Moore, represents a far greater break from the previous, modern tradition of Descartes and Kant, and often reached conclusions similar to the Continentals’.

Although theorists of the “postmodern” clique produced diverse claims about knowledge, society, and politics, the most famous figures are almost always associated with or incorporated into the political left. To make a useful simplification of viewpoints: it would seem that progressives have generally accepted Butlerian non-essentialism about gender and Foucauldian terminology (discourse and institutions). Derrida’s poststructuralist critique noted dichotomies and also claimed that the philosophical search for Logos has been patriarchal, almost neoreactionary. (The month before Donald Trump’s victory, searches for the word patriarchy hit an all-time high on Google.) It is not a far-right conspiracy that European philosophers with strange theories have influenced and sought to influence American society; it is patent in the new political language.

Some people think of the postmodernists as all social constructivists, holding the theory that many of the categories and identifications we use in the world are social constructs without a human-independent nature (e.g., not natural kinds). Disciplines like anthropology and sociology have long since dipped their toes into constructivism, and the broader academic community, too, accepts that things like gender and race are social constructs. But the ideas can and do go further: “facts” themselves are open to interpretation on this view; to even assert a “fact” is just to affirm power of some sort. This worldview subsequently degrades the status of science into an extended apparatus for confirmation bias, filling out the details of a committed ideology rather than providing us with new facts about the world. There can be no objectivity outside of a worldview.

Even though philosophy took a naturalistic turn with W. V. O. Quine, seeing itself as integrating with and working alongside science, the criticisms of science as an establishment that emerged in the 1950s and 60s (and earlier) often disturbed science’s unique epistemic privilege in society: ideas that theory is underdetermined by evidence, that scientific progress is nonrational, that unconfirmed auxiliary hypotheses are required to conduct experiments and form theories, and that social norms play a large role in the process of justification all damaged the mythos of science as an exemplar of human rationality.

But once we have dismantled Science, what do we do next? Some critics have held up Nazi German eugenics and phrenology as examples of the damage that science can do to society (never mind that we now consider them pseudoscience). Yet Lysenkoism and the history of astronomy and cosmology indicate that suppressing scientific discovery can be just as deleterious. The Austrian philosopher of science Paul Feyerabend instead wanted a free society — one where science held power equal to older, more spiritual forms of knowledge. He thought the model of rational science exemplified by Sir Karl Popper was inapplicable to the real machinery of scientific discovery, and that the only methodological rule we could impose on science was: “anything goes.”

Feyerabend’s views are almost a caricature of postmodernism, although he denied the label “relativist,” opting instead for “philosophical Dadaist.” In his pluralism, there is no hierarchy of knowledge, and state power can even be introduced when necessary to break up scientific monopoly. Feyerabend, contra scientists like Richard Dawkins, thought that science was like an organized religion and therefore supported a separation of church and state as well as a separation of state and science. Here is a move forward for a society that has started distrusting the scientific method… but if this is what we should do post-science, it’s still unclear how to proceed. There are still questions for anyone who loathes the hegemony of science in the Western world.

For example, how does the investigation of crimes proceed without strict adherence to the latest scientific protocol? Presumably, Feyerabend didn’t want to privatize law enforcement, but science and the state are very intricately connected. In 2005, Congress authorized the National Academy of Sciences to form a committee and conduct a comprehensive study of contemporary forensic science to identify community needs, consulting laboratory executives, medical examiners, coroners, anthropologists, entomologists, odontologists, and various legal experts. Forensic science — scientific procedure applied to the field of law — exists for two practical goals: exoneration and prosecution. However, the Forensic Science Committee revealed that severe issues riddle forensics (e.g., bite mark analysis), and in its list of recommendations the top priority is establishing an independent federal entity to devise consistent standards and enforce regular practice.

For top scientists, this sort of centralized authority seems necessary to produce reliable work, and it cuts entirely against Feyerabend’s emphasis on methodological pluralism. Barack Obama formed the National Commission on Forensic Science in 2013 to further investigate problems in the field, and only recently Attorney General Jeff Sessions said the Department of Justice will not renew the commission. It’s unclear now what forensic science will do to resolve its ongoing problems, but what is clear is that the American court system would fall apart without the possibility of appealing to scientific consensus (especially forensics), and that the only foreseeable way to solve the existing issues is through stricter methodology. (Just as with McDonald’s, standards are enforced so that the product is consistent wherever one orders.) More on this later.

So it doesn’t seem to be in the interest of things like due process to abandon science or completely separate it from state power. (It does make sense, however, to move forensic laboratories out from under direct administrative control, as the NAS report notes in Recommendation 4, though specifically to reduce bias.) In a culture where science is viewed as irrational, Eurocentric, ad hoc, and polluted with ideological motivations — or where Reason itself is seen as a particular hegemonic, imperial device to suppress different cultures — not only do we not know what to do; when we try to do things, we lose elements of our civilization that everyone agrees are valuable.

Although Aristotle separated pathos, ethos and logos (adding that all inform each other), later philosophers like Feyerabend thought of reason as a sort of “practice,” with a history and connotations like any other human activity, falling far short of the sublime. One could no more justify reason outside of its European cosmology than the sacrificial rituals of the Aztecs outside of theirs. To communicate across paradigms, participants have to understand each other on a deep level, even becoming entirely new persons. When debates happen, they must happen on a principle of mutual respect and curiosity.

From this one can detect a bold argument for tolerance. Indeed, Feyerabend was heavily influenced by John Stuart Mill’s On Liberty. Maybe, in a world disillusioned with scientism and objective standards, the next cultural move is multilateral acceptance of, and tolerance for, each other’s ideas.

This has not been the result of postmodern revelations, though. The 2016 election featured the victory of one psychopath over another, from two camps utterly consumed with vitriol for each other. Between Bernie Sanders, Donald Trump and Hillary Clinton, Americans drifted toward radicalization as the only establishment candidate seemed to offer the same noxious, warmongering mess of the previous few decades of administration. Politics has only polarized further since the inauguration. The alt-right, a nearly perfect symbol of cultural intolerance, is regular news for mainstream media. Trump acolytes physically brawl with black bloc Antifa in the same city as the 1960s Free Speech Movement. It seems worst at universities. Analytic feminist philosophers asked for the retraction of a controversial paper, seemingly without reading it. Professors even get involved in student disputes, at Berkeley and, more recently, Evergreen. The names each side uses to attack the other (“fascist,” most prominently) — sometimes accurate, usually not — display a political divide between groups that increasingly refuse to argue their own side and prefer silencing their opposition.

There is no tolerant left or tolerant right any longer, in the mainstream. We are witnessing only shades of authoritarianism, eager to destroy each other. And what is obvious is that the theories and tools of the postmodernists (post-structuralism, social constructivism, deconstruction, critical theory, relativism) are as useful for reactionary praxis as for their usual role in left-wing circles. Says Casey Williams in the New York Times: “Trump’s playbook should be familiar to any student of critical theory and philosophy. It often feels like Trump has stolen our ideas and weaponized them.” The idea of the “post-truth” world originated in postmodern academia. It is the monster turning against Doctor Frankenstein.

Moral (cultural) relativism in particular promises only the rejection of our shared humanity. It paralyzes our judgment on female genital mutilation, flogging, stoning, human and animal sacrifice, honor killing, caste, the underground sex trade. The afterbirth of Protagoras, cruelly resurrected once again, does not promise trials at Nuremberg, where the Allied powers appealed to something above and beyond written law to exact judgment on mass murderers. It does not promise justice for the ethnic cleansers of Srebrenica, if the United Nations is helpless to impose a tribunal from outside Bosnia-Herzegovina. Today, this moral pessimism laughs at the phrase “humanitarian crisis,” and at Western efforts to change the material conditions of fleeing Iraqis, Afghans, Libyans, Syrians, Venezuelans, North Koreans…

In the absence of universal morality, and with the introduction of subjective reality, the vacuum will be filled with something much more awful. And we should be afraid of this, because tolerance has not emerged as a replacement. When Harry Potter first encounters Voldemort face-to-scalp, the Dark Lord tells the boy: “There is no good and evil. There is only power… and those too weak to seek it.” With the breakdown of concrete moral categories, Feyerabend’s motto — anything goes — is perverted. Voldemort has been compared to Plato’s archetype of the tyrant from the Republic: “It will commit any foul murder, and there is no food it refuses to eat. In a word, it omits no act of folly or shamelessness” … “he is purged of self-discipline and is filled with self-imposed madness.”

Voldemort is the Platonic appetite in the same way he is the psychoanalytic id. Freud’s das Es is able to admit of contradictions, to violate Aristotle’s fundamental laws of logic. It is so base, so removed from the ordinary world of reason, that it follows rules of its own that we would find utterly abhorrent or impossible. But it is not difficult to imagine that the murder of evidence-based reasoning will result in Death Eater politics. The ego is our rational faculty, adapted to deal with reality; with the death of reason, all that remains is vicious criticism and unfettered libertinism.

Plato predicts Voldemort with the image of the tyrant, and also with one of his primary interlocutors, Thrasymachus, when the sophist opens with “justice is nothing other than the advantage of the stronger.” The one thing Voldemort admires about The Boy Who Lived is his bravery, the one trait they share. It is missing in his Death Eaters. In the fourth novel the Dark Lord is cruel to his reunited followers for abandoning him and losing faith; their cowardice reveals the fundamental logic of his power: his disciples are not true devotees, but opportunists, weak on their own merit and drawn like moths to every Avada Kedavra. Likewise, students flock to postmodern relativism to justify their own beliefs when the evidence is an obstacle.

Relativism gives us moral paralysis, allowing in darkness. Another possible move after relativism is supremacy. One look at Richard Spencer’s Twitter demonstrates the incorrigible tenet of the alt-right: the alleged incompatibility of cultures, ethnicities, and races: that different groups of humans simply cannot get along together. The Final Solution is not about extermination anymore but segregated nationalism. Spencer’s audience is almost entirely men who loathe the current state of things, who share far-reaching conspiracy theories, and who despise globalism.

The left, too, creates conspiracies, imagining a bourgeois corporate conglomerate that enlists economists and brainwashes the public through history books to normalize capitalism; for this reason it despises globalism as well, saying it impoverishes other countries or destroys cultural autonomy. For the alt-right, it is the Jews, and George Soros, who control us; for the burgeoning socialist left, it is the elites, the one percent. Our minds are not free; fortunately, each side will happily supply Übermenschen, in the form of statesmen or critical theorists, to save us from our degeneracy or our false consciousness.

Without a commitment to reasoned debate, tribalism has deepened the polarization and eroded humility. Each side also accepts science selectively, when it does not question science’s very justification. The privileged status that the “scientific method” maintains in polite society is denied when convenient: whether it is climate science, evolutionary psychology, sociology, genetics, biology, anatomy or, especially, economics, one side or the other outright rejects it, without studying the material enough to immerse themselves in what could be promising knowledge (as Feyerabend urged, and as the breakdown of rationality could have encouraged). And ultimately, equal protection, one tenet of individualist thought that allows for multiplicity, is entirely rejected by both: we should be treated differently as humans, often because of the color of our skin.

Relativism and carelessness about standards and communication have given us supremacy and tribalism. They have divided rather than united. Voldemort’s chaotic violence is one possible outcome of rejecting reason as an institution, and it beckons to either political alliance. Are there any examples in Harry Potter of the alternative, Feyerabendian tolerance? Not quite. However, Hermione Granger serves as the Dark Lord’s foil, and gives us a model of reason that is not as archaic as the enemies of rationality would like to suggest. In Against Method (1975), Feyerabend compares different ways rationality has been interpreted alongside practice: in an idealist way, in which reason “completely governs” research, or a naturalist way, in which reason is “completely determined by” research. Taking elements of each, he arrives at an intersection in which one can change the other, both “parts of a single dialectical process.”

“The suggestion can be illustrated by the relation between a map and the adventures of a person using it or by the relation between an artisan and his instruments. Originally maps were constructed as images of and guides to reality and so, presumably, was reason. But maps, like reason, contain idealizations (Hecataeus of Miletus, for example, imposed the general outlines of Anaximander’s cosmology on his account of the occupied world and represented continents by geometrical figures). The wanderer uses the map to find his way but he also corrects it as he proceeds, removing old idealizations and introducing new ones. Using the map no matter what will soon get him into trouble. But it is better to have maps than to proceed without them. In the same way, the example says, reason without the guidance of a practice will lead us astray while a practice is vastly improved by the addition of reason.” p. 233

Christopher Hitchens pointed out that Granger sounds like Bertrand Russell at times, as in this quote about the Resurrection Stone: “You can claim that anything is real if the only basis for believing in it is that nobody has proven it doesn’t exist.” Granger is often the embodiment of anemic analytic philosophy, the institution of order, a disciple of the Ministry of Magic. However, though initially law-abiding, she quickly learns with Potter and Weasley the pleasures of rule-breaking. From the first book onward, she is constantly at odds with the de facto norms of the school, becoming more rebellious as time goes on. It is her levelheaded foundation, combined with her ability to transgress rules, that gives her an astute semi-deontological, semi-utilitarian calculus capable of saving the lives of her friends from the dark arts, and of helping to defeat the tyranny of Voldemort foretold by Socrates.

Granger presents a model of reason like Feyerabend’s map analogy. Although pure reason gives us an outline of how to think about things, it is not a static or complete blueprint, and it must be fleshed out with experience, risk-taking, discovery, failure, loss, trauma, pleasure, offense, criticism, and occasional transgressions past the foreseeable limits. Folding these addenda into our heuristics means we explore a more diverse account of thinking about things and of moving around in the world.

When reason is increasingly seen as patriarchal, Western, and imperialist, the only thing consistently offered as a replacement is something like lived experience. Some form of this idea is at least a century old, going back to Husserl, and still modest by reason’s Greco-Roman standards. Yet lived experience has always been pivotal to reason; we need only adjust our popular model. And we can see that we need not reject one or the other entirely. Another critique says reason is foolhardy, limiting, antiquated; this is a perversion of its abilities, and plays into the first criticism. We can see that there is room within reason for other pursuits and virtues, picked up along the way.

The emphasis on lived experience, which predominantly comes from the political left, is also antithetical to the cause of “social progress.” Those sympathetic to social theory, particularly the cultural leakage of the strong programme, are constantly torn between claiming (a) science is irrational, and can thus be countered by lived experience (or whatnot), or (b) science may be rational but reason itself is a tool of patriarchy and white supremacy and cannot be universal. (If you haven’t seen either of these claims very frequently, and think them a strawman, you have not been following university protests and editorials. Or radical Twitter: ex., ex., ex., ex.) Of course, as in Freud, this is an example of kettle logic: the signal of a very strong resistance. We see, though, that we can refuse both claims without losing anything. Reason need not be stagnant nor all-pervasive, and indeed we’ve been critiquing its limits since 1781.

Outright denying the process of science — whether the model is conjectures and refutations or something less stale — ignores that there is no single uniform body of science. Denial also dismisses the most powerful tool we have for making difficult empirical decisions. Michael Brown’s death was instantly a political affair, with implications for broader social life, and the event has completely changed the face of American social issues. The first autopsy report, from St. Louis County, indicated that Brown was shot at close range in the hand, during an encounter with Officer Darren Wilson. The second, independent report, commissioned by the family, concluded the first shot had not in fact been at close range. After the disagreement with my cousin, the Department of Justice released the final investigation report, which determined that material in the hand wound was consistent with gun residue from an up-close encounter.

Prior to the report, the best evidence available as to what happened in Missouri on August 9, 2014, was the ground footage after the shooting and testimony from the officer and Ferguson residents at the scene. There are two ways to approach the incident: reason or lived experience. The latter route leads to ambiguities. Brown’s friend Dorian Johnson and another witness reported that Officer Wilson fired his weapon first at range, under no threat, then pursued Brown out of his vehicle, until Brown turned with his hands in the air to surrender. However, before the St. Louis grand jury, half a dozen (African-American) eyewitnesses corroborated Wilson’s account: that Brown did not have his hands raised and was moving toward Wilson. In which direction does “lived experience” tell us to go, then? A new moral maxim — the duty to believe people — will lead to no non-arbitrary conclusion. (And a duty to “always believe x,” where x is a closed group, e.g. victims, puts the cart before the horse.) It appears that, in a case like this, treating evidence as objective is the only solution.

Introducing ad hoc hypotheses (e.g., that the Justice Department and the county examiner are corrupt) shifts the approach into one that uses induction, and leaves lived experience behind (it also ignores how forensic anthropology is actually done). This is, in fact, the introduction of scientific standards. (By looking at incentives for lying, it might also employ findings from public choice theory, psychology, behavioral economics, etc.) So the personal-experience method creates unresolvable ambiguities, and presumably must eventually grant some allowance to scientific procedure.

If we don’t posit a baseline rationality — Hermione Granger pre-Hogwarts — our ability to critique things at all disappears. Utterly rejecting science and reason, denying objective analysis on the presumption of overriding biases, breaking down naïve universalism into naïve relativism — these are paths to paralysis on their own. More than that, they are hysterical symptoms, because they often create problems out of thin air. Recently, a philosopher and a mathematician submitted a hoax paper, Sokal-style, to a peer-reviewed gender studies journal in an attempt to demonstrate what they see as a problem “at the heart of academic fields like gender studies.” The idea was to write a nonsensical, postmodernish essay; if the journal accepted it, that would indicate the field is intellectually bankrupt. Andrew Smart at Psychology Today instead wrote of the prank: “In many ways this academic hoax validates many of postmodernism’s main arguments.” And although Smart makes some informed points about problems in scientific rigor as a whole, he doesn’t hint at what the validation of postmodernism entails: should we abandon standards in journalism and scholarly integrity? Is the whole process of peer review functionally untenable? Should we start embracing papers written without any intention of making sense, to look for knowledge concealed below the surface of jargon? The paper, “The conceptual penis,” doesn’t necessarily condemn the whole of gender studies; but, against Smart’s reasoning, we do in fact know that counterintuitive or highly heterodox theory is considered perfectly average.

There were other attacks on the hoax, from Slate, Salon, and elsewhere. The criticisms, often valid for the particular essay, typically didn’t move the conversation far enough. There is much more to this discussion. A 2006 paper from the International Journal of Evidence Based Healthcare, “Deconstructing the evidence-based discourse in health sciences,” called the use of scientific evidence “fascist.” In the abstract the authors state their allegiance to the work of Deleuze and Guattari. Real Peer Review, a Twitter account that collects abstracts from scholarly articles, regularly features essays from departments of women’s and gender studies, including a recent one from a Ph.D. student wherein the author identifies as a hippopotamus. Sure, the recent hoax paper doesn’t really say anything, but it intensifies this much-needed debate. It brings out these two currents — reason and the rejection of reason — and demands a solution. And we know that lived experience is often going to be inconclusive.

Opening up lines of communication is a solution. One valid complaint is that gender studies seems too insulated, in a way in which chemistry, for instance, is not. Critiquing a whole field does ask us to genuinely immerse ourselves first, and this is a step toward tolerance: it is a step past the death of reason and the denial of science. It is a step that requires opening the bubble.

The modern infatuation with human biases, together with Feyerabend’s epistemological anarchism, upsets our faith in prevailing theories and in the idea that our policies and opinions should be guided by the latest discoveries from an anonymous laboratory. Putting politics first and assuming subjectivity is all-encompassing, we move past objective measures for comparing belief systems and theories. However, isn’t the whole operation of modern science designed to work within our means? Kant’s system set limits on human rationality, and most science is aligned with an acceptance of fallibility. As Harvard cognitive scientist Steven Pinker says, “to understand the world, we must cultivate work-arounds for our cognitive limitations, including skepticism, open debate, formal precision, and empirical tests, often requiring feats of ingenuity.”

Pinker goes so far as to advocate for scientism. Others need not; but we must understand an academic field before utterly rejecting it. We must think we can understand each other, and live with each other. We must think there is a baseline framework that allows permanent cross-cultural correspondence — a shared form of life which means a Ukrainian can interpret a Russian, and a Cuban an American. The rejection of Homo sapiens commensurability, championed by people like Richard Spencer and those in identity politics, is a path to segregation and supremacy. We must reject Gorgian nihilism about communication, and the Presocratic relativism that camps our moral judgments in inert subjectivity. From one Weltanschauung to the next, our common humanity — which endures across class, ethnicity, sex, and gender — allows open debate across paradigms.

In the face of relativism, there is room for a nuanced middle ground between Pinker’s scientism and the rising anti-science, anti-reason philosophy; Paul Feyerabend has sketched out a basic blueprint. Rather than condemning reason as a Hellenic germ of Western cultural supremacy, we need only adjust the theoretical model to incorporate the “new America of knowledge” into our critical faculty. It is the raison d’être of philosophers to present complicated things in a more digestible form; to “put everything before us,” as Wittgenstein says. Hopefully, people can reach their own conclusions, and embrace the communal human spirit as they do.

However, this may not be so convincing. It might be true that we have a competition of cosmologies: one that believes in reason and objectivity, one that thinks reason is callow and all things are subjective. These two perspectives may well be incommensurable. If I try to defend reason, I invariably must appeal to reasons, and thus argue circularly. If I try to claim “everything is subjective,” I make a universal statement, and simultaneously contradict myself. Between begging the question and contradicting oneself, there is not much indication of where to go. Perhaps we just have to look at history, note the results of either course where it has been applied, and take those results as a rhetorical indication of which path to choose.

The existentialist origins of postmodernism

In part, postmodernism has its origin in the existentialism of the 19th and 20th centuries. The Danish theologian and philosopher Søren Kierkegaard (1813-1855) is generally regarded as the first existentialist. Kierkegaard’s life was profoundly marked by a broken engagement and by his discomfort with the formalities of the (Lutheran) Church of Denmark. In his understanding (shared by others of the time within a movement known as Pietism, influential mainly in Germany but also a strong influence on the English Methodism of John Wesley), Lutheran theology had become overly intellectual, marked by a “Protestant scholasticism.”

Before this period, Scholasticism was a branch of Catholic theology whose main representative was Thomas Aquinas (1225-1274). Aquinas argued against the theory of double truth, defended by Muslim theologians of his time. According to this theory, something could be true in religion and not be true in the empirical sciences. Aquinas defended a classic concept of truth, used centuries earlier by Augustine of Hippo (354-430), to affirm that truth could not be so divided. Martin Luther (1483-1546) made many criticisms of Aquinas, but ironically the methodological precision of the medieval theologian became quite influential in the Lutheran theology of the 17th and 18th centuries. In Germany and the Nordic countries (Denmark, Finland, Iceland, Norway and Sweden) Lutheranism became the state religion after the Protestant Reformation of the 16th century, and being the pastor of a church in a major city became a respected and coveted public office.

It is against this intellectualism and this ease of being Christian that Kierkegaard revolts. In 19th-century Denmark, everyone was born within the Lutheran Church, and being a Christian was the socially accepted position. Kierkegaard complained that in centuries past being a Christian was not easy, and could even involve life-threatening events. In the face of this he argued for a Christianity that involved an individual decision against all evidence. In one of his most famous texts he expounds the story in which the patriarch Abraham is asked by God to kill Isaac, his only son. Kierkegaard imagines a scenario in which Abraham does not understand the reasons of God, but ends up obeying blindly. In Kierkegaard’s words, Abraham takes “a leap of faith.”

This concept of blind faith, going against all the evidence, is central to Kierkegaard’s thinking, and became very influential in twentieth-century Christianity and even in other religions established in the West. Beyond the strictly religious aspect, Kierkegaard marked Western thought with the notion that some things might be true in some areas of knowledge but not in others. Moreover, his influence can be seen in the notion that the individual must make decisions about how he intends to exist, regardless of the rules of society or of all empirical evidence.

Another important existentialist philosopher of the 19th century was the German Friedrich Nietzsche (1844-1900). Like Kierkegaard, Nietzsche was also raised within Lutheranism but, unlike Kierkegaard, he became an atheist in his adult life. Like Kierkegaard, Nietzsche also became a critic of the social conventions of his time, especially the religious conventions. Nietzsche is particularly famous for the phrase “God is dead.” This phrase appears in one of his most famous texts, in which the Christian God attends a meeting with the other gods and affirms that he is the only god. In the face of this statement the other gods die laughing. The Christian God effectively becomes the only god. But later, the Christian God dies of pity at seeing his followers on earth becoming people without courage.

Nietzsche was particularly critical of how the Christianity of his day valued features he considered weak, calling them virtues, and condemned features he considered strong, calling them vices. And not just Christianity: Nietzsche also criticized the classical philosophy of Socrates, Plato, and Aristotle, placing himself alongside the sophists. The German philosopher affirmed that Socrates valued behaviors like kindness, humility, and generosity simply because he was ugly. More specifically, Nietzsche questioned why classical philosophers defended Apollo, considered the god of wisdom, and criticized Dionysus, considered the god of debauchery. In Greco-Roman mythology Dionysus (or Bacchus, as he was known by the Romans) was the god of festivals, wine, and madness, symbolizing everything that is chaotic, dangerous, and unexpected. Thus, Nietzsche questioned the apparent arbitrariness of the defense of Apollo’s rationality and order against the irrationality and unpredictability of Dionysus.

Nietzsche’s philosophy values courage and voluntarism, the urge to go against “herd behavior” and become a “superman,” that is, a person who goes against the dictates of society to create his own rules. Although he went in a different religious direction from Kierkegaard, Nietzsche agreed with the Danish theologian on the need for the individual to go against convention, and against reason, to dictate the rules of his own existence.

In the second half of the 20th century existentialism became an influential philosophical current, represented by people like Jean-Paul Sartre (1905-1980) and Albert Camus (1913-1960). Like their predecessors of the 19th century, these existentialists pointed to the apparent absurdity of life and valued decision-making by the individual against rational and social dictates.

Liberty or “Security”: An Old Debate, A Familiar Straw Man

Ho hum. Jacques wants his government to do three things in the name of fighting Muslim terrorism (not to be confused with other, more numerous kinds of terrorism): 1) allow for an armed, perpetually-on-alert military to be active on US soil, 2) allow for a surveillance state that can do as it pleases with regard to Muslims only, and 3) initiate ideological quotas for Muslim immigrants.

The entire ‘comments’ thread is well worth reading, too. Dr Amburgey, who came from the same doctoral program as Jacques, brings the quantitative fire; Dr Khawaja, the qualitative. Jacques has responded to each of them.

The absurdity of Delacroix’s argument speaks for itself. I will come back to it shortly, but first I want to address a couple of his points that are simply made in bad faith. Observe:

(Yes, Mohammed did behead every man of a vanquished enemy tribe on the battlefield. Incidentally, they were Jews. The Prophet then “married ” their wives, he raped them, in others words. Bad example? Talk about this genuine part of Muslim tradition?)

Murdering and raping Jews is a “Muslim tradition”? I am sure this is news to Uighurs in China and the Javanese of Indonesia. I think there is a good case to be made for a present-day Arab cultural chauvinism that rests in part on what could be called “Muslim tradition,” but this is not a nuance that Jacques – the retired college professor – cares to address. I wonder why. If we’re going to go back to the 7th century to find cultural defects, can anybody think of something nasty that was going on in what is now France at the time? In what is now the US? What an odd historical anecdote to include in an argument.

Here, too, is another whopper:

One article of faith among literalist Muslims is that government must come from God. That’s why the Supreme Leader of the Shiite Islamic Republic is explicitly a cleric, couldn’t be an elected civilian or a general. This belief also explains the search for a Caliphate among Sunni jihadists, a polity where administrative and religious powers are one and the same.

What is a “literalist Muslim”? Never mind. The government of Iran, its structure, is based on Plato’s Republic, not the Qur’an. The “Supreme Leader” Jacques identifies is based on the notion of a philosopher-king, not a Shiite cleric. This was done to protect the new dictatorship from its many enemies, including those loyal to the old dictatorship (the one supported by the United States; the one that Washington installed after helping to remove a democratically elected Leftist government during the Cold War). The rhetoric of the Iranian dictatorship is explicitly religious, but in reality it’s just plain, old-fashioned despotism.

In a similar vein, “Sunni jihadists” (to use Jacques’ term) do not search for a Caliphate because of a belief that government should come from God, but instead look to a mythical Caliphate that they believe existed from the 7th to early 20th centuries as inspiration for creating a society that cannot be pushed around by murderous Western governments. Pretending that Arab Sunnis want to create a Caliphate in order to strengthen the link between government and God can only be described as “dishonest” when it comes from the mouth of a sociologist with a doctorate from Stanford.

At best, it could be argued that Jacques is simply making these types of points because they are pervasive throughout American society, and thus we – as libertarians of all stripes – have our work cut out for us. Now that I think about it, Jacques’ argument is so silly that it has to be an exercise in critical thinking. Nobody of his stature could say something so stupid, right?

Those are just two examples of, uh, the misrepresentation of reality by Jacques. There are many more, and I don’t think he got those myths from an academic journal. He got them from Fox News. That’s not good. That means libertarians are not taking advantage of their right to free speech, like conservatives and Leftists do. Why aren’t you blogging more often?

I’d like to turn back to his policy proposals. Here they are again:

  1. an armed, perpetually-on-alert military on US soil,
  2. a surveillance state that can do as it pleases with regard to Muslims only, and
  3. ideological quotas for Muslim immigrants.

The first two proposals look like they were copied directly from the playbook of the Third Reich (I hope you’ll reprimand me in the ‘comments’ section if you think I am being overly dramatic, or strawmanning Jacques’ argument). Just replace “US” with “Germany” and “Muslims” with “Jews” and voila, you have an answer for your Muslim (“Jewish”) problem. (RE Policy #3: National socialists, of course, don’t like anybody immigrating to their territories, whereas Jacques, in his infinite kindness and wisdom, seeks only to allow those who think like him into his territory.)

Again, Jacques’ argument is silly. It is both vulgar and unintelligent. It is misinformed. And yet I have to ask: Who is winning the PR battle here, conservatives on the one side or left-liberals and libertarians on the other?

Everyone carries a part of society on his shoulders; no one is relieved of his share of responsibility by others. And no one can find a safe way out for himself if society is sweeping toward destruction. Therefore, everyone, in his own interests, must thrust himself vigorously into the intellectual battle. None can stand aside with unconcern; the interest of everyone hangs on the result. Whether he chooses or not, every man is drawn into the great historical struggle, the decisive battle into which our epoch has plunged us.

That’s from Ludwig von Mises, the libertarian Austrian (and Jewish) economist who had to flee his homeland as the Third Reich took power. Speak your mind!

Reading the Laws, Part 4

If you haven’t been following along with the series, you can find the last three entries here:

Part One
Part Two
Part Three

***

I recently began reading Joseph Campbell’s well-known work, The Hero with a Thousand Faces, and in the section concerning the challenges of the hero, he uses the example of Theseus and the Minotaur. I instantly thought back to my first entry in this reader’s diary, for the characters of the dialogue are all on their way to the Temple of Zeus, in mimicry of the journey of King Minos, who would go there once every nine years to propitiate the sky god for his aid.

I will quote what I said initially:

“They [the three participants in the dialogue] find each other as strangers on the road to Knossos, where they are all heading to the temple of Zeus for some religious function. The Athenian suggests a discourse, befitting their age and mental alacrity, on the nature of law. Aping the pilgrimage of the mythic king Minos, who would travel every nine years to this very shrine for the purpose of receiving instruction from Zeus on the law, the other two heartily agree to the suggestion.

Plato’s first invocation, and the setting of his dialogue, readily complement each other. The first asks whether the law comes from man or from a god, while the second seemingly answers in favor of the gods, set as it is in direct apposition with Minos’ nine year journey to Zeus himself. Law, and all its attendant meanings, seems to spring from divine reason rather than human craftsmanship.”

Any serious student of literature could make that assessment, as it requires no previous knowledge about Greek culture, mythology, and history. The ability to make inferences, hopefully a faculty shared by all people, is sufficient. Coupling the plain interpretation of the passage with some concrete knowledge about the Ancient Greek cultural milieu will add further depth. I quote now from Campbell:

“…the king of the south Indian province of Quilacare, at the completion of the twelfth year of his reign, on a day of solemn festival, had a wooden scaffolding constructed, and spread over with hangings of silk. When he had ritually bathed in a tank, with great ceremonies and to the sound of music, he then came to the temple, where he did worship before the divinity. Thereafter, he mounted the scaffolding and, before the people, took some very sharp knives and began to cut off his own nose, and then his ears, and his lips, and all his members, and as much of his flesh as he was able. He threw it away and round about, until so much of his blood was spilled that he began to faint, whereupon he summarily cut his throat.

This is the sacrifice that King Minos refused when he withheld the bull from Poseidon. As Frazer has shown, ritual regicide was a general tradition in the ancient world. “In Southern India,” he writes, “the king’s reign and life terminated with the revolution of the planet Jupiter round the sun. In Greece, on the other hand, the king’s fate seems to have hung in the balance at the end of every eight years . . . Without being unduly rash we may surmise that the tribute of seven youths and seven maidens whom the Athenians were bound to send to Minos every eight years had some connexion with the renewal of the king’s power for another octennial cycle” (ibid., p. 280). The bull sacrifice required of King Minos implied that he would sacrifice himself, according to the pattern of the inherited tradition, at the close of his eight-year term. But he seems to have offered, instead, the substitute of the Athenian youths and maidens. That perhaps is how the divine Minos became the monster Minotaur…”

Whereas the king’s journey to the Temple of Zeus, in connection with the theme of Plato’s dialogue, connects kingship, divinity, and the law thematically for the reader, this passage from Campbell offers a related but different perspective: kingship is not a transmission of divine decrees into a phenomenal space, but is itself bound by a deeper law, a primordial order tied to the revolutions of the planets and the bloody desires of the gods. Zeus did not just give Minos the law, but also a limited time to enforce it, for it was dependent on his devotion, and eventual demise. By subverting the will of the divinity in diverting the sacrifice from its typical victim, the regent, to a novel set of seven youths and maidens, the king invites calamity: his wife copulates with the bull of Poseidon, bearing the Minotaur, which, as Frazer and Campbell argue, is really the personification of the bloodlust of the king himself, effaced over time by myth.

The law cannot be deceived, and it always exacts its due, for it is not the enforcement of human decrees, but an element of the fabric of the universe. The word of Zeus is binding because Zeus is the pillar of existence and the basis of all, “the first and last, one royal body, containing fire, water, earth, and air, night and day, Metis and Eros. The sky is his head, the stars his hair, the sun and moon his eyes, the air his nous, whereby he hears and marks all things,” in the words of an Orphic hymn (quotation from A. Wayman, “The Human Body as a Microcosm,” History of Religions, Vol. 22, No. 2 (Nov., 1982), p. 174). His words bear the weight of physical laws, which balk at defiance. If Plato had this myth in mind when he wrote the Laws, then his setting of the dialogue in such a place, at such a time, gives a different interpretation of his view of the law: law is the capricious will of the gods, binding us with the finality of the law of gravity; the pious uphold it and prosper, the wicked flout it and suffer, and its moral dictums are always enforced in the end, even when defied.

There is little to indicate that this was Plato’s intention. Though some Greeks moved into a sort of philosophical monotheism based on the preeminence of Zeus – see Stoic cosmology, for one example – belief in the entire Olympian pantheon was still widespread. If the law is the will of the gods, we must ask first, whose will? All of the gods, or just one, the father, Zeus? Surely all of them, since the poets always depicted the gods as bickering over their own spheres of influence, and with significant power endowed in each. But if all of them, how can their wills be the basis of law? For did not Poseidon support the Achaeans at Ilium, while Apollo took the side of noble Hektor? Plato went over these questions himself in his dialogue Euthyphro, which indicates to me that he would not endorse a viewpoint he ringingly denounced through his mouthpiece, Socrates, in his earlier dialogue. Furthermore, as I have pointed out earlier, it seems to vitiate the whole point of this book of the dialogue, which is to examine different systems of law and define the basis of good laws.

Despite this, there is something to go on here. As Plato’s Athenian has earlier argued, the old should be the only ones with the prerogative to discuss the laws, their bases, their validity, their use, while the young must be enjoined only to obey, lest respect for the law as an institution fail to solidify in them. Edmund Burke has a similar argument in his Reflections. From page 29:

“Always acting as if in the presence of canonized forefathers, the spirit of freedom, leading in itself to misrule and excess, is tempered with an awful gravity. This idea of a liberal descent inspires us with a sense of habitual native dignity which prevents that upstart insolence almost inevitably adhering to and disgracing those who are the first acquirers of any distinction. By this means our liberty becomes a noble freedom. It carries an imposing and majestic aspect. It has a pedigree and illustrating ancestors. It has its bearings and its ensigns armorial. It has its gallery of portraits, its monumental inscriptions, its records, evidences, and titles. We procure reverence to our civil institutions on the principle upon which nature teaches us to revere individual men: on account of their age and on account of those from whom they are descended. All your sophisters cannot produce anything better adapted to preserve a rational and manly freedom than the course that we have pursued, who have chosen our nature rather than our speculations, our breasts rather than our inventions, for the great conservatories and magazines of our rights and privileges.”

Burke argues for creating a civic religion around the institutions of the laws, which attains its respect from its age and pedigree, in the same way an old man garners respect by virtue of his advanced age. On my foregoing interpretation of Plato, tying the law to a far more awful and terrifying source than human judgment gives it greater security, for there is not just the fear of man’s retribution, but also god’s. Both these thinkers operate on the premise that the law does not necessarily attain respect from its goodness or its appropriateness. These are objective aspects, which can only be comprehended by an intellect habituated to high and lofty topics, and not by the vulgus, which is naturally stupid, and has no capacity for understanding such things. Better to inspire such people, always the bulk of a society, through the antiquity of the law, to awe them with its heraldry and trappings, to set them to quaking with the terrible countenance of the statue of Zeus, in the greatness of its size and the scale of its construction a visible reminder of the awesome power of the god, and of his vengeance.

Friends of Liberty and Friends of Montaigne II: Marie de Gournay (Expanding the Liberty Canon series)

Marie Le Jars de Gournay (1565-1646) was a minor aristocrat from Sancerre in central France who became a leading scholar and writer of her time, and an important advocate of women’s liberty, both through her scholarly career, pursued against the dismissive attitude of powerful men of the time, and through her writing in favour of equality between men and women. She was a friend of Michel de Montaigne, one of the great historical advocates of liberty, if in a rather enigmatic manner, and he even treated her as an adoptive daughter. After Montaigne’s death, she lived on the Montaigne estate as a guest of the family while preparing the third edition of Montaigne’s Essays, itself a contribution to the history of thought, and of thinking about liberty.

Gournay’s work in the transmission of Montaigne’s thought is, though, just one episode in a life of writing covering translations of the classics, literary compositions, and essays. Two essays in particular mark important moments in the case for liberty to apply equally between the two sexes: The Ladies’ Complaint and Equality of Men and Women. In these brief but rich texts, Gournay argues that there can be no liberty where goods are denied; since women have been deprived of the good of equal esteem, they are denied liberty.

She points to the frequency and intensity of the denial of equal esteem to women, and contests it through examples in which women have been esteemed, or in which women have performed great deeds on a level with great men. The argument is very much that of a Renaissance Humanist, that is, someone educated in the languages, history, and literature of antiquity as great expressions of the human spirit, on the assumption that these are its greatest expressions. Literary, intellectual, and statecraft greatness in modern languages, modern thought, and modern states is possible where it continues from the classical tradition. Since the emphasis is on pagan classical antiquity, the Humanists to some degree placed humanity above Christian theological tradition, though some Christians were also Humanists, and secular Humanist achievements to some degree interacted with scholarship on the Hebrew and Greek languages of the Bible, along with the Greek and Latin used by church thinkers.

Gournay’s concerns are largely secular, but she does deal with the place of women in the Bible. For the Hebrew Bible (Old Testament), she points out that if the Queen of Sheba (often thought to refer to an ancient queen of Yemen, or possibly Sudan) visited King Solomon because she knew of his great wisdom, then she too must have had an interest in wisdom, and some high level of scholarship, learning, and intellectual work herself.

With regard to the New Testament, she comments on St Paul’s injunction in his Epistles that women be silent in church and not take the role of priest. Gournay argues that Paul was not writing out of contempt for women, but out of fear that men would be distracted and tempted by women speaking out in church services, whether as part of the congregation or as priests. The limitation on the role of women is not therefore based on beliefs about the supposed inferiority of women, but on control of male desire.

On the role of women in the Bible, Gournay argues that in general we should not argue that it supports an inferior role for women, given that God created both men and women in the beginning, and given that men are commanded to leave their parents in order to find a wife. The connection between man and woman, and the idea that a man’s life is completed by association with a woman, is the main message of Christian scripture for Gournay.

Looking at the more secular aspects of Greek and Roman antiquity, Gournay deals with philosophical and historical concerns. On the philosophical side she notes the importance that Plato gives to the priestess Diotima (unknown outside Plato’s writings) in his dialogue The Symposium, which appears to recount conversations about love at a dinner and drinking party in Athens attended by some of the leading people of the time.

Plato shows Socrates presenting the views of Diotima as the correct ones on love, and Socrates, the teacher of Plato, always appears in Plato’s dialogues as a representative of truth. So, Gournay points out, it must be conceded that Plato claims that his ideas, and those of Socrates, are in some degree dependent on the thought of women of their time. In that case, Aristotle made himself absurd when he claimed that women were defective and inferior, since he was the student of Plato and therefore was in some way formed by ideas that Plato said came from Diotima.

Plato’s student Aristotle may have claimed women were inferior by nature to men, but Antisthenes, a follower of Socrates, regarded women and men as equal in virtue. Gournay also refers to the tradition according to which Aspasia, the female companion of the Athenian democratic leader Pericles (admired by Plato and Aristotle, though they did not share his democratic principles), was a scholar and thinker of the time. There is a lack of contemporary sources confirming this view, but this applies to much about the ancient world, so Gournay’s suggestions about Aspasia are as strongly founded as many claims about antiquity, and the investigation of tradition is itself an important part of any kind of intellectual history.

Moving on to Roman historiography, Gournay points out the role taken by women in the tribes of Germany and Gaul, according to Tacitus. Women served as judges of disputes and as battlefield participants, inciting male warriors to fight fiercely. So she can point to a revered classic source which suggests that women had roles in ancient France and Germany denied to them in those countries in early modern times. In general, as she points out, the ancients often referred to a tribe of female warriors known as Amazons, which may have some historical origin in Scythian tribes from north of the Black Sea.

Gournay uses her formidable Humanist learning to demonstrate the ways in which equality between men and women had been recognised in the ancient past, at least on some occasions and in some places. Showing that women have been recognised as equal to men in some contexts is evidence that the lower status of women in many societies is a result of socially embedded prejudices rather than any difference in abilities. As Gournay notes, rectifying the denial of rights to women is part of the basis for real, enduring liberty.