Atomistic? Moi?

I have written a brief paper entitled ‘Hayek: Postatomic Liberal’ intended for a collection on anti-rationalist thinkers. For the time being, the draft is available from SSRN and academia.edu. Here are a couple of snippets:

Hayek offers a way of fighting the monster of Rationalism while avoiding becoming an inscrutable monster oneself. The crucial move, and in this he follows Hume, is to recognize the non-rational origins of most social institutions, but to treat this neither as grounds for dismissing those institutions as unsound, nor as an excuse to retreat from reason altogether. Indeed, reason itself has non-rational, emergent origins but is nevertheless a marvelous feature of humanity. Anti-rationalist themes that appear throughout Hayek’s work include: an emphasis on learning by processes of discovery, trial and error, feedback and adaptation rather than knowing by abstract theorizing; and the notion that the internal processes by which we come to a particular belief or decision are more complex than either a scientific experimenter or our own selves in introspection can know. We are always, on some level, a mystery even to ourselves…

Viewed from Cartesian assumptions of atomistic individualism, this account can seem solipsistic. When we are in the mode of thinking of ourselves essentially as separate minds that relate to others through interactions in a material world, then it feels important that we share that world and are capable of clear communication about it and ourselves in order to share a genuine connection with others. Otherwise, we are each in our separate worlds of illusion. From a Hayekian skeptical standpoint, the mind’s eye can seem to be a narrow slit through which shadows of an external world make shallow, distorted impressions on a remote psyche. Fortunately, this is not the implication once we dispose of the supposedly foundational subject/object distinction. We can recognize subjecthood as an abstract category, a product of a philosophy laden with abstruse theological baggage… During most of our everyday experience, when we are not primed to be so self-conscious and self-centered, the phenomenal experience of ourselves and the environment is more continuous, flowing and irreducibly social in the sense that the categories that we use for interacting with the world are constituted and remade through interactions with many other minds.

Nightcap

  1. Tianxia: a philosophy for world governance Salvatore Babones, Asian Review of Books
  2. Imperialism or federalism? Round Two Notes On Liberty
  3. A new history of the United States Julio Ortega, New York Times
  4. Postmodern politics Chris Dillow, Stumbling & Mumbling

Asking questions about women in the academy

Doing the economist’s job well, Nobel Laureate Paul Romer once quipped, “means disagreeing openly when someone makes an assertion that seems wrong.”

Following this inspirational guideline in the constrained, hostile, and fairly anti-intellectual environment that is Twitter sometimes leads one astray. That the modern intellectual left is vicious we all know, even if only through observing them from afar. Accidentally engaging with them over the last twenty-four hours provided some hands-on experience for which I’m not sure I’m grateful. Admittedly, most interactions on Twitter lose all nuance, and (un)intentionally inflammatory tweets spin off even more anger from the opposite tribe. However, this episode was still pretty interesting.

It started with Noah Smith’s shout-out for economic history. Instead of taking the win for our often neglected and ignored field, some twitterstorians objected to the small number of women scholars highlighted in Noah’s piece. Fair enough: Noah did neglect a number of top economic historians (many of them women), as any brief and necessarily incomplete overview of a field would.

His omission raised a question I’ve been hooked on for a while: why are the authors of the most important publications in my subfields (financial history, banking history, central banking) almost exclusively male?

Maybe, I offered tongue-in-cheek in the exaggerated language of Twitter, because the contributions of women aren’t good enough…?

Being the twenty-first century – and Twitter – this obviously meant “women are inferior – he’s a heretic! GET HIM!”. And so it began: diversity is important in its own right; there are scholarly entry gates guarded by men; your judgment of what’s important is subjective, duped, and oppressive; what I happen to care about “is socially conditioned” and so cannot be trusted; indeed, there is no objectivity and all scholarly contributions are equally valuable.

Now, most of this is just standard postmodern relativism that I couldn’t care less about (though I am curious as to how the acolytes of this religion came to their supreme knowledge of the world, given that all information and judgments are socially conditioned – the attentive reader recognises the revival of Historical Materialism here). But the “unequal” outcome is worthy of attention, principally the questions of where to place the blame and what remedies might prove effective.

On a first-pass analysis we would ask about the sample. Is it really a reflection of gender oppression and sexist bias when the (top) outcomes in a field do not conform to 50:50 gender ratios? Of course not. There are countless, perfectly reasonable explanations, from a hangover from decades past (when that indeed was the case), to the Greater Male Variability hypothesis, to the possibility that women – for whatever reason – have been disproportionately interested in some fields rather than others, leaving those others to be annoyingly male.

  • If we believe that revolutionary, top academic contributions have a long production line – meaning that today’s composition of academics is determined by the composition of bright students, say, 30-40 years ago – we should not be surprised that the top 5% (or 10%, or whatever) of current academic output is predominantly male. Indeed, there have been many more male academics, for longer periods of time: chances are they would have produced the best work.
  • If we believe the Greater Male Variability hypothesis, we can model even a perfectly unbiased, equal-opportunity setting for men and women and still end up with the top contributions belonging to men. If higher-value research requires smarter people working harder, and those characteristics are more widely dispersed among men than among women (which is what the Greater Male Variability hypothesis suggests), then it follows naturally that most top contributions would come from men.
  • In an extension of the insight above, it may be the case that women – for entirely non-malevolent reasons – have interests that diverge from men’s (establishing precise reasons would be a task for psychology and evolutionary biology, for which I’m highly unqualified). Indeed, this is the entire foundation on which the value of diversity is argued: women (or other identity groups) have different enriching experiences, approach problems differently and can thus uncover research nobody thought to look at. If this is true, then why would we expect that superpower to be applied equally across all fields simultaneously? No, indeed, we’d expect to see some fields or some regions or some parts of society dominated by women before others, leaving other fields to be overwhelmingly male. Indeed, any society that values individual choice will unavoidably see differences in participation rates, academic outcomes and performance for precisely such individual-choice reasons.
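The variability point in the bullets above is easy to make concrete with a quick simulation. This is a hedged illustration, not a claim about real data: the group labels, sample size, means, and standard deviations below are all made-up assumptions. It shows only that two groups with the same average but different spread yield a top tail dominated by the higher-variance group.

```python
import random

random.seed(42)

N = 100_000  # individuals per group (illustrative)

# Same mean for both groups; group B merely has a wider spread,
# which is all the Greater Male Variability hypothesis posits.
group_a = [random.gauss(100, 10) for _ in range(N)]
group_b = [random.gauss(100, 15) for _ in range(N)]

# Pool everyone and take the top 1% of outcomes.
pooled = [(x, "A") for x in group_a] + [(x, "B") for x in group_b]
pooled.sort(key=lambda t: t[0], reverse=True)
top = pooled[: len(pooled) // 100]

share_b = sum(1 for _, g in top if g == "B") / len(top)
print(f"Share of top 1% from the higher-variance group: {share_b:.2f}")
```

With these invented parameters the higher-variance group takes the large majority of the top percentile despite identical averages: exactly the “unbiased setting, unequal tails” possibility described above.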

Note that none of this excludes the possibility of spiteful sexist oppression, but it means that judgments of academic participation based on survey responses, or on the fact that only 2 of the 11 economic historians cited in an op-ed were women, may be premature indeed.

Nightcap

  1. A libertarian case for postmodernism Candice Holdsworth, Spiked
  2. South Sudan and wealthy LA enclaves have same vaccination rate Olga Khazan, The Atlantic
  3. The rise of Turkey’s new ultranationalism Burak Kadercan, War on the Rocks
  4. Past Masters of the Postmodern Simon Blackburn, Inference

Jordan Peterson’s Ignorance of Postmodern Philosophy

Up until this point, I’ve avoided talking about Jordan Peterson in any serious manner, in part because I thought (and continue to hope) that he’s the intellectual version of a fad diet who will shortly become irrelevant. My general impression of him is that when he’s right, he’s saying rather banal, cliché truisms with an undeserved bombastic air of profundity, such as his assertions that there are biological differences between men and women, that many religious myths share some similar features, or that taking personal responsibility is good. When he’s wrong, he’s talking way out of his depth, far outside his own field (like the infamous lobster comment or this bizarre nonsense). Either way, it doesn’t make for a particularly good use of time or an opportunity for interesting, productive discussion—especially when his galaxy-brained, cult-like fanboys are ready to pounce on anyone who criticizes their dear leader.

However, since everyone seems to be as obsessed with Jordan Peterson as he is with himself, I guess it’s finally time to talk about one example of him ignorantly bloviating that particularly annoys me as a philosophy student: his comments on postmodernism. There’s a lot one can talk about with Jordan Peterson because he says almost anything that comes to his mind about any topic, but for present purposes you can pretend that I think everything he’s ever said that isn’t about postmodernism is the deepest, most insightful thing ever said by any thinker in the history of western thought. I’m not interested in defending any overarching claims about him as a thinker. At the very least, his work on personality psychology does seem rather well respected, and he surely got to his prestigious academic position with some merit, though I am not qualified to really appraise it. I am, however, more prepared to talk about his rather confused comments on philosophy, which might shed light on why people are generally frustrated with his overly self-confident presence as a public intellectual.

Postmodernism, According to Peterson

Peterson often makes comments about “postmodern neo-Marxism,” which he calls a “rejection of the western tradition.” Now the very phrase “postmodern neo-Marxism” strikes anyone remotely familiar with the academic literature on postmodernism and Marxism as bizarre and confused. Postmodernism is usually characterized as skepticism towards grand general theories. Marxism is a grand general theory about how class struggle and economic conditions shape the trajectory of history. Clearly, those two views are not at all compatible. As such, much of the history of twentieth century academia is a history of Marxists and postmodernists fighting and butting heads.

Many commentators have pointed out this error, but Jordan Peterson now has a response. In it he tries to offer a more refined definition of postmodernism as two primary claims and a secondary claim:

Postmodernism is essentially the claim that (1) since there are an innumerable number of ways in which the world can be interpreted and perceived (and those are tightly associated) then (2) no canonical manner of interpretation can be reliably derived.

That’s the fundamental claim. An immediate secondary claim (and this is where the Marxism emerges) is something like “since no canonical manner of interpretation can be reliably derived, all interpretation variants are best interpreted as the struggle for different forms of power.”

He then goes on to concede to the criticism that Marxism and postmodernism can’t be described as theoretically aligned, but moves the goalposts to say that they are practically aligned in politics. Further, he contends postmodernism’s commitment to analyzing power structures is just “a rehashing of the Marxist claim of eternal and primary class warfare.”

It is worth noting that this attempt at nuance is surely an improvement on Peterson’s previous comments that postmodernism and Marxism form a coherent “doctrine” that just hates logic and western values. But his attempt at a “definition” is still way too restrictive to cover every thinker who gets called “postmodern,” and the attempt to link the politics of postmodernism up with the politics of Marxism is a complete mischaracterization. Further, his attempt to “critique” this position, whatever one wants to call it, is either (at best) vague and imprecise or (at worst) an utter failure. Finally, there really is no alliance between postmodernists and Marxists. Whether or not a thinker is called a “postmodernist” is not a very good predictor of their political views.

Why Peterson’s Definition isn’t what Postmodernists Believe and his Critique Fails

First of all, I am really not interested in dying on the hill of offering a better “definition” of postmodernism. Like any good Wittgensteinian, I tend to think you can’t really give a good list of necessary and sufficient conditions that perfectly captures all the subtle ways we use a word. The meaning of the word is the way it is used. Even within academia postmodernism has such broad, varied usage that I’m not sure it has a coherent meaning. Indeed, Foucault once remarked in a 1983 interview when asked about postmodernism, “What are we calling postmodernity? I’m not up to date.” The best I can give is Lyotard’s classic “incredulity toward metanarratives,” which is rather vague and oversimplified. Because this is the best I think one can do given how wildly unpredictable the usage of postmodernism is, we’re probably better off just not putting too much stock in it either as one’s own philosophical position or as the biggest existential threat to western civilization and we should talk about more substantive philosophical disagreements.

That said, Jordan Peterson’s definition is unsatisfactory and shows a poor understanding of postmodernism. While the first half of the fundamental claim is a pretty good stab at generalizing a view most philosophers who get labeled as postmodern agree with, the second half is rather unclear, since it’s uncertain what Peterson means by “canonical.” If he takes this to mean that we have no determinate procedure for deciding which interpretations are valid, then that would be a good summary of most postmodernists, and an implication of Peterson’s own professed Jamesian pragmatism. If what he thinks it means is that all perspectives are as valid as any other and we have no way of deciphering which ones are better, then nobody relevant believes that.

What Peterson objects to is the implication “that there are an unspecifiable number of VALID interpretations.” He tries to refute this by citing Charles Peirce (who actually did not at all hold this view) and William James on the pragmatic criterion of truth to give meaning to “valid interpretations.” He says valid means “when the proposition or interpretation is acted out in the world, the desired outcome within the specific timeframe ensues.” However, it doesn’t follow from this view that you can specify the number of valid interpretations. It just raises the question of how we should understand “the desired outcome,” which puts the perspectivism back a level. Even if we did agree on a determinate “desired outcome,” there are still multiple beliefs one could hold to achieve it. To put it in a pragmatically-minded cliché, there is more than one way to skin a cat. This is why, in fact, William James was a pluralist.

Perhaps by “specifiable,” he doesn’t mean we can readily quantify the number of valid interpretations, just that the number is not infinite. However, nobody believes there is an infinite number of valid perspectives we should consider. The assertion that we cannot quantify the number of valid perspectives a priori does not mean that all perspectives are equally valid, or that there is an infinite number of valid perspectives. Peterson’s argument that we have limited cognitive capacities to consider all possible perspectives is true; it’s just not a refutation of anything postmodernists believe. On this point, it is worth quoting Richard Rorty—one who was both a Jamesian pragmatist and usually gets called postmodern—from Philosophy and the Mirror of Nature:

When it is said, for example, that coherentist or pragmatic “theories of truth” allow for the possibility that many incompatible theories would satisfy the conditions set for “the truth,” the coherentist or pragmatist usually replies this merely shows that we have no grounds for choice among these candidates for “the truth.” The moral to draw is not to say they have offered inadequate analyses of “true,” but that there are some terms—for example, “the true theory,” “the right thing to do”—which are, intuitively and grammatically singular, but for which no set of necessary and sufficient conditions can be given which will pick out a unique referent. This fact, they say, should not be surprising. Nobody thinks that there are necessary and sufficient conditions which will pick out, for example, the unique referent of “the best thing for her to have done on finding herself in that rather embarrassing situation,” though plausible conditions can be given as to which will shorten a list of competing incompatible candidates. Why should it be any different for the referents of “what she should have done in that ghastly moral dilemma” or “the Good Life for man” or “what the world is really made of?” [Emphasis mine]

The fact that we cannot readily quantify a limited number of candidate interpretations, or decide between them algorithmically, does not mean that we have absolutely no way to tell which interpretation is valid, that all interpretations are equally valid, nor that there is an infinite number of potentially valid interpretations. Really, the view that many (though not all) postmodernists actually hold under this “primary claim” is not all that substantially different from Peterson’s own Jamesian pragmatism.

As for the secondary claim, which he thinks is Marxist: “since no canonical manner of interpretation can be reliably derived, all interpretation variants are best interpreted as the struggle for different forms of power.” This view is basically just one Foucault might have held, depending on how you read him. Some would argue this isn’t even a good reading of Foucault, because such sweeping generalizations about “all interpretations” are rather uncharacteristic of a philosopher who’s skeptical of sweeping generalizations. However you read Foucault (and I’m not really prepared to take a strong stand either way), it certainly isn’t the view of all postmodernists. Rorty criticized this habit of Foucault’s (Contingency, Irony, and Solidarity, p. 63), and thought that even if power does shape modern subjectivity, it’s worth the tradeoff in the gains to freedom that modern liberalism has brought, and thus power is not the best lens through which to view modern society. It’s also telling that Peterson doesn’t even try to critique this claim and just dogmatically dismisses it.

Postmodernism’s Alleged Alliance with Marxism

So much for his vague, weak argument against a straw man. Now let’s see if there’s any merit to Peterson’s thought that Marxism and postmodernism have some important resemblance or philosophical alliance. Peterson says that the secondary claim of postmodernism is where the similarity to Marxism comes in. However, Marx simply did not think that all theories are just attempts to grab power in the Foucauldian sense: he didn’t think that dialectical materialism and the labor theory of value were just power grabs, and he predicted a day at the end of history when, in a classless communist society, there would be no competition for power in the first place. If anything, the secondary claim reflects the influence of Nietzsche’s Will to Power on Foucault; oddly enough, Peterson thinks rather highly of Nietzsche (even though Nietzsche anticipated postmodernism in rather important ways).

The only feature they share is a narrative of one group trying to dominate another. But if any attempt to describe oppression in society is somehow “Marxist,” then right-libertarians who talk about how the state and crony capitalists oppress and coerce the general public are “Marxist,” evangelicals who say Christians are oppressed by powerful liberal elites are “Marxist,” and even Jordan Peterson himself is a “Marxist” when he whines that postmodern Marxist boogeymen are trying to silence his free speech. He both defines “postmodernism” too narrowly and uses “Marxism” in such a loose manner that it basically means nothing.

Further, there’s Peterson’s claim that, due to identity politics, postmodernists and Marxists now have a practical political alliance even if it’s theoretically illogical. The only evidence he really gives of this alleged “alliance” is that Derrida and Foucault were Marxists when they were younger who “barely repented” from Marxism, and that courses like critical theory and gender studies read Marxists and postmodernists. That they barely repented is simply a lie; Foucault left all his associations with Marxist parties and expunged his earlier works of Marxist themes. But the mere fact that someone once was a Marxist and then criticized Marxism later in life doesn’t mean there is a continuing alliance between believers in their thought and Marxism. Alasdair MacIntyre was influenced by Marx when he was young and later became a Catholic neo-Aristotelian; nobody thinks that he “barely repented” or that there’s some overarching alliance between traditionalist Aristotelians and Marxists.

As for the claim that postmodernists and Marxists are read in gender studies, it’s just absurd to think that’s evidence of some menacing “practical alliance.” The reason they’re read in those courses is mostly to expose students to contrasting, opposing perspectives. This is like saying that because Rawlsian libertarians are taken seriously by academic political philosophers, there’s some massive political alliance between libertarians and progressive liberals.

Really, trying to connect postmodernism to any political ideology shows a laughably weak understanding of both postmodernism and political theory. You have postmodernists identifying as everything from far leftists (Foucault), to progressive liberals (Richard Rorty), to classical liberals (Deirdre McCloskey), to anarchists (Saul Newman), to religious conservatives (like Peter Blum and James K.A. Smith). Nor do they all buy identity politics uniformly; Richard Rorty criticized the left for focusing on identity issues over economic politics and was skeptical of the usefulness of a lot of critical theory. There really is no necessary connection between one’s highly theoretical views on epistemic justification, truth, and the usefulness of metaphysics or other metanarratives, and one’s more concrete views on culture or politics.

Now Peterson can claim all the people I’ve listed aren’t “really” postmodern and double down on his much narrower, idiosyncratic definition of postmodernism which has very little relation to the way anyone who knows philosophy uses it. Fine, that’s a trivial semantic debate I’m not really interested in having. But it does create a problem for him: he wants to claim that postmodernism is this pernicious, all-encompassing threat that has consumed all of the humanities and social sciences which hates western civilization. He then wants to define postmodernism so narrowly that it merely describes the views of basically just Foucault. He wants to have his cake and eat it too: define postmodernism narrowly to evade criticism that he’s using it loosely, and use it as a scare term for the entire modern left.

Peterson’s Other Miscellaneous Dismissals of Postmodernism

The rest of what he has to say about postmodernism is all absurd straw men with absolutely no basis in anything anyone has ever argued. He thinks postmodernists “don’t believe in logic” when, for example, Richard Rorty was an analytic philosopher who spent the early part of his career obsessed with the logic of language. He thinks they “don’t believe in dialogue” when Rorty’s whole aspiration was to turn all of society into one continuous dialogue and to reimagine philosophy as the “conversation of culture.” Or that they believe “you don’t have an individual identity” when K. Anthony Appiah, who encourages “banal ‘postmodernism’” about race, believes that the individual dimensions of identity are problematically superseded by the collective dimensions. This whole “definition and critique” of postmodernism is clearly just a post-hoc rationalization for him to continue to dishonestly straw-man all leftists with an absurd monolithic conspiracy theory. The only people playing “crooked games” or “neck-deep in deceit” are ignorant hucksters like Peterson, bloviating about topics they clearly know nothing about with absurd levels of unmerited confidence.

Really, it’s ironic that Peterson has such irrational antipathy towards postmodernism. A ton of the views he champions (a pragmatic theory of truth, a respect for Nietzsche’s use of genealogy, a naturalist emphasis on the continuity between animals and humans, etc.) are all views that are often called “postmodern” depending on how broadly one understands “incredulity towards metanarratives,” and at the very least were extremely influential over most postmodern philosophers and echoed in their work. Maybe if Peterson showed a fraction of the openness to dialogue and debate he dishonestly pretends to have and actually read postmodernists outside of a secondary source, he’d discover a lot to agree with.

[Editor’s note: The last line has been changed from an earlier version with an incorrect statement about Peterson’s source Explaining Postmodernism.]

The Intolerance of Tolerance

Just recently I read The Intolerance of Tolerance, by D. A. Carson. Carson is one of the best New Testament scholars around, but what he writes in this book (although written from a Christian perspective) has more to do with contemporary politics. His main point is that the concept of tolerance evolved over time to become something impossible to work with.

Being a New Testament scholar, Carson knows very well how language works, and especially how vocabulary accommodates different concepts over time. Not long ago, tolerance was meant to be a secondary virtue. We tolerate things that we don’t like, but that at the same time we don’t believe we should destroy. That was the case between different Protestant denominations early in American history: Baptists, Presbyterians, and so on decided to “agree to disagree,” even though they didn’t agree with one another completely. “I don’t agree with what you are saying, but I don’t think I have to shut you up. I’ll let you say and do the things I don’t like.” Eventually, the boundaries of tolerance were expanded to include Catholics, Jews, and all kinds of people and behaviors that we don’t agree with, but that we decide to tolerate.

The problem with tolerance today is that people want to make it a central value. You have to be tolerant. Period. But is that possible? Should we be tolerant of pedophilia? Murder? Genocide? Can the contemporary tolerant be tolerant of those who are not tolerant?

Postmodernism really made a mess. Postmodernism is very good as a necessary critique of the extremes of rationalism present in modernity. But in and by itself it only leads to nonsense. Once you don’t have anything firm to stand on, anything goes. Everything is relative. Except the claim that everything is relative; that’s absolute. Tolerance today goes the same way: I will not tolerate the fact that you are not tolerant.

The death of reason

“In so far as their only recourse to that world is through what they see and do, we may want to say that after a revolution scientists are responding to a different world.”

Thomas Kuhn, The Structure of Scientific Revolutions p. 111

I can remember arguing with my cousin right after Michael Brown was shot. “It’s still unclear what happened,” I said, “based solely on testimony” — at that point, we were still waiting on the federal autopsy report by the Department of Justice. He said that in the video, you can clearly see Brown, back to the officer and with his hands up, as he is shot up to eight times.

My cousin doesn’t like police. I’m more ambivalent, but I’ve studied criminal justice for a few years now, and I thought that if both of us watched this video (no such video actually existed), it was probably I who would have the more nuanced grasp of what happened. So I said: “Well, I will look up this video, try and get a less biased take and get back to you.” He replied, sarcastically, “You can’t watch it without bias. We all have biases.”

And that seems to be the sentiment of the times: bias encompasses the human experience; it subsumes all judgments and perceptions. Biases are so rampant, in fact, that no objective analysis is possible. These biases may be cognitive, like confirmation bias, emotional fallacies or the phenomenon of constructive memory; or inductive, like selectivity or ignoring base rates; or, as has been common to think, ingrained into experience itself.

The thing about biases is that they are open to psychological evaluation. There are precedents for eliminating them. For instance, one common explanation of racism is that familiarity breeds acceptance and unfamiliarity breeds intolerance (as Reason points out, people further from fracking sites have more negative opinions of the practice than people closer to them). So to curb racism (a sort of bias), children should interact with people outside their singular ethnic group. More clinical methodology seeks to transform mental functions from automatic to controlled, thereby entering reflective measures into perception and reducing bias. Apart from these, there is that ancient Greek practice of reasoning, wherein patterns and evidence are used to generate logical conclusions.

If it were true that human bias is all-encompassing and essentially insurmountable, the whole concept of critical thinking would go out the window. Not only would we lose the critical-rationalist, Popperian mode of discovery, but also Socratic dialectic, as essentially “higher truths” disappear from the human lexicon.

The belief that biases are intrinsic to human judgment ignores psychological and philosophical methods to counter prejudice, because it posits that objectivity itself is impossible. This viewpoint has been associated with “postmodern” schools of philosophy, such as those Dr. Rosi commented on (e.g., those of Derrida, Lacan, Foucault, Butler), although it’s worth pointing out that the analytic tradition, with its origins in Frege, Russell and Moore, represents a far greater break from the previous, modern tradition of Descartes and Kant, and often reached similar conclusions to the Continentals.

Although theorists of the “postmodern” clique produced diverse claims about knowledge, society, and politics, the most famous figures are nearly always associated with or incorporated into the political left. To make a useful simplification of viewpoints: it would seem that progressives have generally accepted Butlerian non-essentialism about gender and Foucauldian terminology (discourse and institutions). Derrida’s poststructuralist critique noted dichotomies and also claimed that the philosophical search for Logos has been patriarchal, almost neoreactionary. (The month before Donald Trump’s victory, searches for the word patriarchy hit an all-time high on Google.) It is not a far-right conspiracy that European philosophers with strange theories have influenced and sought to influence American society; it is patent in the new political language.

Some people think of the postmodernists as uniformly social constructivists, holding that many of the categories and identifications we use in the world are social constructs without a human-independent nature (e.g., not natural kinds). Disciplines like anthropology and sociology have long worked in these waters, and the broader academic community, too, now holds that things like gender and race are social constructs. But the ideas can and do go further: on this view, "facts" themselves are open to interpretation; to even assert a "fact" is just to affirm power of some sort. This worldview subsequently degrades the status of science into an extended apparatus for confirmation bias, filling out the details of a committed ideology rather than providing us with new facts about the world. There can be no objectivity outside of a worldview.

Even though philosophy took a naturalistic turn with the philosopher W. V. O. Quine, seeing itself as integrating with and working alongside science, the criticisms of science as an establishment that emerged in the 1950s and 60s (and earlier) often disturbed its unique epistemic privilege in society: ideas that theory is underdetermined by evidence, that scientific progress is nonrational, that unconfirmed auxiliary hypotheses are required to conduct experiments and form theories, and that social norms play a large role in the process of justification all damaged the mythos of science as an exemplar of human rationality.

But once we have dismantled Science, what do we do next? Some critics have held up Nazi German eugenics and phrenology as examples of the damage that science can do to society (never mind that we now consider them pseudoscience). Yet Lysenkoism and the history of astronomy and cosmology indicate that suppressing scientific discovery can be just as deleterious. The Austrian physicist and philosopher Paul Feyerabend instead wanted a free society, one where science had equal standing with older, more spiritual forms of knowledge. He thought the model of rational science exemplified by Karl Popper was inapplicable to the real machinery of scientific discovery, and that the only methodological rule we could impose on science was: "anything goes."

Feyerabend's views are almost a caricature of postmodernism, although he denied the label "relativist," opting instead for "philosophical Dadaist." In his pluralism there is no hierarchy of knowledge, and state power can even be introduced when necessary to break up scientific monopoly. Feyerabend, contra scientists like Richard Dawkins, thought that science resembled an organized religion, and therefore supported a separation of state and science to match the separation of church and state. Here, then, is a way forward for a society that has started distrusting the scientific method. But if this is what we should do post-science, it is still unclear how to proceed, and open questions remain for anyone who loathes the hegemony of science in the Western world.

For example, how does the investigation of crimes proceed without strict adherence to the latest scientific protocol? Presumably Feyerabend didn't want to privatize law enforcement, but science and the state are intricately connected. In 2005, Congress authorized the National Academy of Sciences to form a committee and conduct a comprehensive study of contemporary forensic science to identify community needs, consulting laboratory executives, medical examiners, coroners, anthropologists, entomologists, odontologists, and various legal experts. Forensic science, scientific procedure applied to the field of law, exists for two practical goals: exoneration and prosecution. However, the Forensic Science Committee revealed that severe issues riddle forensics (e.g., bite mark analysis), and the top priority in its list of recommendations is establishing an independent federal entity to devise consistent standards and enforce regular practice.

For top scientists, this sort of centralized authority seems necessary to produce reliable work, and it squarely contradicts Feyerabend's emphasis on methodological pluralism. Barack Obama formed the National Commission on Forensic Science in 2013 to further investigate problems in the field, and only recently Attorney General Jeff Sessions announced that the Department of Justice would not renew the commission. It's unclear now how forensic science will resolve its ongoing problems, but what is clear is that the American court system would fall apart without the possibility of appealing to scientific consensus (especially forensics), and that the only foreseeable way to solve the existing issues is through stricter methodology. (Just as with McDonald's, standards are enforced so that the product is consistent wherever one orders.) More on this later.

So it doesn't seem to be in the interest of things like due process to abandon science or completely separate it from state power. (It does, however, make sense to move forensic laboratories out from under direct administrative control, as the NAS report notes in Recommendation 4; but that is specifically to reduce bias.) In a culture where science is viewed as irrational, Eurocentric, ad hoc, and polluted with ideological motivations, or where Reason itself is seen as a hegemonic, imperial device to suppress other cultures, not only do we not know what to do; when we try to do things, we lose elements of our civilization that everyone agrees are valuable.

Although Aristotle separated pathos, ethos and logos (adding that each informs the others), later philosophers like Feyerabend thought of reason as a sort of "practice," with a history and connotations like any other human activity, falling far short of the sublime. One could no more justify reason outside of its European cosmology than the sacrificial rituals of the Aztecs outside of theirs. To communicate across paradigms, participants have to understand each other on a deep level, even becoming entirely new persons. When debates happen, they must happen on a principle of mutual respect and curiosity.

From this one can detect a bold argument for tolerance. Indeed, Feyerabend was heavily influenced by John Stuart Mill's On Liberty. Maybe, in a world disillusioned with scientism and objective standards, the next cultural move is multilateral acceptance of, and tolerance for, each other's ideas.

This has not been the result of postmodern revelations, though. The 2016 election featured the victory of one psychopath over another, from two camps utterly consumed with vitriol for each other. Between Bernie Sanders, Donald Trump and Hillary Clinton, Americans drifted toward radicalization as the only establishment candidate seemed to offer the same noxious, warmongering mess as the previous few decades of administrations. Politics has only polarized further since the inauguration. The alt-right, a nearly perfect symbol of cultural intolerance, is regular news for the mainstream media. Trump acolytes physically brawl with black bloc Antifa in the same city that birthed the 1960s Free Speech Movement. It seems to be worst at universities. Analytic feminist philosophers called for the retraction of a controversial paper, seemingly without reading it. Professors even get involved in student disputes, at Berkeley and more recently at Evergreen. The names each side uses to attack the other ("fascist," most prominently), sometimes accurate, usually not, display a political divide between groups that increasingly refuse to argue their own side and prefer silencing their opposition.

There is no longer a tolerant left or a tolerant right in the mainstream. We are witnessing only shades of authoritarianism, eager to destroy each other. And what is obvious is that the theories and tools of the postmodernists (post-structuralism, social constructivism, deconstruction, critical theory, relativism) are as useful for reactionary praxis as in their usual role in left-wing circles. Says Casey Williams in the New York Times: "Trump's playbook should be familiar to any student of critical theory and philosophy. It often feels like Trump has stolen our ideas and weaponized them." The idea of the "post-truth" world originated in postmodern academia. It is the monster turning against Doctor Frankenstein.

Moral (cultural) relativism in particular promises only the rejection of our shared humanity. It paralyzes our judgment on female genital mutilation, flogging, stoning, human and animal sacrifice, honor killing, caste, the underground sex trade. The afterbirth of Protagoras, cruelly resurrected once again, does not promise trials at Nuremberg, where the Allied powers appealed to something above and beyond written law to exact judgment on mass murderers. It does not promise justice for the ethnic cleansers of Srebrenica, as the United Nations is helpless to impose a tribunal from outside Bosnia-Herzegovina. Today, this moral pessimism laughs at the phrase "humanitarian crisis," and at Western efforts to change the material conditions of fleeing Iraqis, Afghans, Libyans, Syrians, Venezuelans, North Koreans…

In the absence of universal morality, and with the introduction of subjective reality, the vacuum will be filled by something much more awful. And we should be afraid of this, because tolerance has not emerged as a replacement. When Harry Potter first encounters Voldemort face-to-scalp, the Dark Lord tells the boy, "There is no good and evil. There is only power… and those too weak to seek it." With the breakdown of concrete moral categories, Feyerabend's motto, anything goes, is perverted. Voldemort has been compared to Plato's archetype of the tyrant from the Republic: "It will commit any foul murder, and there is no food it refuses to eat. In a word, it omits no act of folly or shamelessness" … "he is purged of self-discipline and is filled with self-imposed madness."

Voldemort is the Platonic appetite in the same way he is the psychoanalytic id. Freud's das Es admits of contradictions, violating Aristotle's fundamental laws of logic. It is so base, so removed from the ordinary world of reason, that it follows rules of its own that we would find utterly abhorrent or impossible. But it is not difficult to imagine that the murder of evidence-based reasoning will result in Death Eater politics. The ego is our rational faculty, adapted to deal with reality; with the death of reason, all that remains is vicious criticism and unfettered libertinism.

Plato predicts Voldemort with the image of the tyrant, and also with one of his primary interlocutors, Thrasymachus, when the sophist opens with "justice is nothing other than the advantage of the stronger." The one thing Voldemort admires about The Boy Who Lived is his bravery, the sole trait they share. This trait is missing in his Death Eaters. In the fourth novel the Dark Lord is cruel to his reunited followers for abandoning him and losing faith; their cowardice reveals the fundamental logic of his power: his disciples are not true devotees but opportunists, weak on their own merits and drawn like moths to every Avada Kedavra. Likewise, students flock to postmodern relativism to justify their own beliefs when the evidence is an obstacle.

Relativism gives us moral paralysis, letting in the darkness. Another possible move after relativism is supremacy. One look at Richard Spencer's Twitter demonstrates the incorrigible tenet of the alt-right: the alleged incompatibility of cultures, ethnicities, races; the claim that different groups of humans simply cannot get along. The Final Solution is no longer about extermination but about segregated nationalism. Spencer's audience is almost entirely men who loathe the current state of things, share far-reaching conspiracy theories, and despise globalism.

The left, too, creates conspiracies, imagining a bourgeois corporate conglomerate that enlists economists and brainwashes through history books to normalize capitalism; for this reason it despises globalism as well, saying it impoverishes other countries or destroys cultural autonomy. For the alt-right, it is the Jews, and George Soros, who control us; for the burgeoning socialist left, it is the elites, the one percent. Our minds are not free; fortunately, each side will happily supply Übermenschen, in the form of statesmen or critical theorists, to save us from our degeneracy or our false consciousness.

Without a commitment to reasoned debate, tribalism has continued the polarization and the erosion of humility. Each side also accepts science selectively, when it does not question its very justification. The privileged status that the "scientific method" maintains in polite society is denied when convenient: whether it is climate science, evolutionary psychology, sociology, genetics, biology, anatomy or, especially, economics, one side or the other rejects it outright, without studying the material enough to immerse itself in what could be promising knowledge (as Feyerabend urged, and as the breakdown of rationality could have encouraged). And ultimately, equal protection, one tenet of individualist thought that allows for multiplicity, is entirely rejected by both: we should be treated differently as humans, often because of the color of our skin.

Relativism and carelessness about standards and communication have given us supremacy and tribalism. They have divided rather than united. Voldemort's chaotic violence is one possible outcome of rejecting reason as an institution, and it beckons to either political alliance. Are there any examples in Harry Potter of the alternative, Feyerabendian tolerance? Not quite. However, Hermione Granger serves as the Dark Lord's foil, and gives us a model of reason that is not as archaic as the enemies of rationality would like to suggest. In Against Method (1975), Feyerabend compares different ways rationality has been interpreted alongside practice: an idealist way, in which reason "completely governs" research, and a naturalist way, in which reason is "completely determined by" research. Taking elements of each, he arrives at an interaction in which each can change the other, both "parts of a single dialectical process."

“The suggestion can be illustrated by the relation between a map and the adventures of a person using it or by the relation between an artisan and his instruments. Originally maps were constructed as images of and guides to reality and so, presumably, was reason. But maps, like reason, contain idealizations (Hecataeus of Miletus, for examples, imposed the general outlines of Anaximander’s cosmology on his account of the occupied world and represented continents by geometrical figures). The wanderer uses the map to find his way but he also corrects it as he proceeds, removing old idealizations and introducing new ones. Using the map no matter what will soon get him into trouble. But it is better to have maps than to proceed without them. In the same way, the example says, reason without the guidance of a practice will lead us astray while a practice is vastly improved by the addition of reason.” p. 233

Christopher Hitchens pointed out that Granger sounds like Bertrand Russell at times, as in this quote about the Resurrection Stone: "You can claim that anything is real if the only basis for believing in it is that nobody has proven it doesn't exist." Granger is often the embodiment of anemic analytic philosophy, the institution of order, a disciple for the Ministry of Magic. However, though initially law-abiding, she quickly learns with Potter and Weasley the pleasures of rule-breaking. From the first book onward, she is constantly at odds with the de facto norms of the school, becoming more rebellious as time goes on. It is her levelheaded foundation, combined with her ability to transgress rules, that gives her an astute semi-deontological, semi-utilitarian calculus capable of saving the lives of her friends from the dark arts, and of helping to defeat the tyranny of Voldemort foretold by Plato.

Granger presents a model of reason like Feyerabend's map analogy. Although pure reason gives us an outline of how to think about things, it is not a static or complete blueprint; it must be fleshed out with experience, risk-taking, discovery, failure, loss, trauma, pleasure, offense, criticism, and occasional transgressions past the foreseeable limits. Folding these addenda into our heuristics gives us a more diverse account of thinking about, and moving around in, the world.

When reason is increasingly seen as patriarchal, Western, and imperialist, the only thing consistently offered as a replacement is something like lived experience. Some form of this idea is at least a century old, going back to Husserl, and still modest by reason's Greco-Roman standards. Yet lived experience has always been pivotal to reason; we need only adjust our popular model, and we can see that we need not reject either one entirely. Another critique says reason is foolhardy, limiting, antiquated; but this describes a perversion of its abilities, and serves mainly to justify the first criticism. There is room within reason for other pursuits and virtues, picked up along the way.

The emphasis on lived experience, which comes predominantly from the political left, is also antithetical to the cause of "social progress." Those sympathetic to social theory, particularly the cultural leakage of the strong programme, are constantly torn between claiming (a) that science is irrational, and can thus be countered by lived experience (or whatnot), or (b) that science may be rational but reason itself is a tool of patriarchy and white supremacy and cannot be universal. (If you haven't seen either of these claims very frequently, and think them a strawman, you have not been following university protests and editorials. Or radical Twitter: ex., ex., ex., ex.) Of course, as in Freud, this is an example of kettle logic: the signal of a very strong resistance. We see, though, that we can reject both claims without losing anything. Reason need be neither stagnant nor all-pervasive, and indeed we've been critiquing its limits since 1781.

Outright denial of the process of science (whether the model is conjectures and refutations or something less stale) ignores the fact that there is no single uniform body of science. Denial also dismisses the most powerful tool we have for making difficult empirical decisions. Michael Brown's death was instantly a political affair, with implications for broader social life, and the event has completely changed the face of American social issues. The first autopsy report, from St. Louis County, indicated that Brown was shot at close range in the hand during an encounter with Officer Darren Wilson. A second, independent report commissioned by the family concluded the first shot had not in fact been at close range. Following this disagreement, the Department of Justice released its final investigation report, which determined that material in the hand wound was consistent with gun residue from an up-close encounter.

Prior to the report, the best evidence available as to what happened in Missouri on August 9, 2014, was the ground footage after the shooting and the testimonies of the officer and of Ferguson residents at the scene. There are two ways to approach the incident: reason or lived experience. The latter route leads to ambiguities. Brown's friend Dorian Johnson and another witness reported that Officer Wilson fired his weapon first at range, under no threat, then pursued Brown out of his vehicle, until Brown turned with his hands in the air to surrender. However, before the St. Louis grand jury, half a dozen (African-American) eyewitnesses corroborated Wilson's account: that Brown did not have his hands raised and was moving toward Wilson. In which direction does "lived experience" tell us to go, then? A new moral maxim, the duty to believe people, will lead to no non-arbitrary conclusion. (And a duty to "always believe x," where x is a closed group, e.g. victims, puts the cart before the horse.) It appears that, in a case like this, treating evidence as objective is the only solution.

Introducing ad hoc hypotheses (e.g., that the Justice Department and the county examiner are corrupt) shifts the approach into one that uses induction, and leaves lived experience behind (it also ignores how forensic anthropology is actually done). This is the introduction of, indeed, scientific standards. (By looking at incentives for lying, it might also employ findings from public choice theory, psychology, behavioral economics, etc.) So the personal-experience method creates unresolvable ambiguities, and presumably must eventually grant some allowance to scientific procedure.

If we don't posit a baseline rationality (Hermione Granger pre-Hogwarts), our ability to critique things at all disappears. Utterly rejecting science and reason, denying objective analysis on the presumption of overriding biases, breaking down naïve universalism into naïve relativism: these are paths to paralysis on their own. More than that, they are hysterical symptoms, because they often create problems out of thin air. Recently, a philosopher and a mathematician submitted a hoax paper, Sokal-style, to a peer-reviewed gender studies journal in an attempt to demonstrate what they see as a problem "at the heart of academic fields like gender studies." The idea was to write a nonsensical, postmodernish essay; if the journal accepted it, that would indicate the field is intellectually bankrupt. Andrew Smart at Psychology Today instead wrote of the prank: "In many ways this academic hoax validates many of postmodernism's main arguments." And although Smart makes some informed points about problems in scientific rigor as a whole, he doesn't hint at what the validation of postmodernism entails: should we abandon standards in journalism and scholarly integrity? Is the whole process of peer review functionally untenable? Should we start embracing papers written without any intention of making sense, to look for knowledge concealed below the surface of jargon? The paper, "The conceptual penis," doesn't necessarily condemn the whole of gender studies; but, against Smart's reasoning, it does show that counterintuitive or highly heterodox theory is considered perfectly average there.

There were other attacks on the hoax, from Slate, Salon and elsewhere. The criticisms, often valid for the particular essay, typically didn't move the conversation far enough; there is much more to this discussion. A 2006 paper from the International Journal of Evidence Based Healthcare, "Deconstructing the evidence-based discourse in health sciences," called the use of scientific evidence "fascist"; in the abstract the authors state their allegiance to the work of Deleuze and Guattari. Real Peer Review, a Twitter account that collects abstracts from scholarly articles, regularly features essays from departments of women's and gender studies, including a recent one from a Ph.D. student in which the author identifies as a hippopotamus. Sure, the recent hoax paper doesn't really say anything, but it intensifies a much-needed debate. It brings out these two currents, reason and the rejection of reason, and demands a resolution. And we know that lived experience will often be inconclusive.

Opening up lines of communication is a solution. One valid complaint is that gender studies seems insulated in a way that chemistry, for instance, is not. Critiquing a whole field does ask us to genuinely immerse ourselves in it first, and this is a step toward tolerance: a step past the death of reason and the denial of science. It is a step that requires opening the bubble.

The modern infatuation with human biases, like Feyerabend's epistemological anarchism, upsets our faith in prevailing theories and in the idea that our policies and opinions should be guided by the latest discoveries from an anonymous laboratory. Putting politics first and assuming subjectivity is all-encompassing, we move past objective measures for comparing belief systems and theories. But isn't the whole operation of modern science designed to work within our means? Kant's critical system set limits on human rationality, and most science proceeds from an acceptance of fallibility. As the Harvard cognitive scientist Steven Pinker says, "to understand the world, we must cultivate work-arounds for our cognitive limitations, including skepticism, open debate, formal precision, and empirical tests, often requiring feats of ingenuity."

Pinker goes so far as to advocate scientism. Others need not; but we must understand an academic field before utterly rejecting it. We must think we can understand each other, and live with each other. We must think there is a baseline framework that allows permanent cross-cultural correspondence, a shared form of life by which a Ukrainian can interpret a Russian and a Cuban an American. The rejection of Homo sapiens commensurability, championed by people like Richard Spencer and by those in identity politics, is a path to segregation and supremacy. We must reject Gorgian nihilism about communication, and the Presocratic relativism that traps our moral judgments in inert subjectivity. From one Weltanschauung to the next, our common humanity, which endures across class, ethnicity, sex, and gender, allows open debate across paradigms.

In the face of relativism, there is room for a nuanced middle ground between Pinker's scientism and the rising anti-science, anti-reason philosophy; Paul Feyerabend sketched out a basic blueprint. Rather than condemning reason as a Hellenic germ of Western cultural supremacy, we need only adjust the theoretical model to incorporate the "new America of knowledge" into our critical faculty. It is the raison d'être of philosophers to present complicated things in a more digestible form; to "put everything before us," as Wittgenstein says. Hopefully, people can then reach their own conclusions, and embrace the communal human spirit as they do.

However, this may not be so convincing. It might be true that we have a competition of cosmologies: one that believes in reason and objectivity, and one that thinks reason is callow and all things subjective. These two perspectives may well be incommensurable. If I try to defend reason, I must invariably appeal to reasons, and thus argue in a circle. If I try to claim "everything is subjective," I make a universal statement and contradict myself. Between begging the question and contradicting oneself, there is not much indication of where to go. Perhaps we simply have to look at history, note the results of either course where it has been applied, and take that record as a rhetorical indication of which path to choose.

Some problems with postmodernism

Despite its contributions, postmodernism is also the subject of much criticism. One of the most recurrent is its tendency toward nihilism, that is, toward valuing nothing. Postmodern deconstruction may be efficient at demonstrating the arbitrariness of many of our concepts, but it can lead us to a point where we have nothing but deconstruction. We find that the world is made up of dichotomies, or binary oppositions, that cancel each other out without any logic, leaving us with an immense void.

Another weakness of postmodernism is its relativism. In the absence of an absolute truth that can be objectively identified, all that remains is subjective opinion. Postmodern theorists expect this to lead to higher levels of tolerance, but ironically the opposite is true. Without objective truths, individuals are isolated in their subjective opinions, which divides people rather than bringing them together. Moreover, postmodernism leads to the suspicion that all claims may be attempts at usurping power.

But the main weakness of postmodernism is its internal inconsistency. As mentioned in previous posts, postmodernism can be defined as incredulity toward metanarratives. But would not postmodernism itself be a metanarrative? Why would this metanarrative be above criticism?

Another way of defining postmodernism is by its claim that there is no absolute truth. But is this not itself an absolute truth? Is it not an absolute truth, according to postmodernism, that there is no absolute truth? This circular and contradictory reasoning demonstrates the internal fragility of postmodernism. Finally, what happens if the hermeneutics of suspicion is turned against postmodernism itself? What assurance do we have that postmodern authors do not themselves harbor a secret political agenda hidden behind their discourses?

It is possible that postmodernists do not really feel affected by this kind of criticism, if they are consistent with the perception that there is no real world out there, or that "there is nothing outside the text," and that reality is instead produced by discourses. That is: conventional theorists seek a truth that corresponds to reality; postmodernists ask what kind of reality their discourses are capable of creating.

Be that as it may, in spite of the preached intertextuality (the notion that texts refer only to other texts, and to nothing objective outside them), postmodern theorists continue to write in the hope that we will understand what they write. Moreover, postmodernists live in a world full of meanings that are, if not objective, at least intersubjective. Perhaps our language is not transparent, but that does not mean it is opaque either. Clearly we are able to make ourselves understood reasonably well through words.

As C.S. Lewis said, "You cannot go on ‘seeing through' things forever. The whole point of seeing through something is to see something through it. It is good that the window should be transparent, because the street or garden beyond it is opaque. How if you saw through the garden too? It is no use trying to ‘see through' first principles. If you see through everything then everything is transparent. But a wholly transparent world is an invisible world. To ‘see through' all things is the same as not to see". This critique applies very well to postmodernism.

Main postmodern theorists and their main concepts

Postmodernism has been defined as "incredulity toward metanarratives." Metanarratives are grand narratives or grand stories: comprehensive explanations of the reality around us. Christianity and other religions are examples of metanarratives, but so are scientism and especially the positivism of more recent intellectual history. More specifically, postmodernism questions whether there is a truth out there that can be objectively found by the researcher. In other words, postmodernism questions the existence of an objective external reality, as well as the distinction between the subject who studies that reality and the object of study (reality itself), and consequently the possibility of a social science that is value-free and neutral.

One of the main theorists of postmodernism (or of deconstructionism, to be more exact) was Jacques Derrida (1930-2004). Derrida noted that Western intellectual history has been, since ancient times, a constant search for a Logos. The Logos is a concept of classical philosophy from which we derive the word logic. It concerns an order, or logic, behind the universe, bringing order (cosmos) to what would otherwise be chaos. The concept of Logos was even appropriated by Christianity when the evangelist John stated that “In the beginning was the Logos, and the Logos was with God and the Logos was God,” identifying the Logos with Jesus Christ. In this way, this concept is undoubtedly one of the most influential in the intellectual history of the West.

Derrida, however, noted that this search for a Logos (whether an abstract spiritual principle, the person of Jesus Christ, or reason itself) implies the formation of dichotomies, or binary oppositions, in which one of the elements of the opposition is closer to the Logos than the other, even though the two ultimately cancel each other out. In this way, Western culture tended to value the masculine over the feminine, the adult over the child, and reason over emotion, among other examples. However, as Derrida observes, these preferences are arbitrary choices, and it is in any case impossible to conceive of the masculine without the feminine, the adult without the child, and so on. Derrida's proposal is to identify and deconstruct these binaries, demonstrating how arbitrary our conceptions are.

Michel Foucault (1926-1984) developed a philosophical project similar to Derrida’s. At the beginning of his career he was immersed in the post-WWII French intellectual environment, deeply influenced by the existentialists. Eventually Foucault sought to differentiate himself from these thinkers, although Nietzsche’s influence can be seen throughout his career. One of the recurring themes in Foucault’s writings is the link between knowledge and power. Initially identified as a historian of medicine (and more precisely of psychiatry), he sought to demonstrate how behaviors identified as pathologies by psychiatrists were simply those that deviated from accepted societal standards. In this way, Foucault tried to show that the scientific truths elaborated by doctors were merely authoritarian impositions. In a broader sphere, he identified how the knowledge produced by individuals and institutions clothed with power becomes truth and defines the structures into which other individuals must fit. Here the same hermeneutic of suspicion that appears in Nietzsche can be observed in Foucault: distrust of the intentions of the one making an assertion, since the intentions behind an assertion are not always the explicit ones. Foucault’s other contribution was his discussion of the panopticon, a kind of prison originally envisioned by the English utilitarian philosopher Jeremy Bentham (1748-1832) in which the incarcerated are never sure whether they are being watched or not. The consequence is that they must behave as if they were constantly being watched. Foucault saw in this a control mechanism applied to everyone in modern society: we are constantly being watched, and expected to conform to accepted standards.

In short, postmodernism questions metanarratives and our ability to identify absolute truths. Truth becomes relative, and any attempt to identify truth becomes an imposition of power over others. In this sense the foundations of modern science, especially in its positivist form, are called into question. Postmodernism further states that “there is nothing outside the text”: our language has no objective relation to a reality external to itself. Similarly, there is a “death of the author” after the enunciation of a discourse: it is impossible to identify the meaning of a discourse from the author’s intention in writing it, since the text refers only to itself and cannot carry any meaning present in the intention of its author. Discourses should therefore be analyzed not by their relation to an external reality or to authorial intention, but in their intertextuality.

The existentialist origins of postmodernism

In part, postmodernism has its origins in the existentialism of the 19th and 20th centuries. The Danish theologian and philosopher Søren Kierkegaard (1813-1855) is generally regarded as the first existentialist. Kierkegaard’s life was profoundly marked by the breaking of an engagement and by his discomfort with the formalities of the (Lutheran) Church of Denmark. In his understanding (shared by others of the time within a movement known as Pietism, influential mainly in Germany but also a strong influence on the English Methodism of John Wesley), Lutheran theology had become overly intellectual, marked by a “Protestant scholasticism.”

Scholasticism had earlier been a school of Catholic theology, whose main representative was Thomas Aquinas (1225-1274). Aquinas argued against the theory of double truth, defended by Muslim theologians of his time. According to this theory, something could be true in religion and yet not true in the empirical sciences. Aquinas defended a classic concept of truth, used centuries earlier by Augustine of Hippo (354-430), to affirm that truth could not be so divided. Martin Luther (1483-1546) made many criticisms of Thomas Aquinas, but ironically the methodological precision of the medieval theologian became quite influential in the Lutheran theology of the 17th and 18th centuries. In Germany and the Nordic countries (Denmark, Finland, Iceland, Norway and Sweden), Lutheranism became the state religion after the Protestant Reformation of the 16th century, and being the pastor of a church in a major city became a respected and coveted public office.

It is against this intellectualism and this ease of being a Christian that Kierkegaard revolted. In 19th-century Denmark, everyone was born into the Lutheran Church, and being a Christian was the socially accepted position. Kierkegaard complained that in centuries past being a Christian had not been easy, and could even be life-threatening. In the face of this he argued for a Christianity that involved an individual decision against all evidence. In one of his most famous texts he expounds the story in which the patriarch Abraham is asked by God to sacrifice Isaac, his only son. Kierkegaard imagines a scenario in which Abraham does not understand God’s reasons but ends up obeying blindly. In Kierkegaard’s words, Abraham makes “a leap of faith.”

This concept of blind faith, going against all the evidence, is central to Kierkegaard’s thinking, and became very influential in twentieth-century Christianity and even in other Western religions. Beyond the strictly religious aspect, Kierkegaard marked Western thought with the notion that some things might be true in some areas of knowledge but not in others. Moreover, his influence can be seen in the notion that the individual must decide how he intends to exist, regardless of the rules of society or of any empirical evidence.

Another important existentialist philosopher of the 19th century was the German Friedrich Nietzsche (1844-1900). Like Kierkegaard, Nietzsche was raised within Lutheranism but, unlike Kierkegaard, became an atheist in his adult life. Like Kierkegaard, Nietzsche also became a critic of the social conventions of his time, especially the religious ones. Nietzsche is particularly famous for the phrase “God is dead.” The idea appears in one of his most famous texts, in which the Christian God attends a meeting with the other gods and declares that he is the only god. Upon hearing this, the other gods laugh themselves to death, and the Christian God effectively becomes the only god. Later, however, the Christian God dies of pity at seeing his followers on earth become people without courage.

Nietzsche was particularly critical of how the Christianity of his day valued features he considered weak, calling them virtues, and condemned features he considered strong, calling them vices. And not just Christianity: Nietzsche also criticized the classical philosophy of Socrates, Plato, and Aristotle, placing himself alongside the sophists. The German philosopher claimed that Socrates valued behaviors like kindness, humility, and generosity simply because he was ugly. More specifically, Nietzsche questioned why classical philosophers defended Apollo, considered the god of wisdom, and criticized Dionysus, considered the god of debauchery. In Greco-Roman mythology Dionysus (or Bacchus, as the Romans knew him) was the god of festivals, wine, and madness, symbolizing everything that is chaotic, dangerous, and unexpected. Thus, Nietzsche questioned the apparent arbitrariness of defending Apollo’s rationality and order against the irrationality and unpredictability of Dionysus.

Nietzsche’s philosophy values courage and voluntarism, the urge to go against “herd behavior” and become a “superman,” that is, a person who goes against the dictates of society to create his own rules. Although he went in a different religious direction from Kierkegaard, Nietzsche agreed with the Danish theologian on the necessity for the individual to go against convention and reason in dictating the rules of his own existence.

In the mid-20th century existentialism became an influential philosophical current, represented by figures like Jean-Paul Sartre (1905-1980) and Albert Camus (1913-1960). Like their 19th-century predecessors, these existentialists emphasized the apparent absurdity of life and valued individual decision-making against rational and social dictates.

“We’re all nothing but bags of stories”: Carlos Castaneda as a Countercultural Icon and Budding Post-Modernist

Exploring the countercultural 1960s and the origins of the Western New Age, one cannot bypass Carlos Castaneda. He became a celebrity writer with his bestselling book The Teachings of Don Juan: A Yaqui Way of Knowledge, published by the University of California Press in 1968. The book was written as free-style dialogues between a Native American shaman named Don Juan Matus and Castaneda himself, who claimed to have worked with Don Juan for many years. The Teachings describes how Castaneda learned to use three hallucinogenic plants: peyote, jimson weed, and psychedelic mushrooms. After ingesting these substances, Castaneda went through mind transformations and learned that there were other realities besides the ordinary one. It was later revealed that he had made up the whole experience, but this never affected his popularity.


Of course, a book like this was well-tuned to the then-popular hallucinogenic subculture, and the link between Castaneda’s text and the psychedelic ‘60s is the most common explanation of his popularity. Yet I want to argue that this is a very narrow view, one which does not explain why Castaneda’s follow-up books, which had nothing to do with psychedelics, continued to enjoy popularity well into the 1990s. In fact, by the early 1980s, Castaneda had become so paranoid about hallucinogens that he forced his girlfriend to undergo drug tests before allowing her to sleep with him. I also argue that viewing Castaneda exclusively as one of the spearheads of the New Age does not explain much either. The appeal of his texts went far beyond the New Age. In the 1970s and 1980s, for example, his books were frequently assigned as conventional course readings in anthropology, philosophy, sociology, religious studies, and humanities classes.

Let me start with some biographical details. Castaneda was born Carlos Arana in Peru to a middle class family and moved to the United States in 1951. He tried to enter the world of art but failed. Then, for a while, he worked as a salesman while simultaneously taking classes in creative writing before eventually enrolling in the anthropology graduate program at UCLA.

Originally Castaneda did not care about hallucinogens or the emerging hippie culture, but eventually UCLA (and the broader California environment), saturated at that time with various countercultural and unchurched spirituality projects, led him to choose a sexy topic: the use of psychedelics in a tribal setting. The book that made him famous, The Teachings of Don Juan, originated from a course paper on “power plants” and from his follow-up Master’s thesis. I want to stress that both papers were essentially attempts to find a shortcut to satisfy the requirements of his professors. His first professor, an anthropologist, invited those students who wanted an automatic “A” to find and interview an authentic Indian. Despite a few random contacts, Castaneda could not produce any consistent narrative, and had to invent his interview. This was the origin of his Don Juan character. He then followed the requirements of his advisor, Harold Garfinkel, a big name in sociology at the time and one of the forerunners of postmodernism. Garfinkel made it explicitly clear that he did not want Castaneda to classify and analyze his experiences with Don Juan scientifically.

What Garfinkel wanted was a free-style, detailed description of his work with the indigenous shaman as it was, without any interpretation. Thus it was through collective efforts that Castaneda produced a text that by chance caught the attention of the university press as a potential bestseller. Essentially, Castaneda took to the extreme the incentives provided by his professors and by the surrounding subculture. He internalized these incentives by composing a fictional text, which he peddled as authentic anthropological research. It is interesting to note that in 1998, just before he died, Castaneda made the following mischievous remark in his introduction to the last anniversary edition of The Teachings of Don Juan: “I dove into my field work so deeply that I am sure that in the end, I disappointed the very people who were sponsoring me.”

The popularity of the first book gave rise to the whole Don Juan sequel, which made Castaneda an anthropology and counterculture star. The combined print run of his books, translated into 17 languages, reached 28 million copies. And, as I mentioned above, despite the revelations that his Don Juan was a completely fabricated character, the popularity of his books kept growing throughout the 1970s. In fact, to this day, libraries frequently catalogue his books as non-fiction.

It seems that Castaneda’s appeal had something to do with overall trends in Western culture, which made his texts resonate so well with millions of readers. For this reason, I want to highlight the general ideological relevance of Castaneda’s books for the Western zeitgeist (spirit of the time) at its critical juncture in the 1960s and 1970s. The various authors who have written about Castaneda never mention this obvious fact, including the French writer Christophe Bourseiller, author of his most complete biography, Carlos Castaneda: La vérité du mensonge (2005). So exploring the ideological relevance of the Don Juan books will be my small contribution to Castanediana.

To be specific, I want to point to two themes that run through all his books. First, he hammered into the minds of his readers the message of radical subjectivism, which in our day is considered by some conventional wisdom: what we call truth is always socially constructed. Don Juan, who in later books began speaking like a philosophy professor, repeatedly instructed Carlos that so-called reality was a fiction, a projection of our own cultural and individual experiences, and that instead of so-called objective reality we need to talk about multiple realities. In an interview for Time magazine, Castaneda stressed that the key lesson Don Juan taught him was “to understand that the world of common-sense reality is a product of social consensus.” Castaneda also stressed the role of the observer in shaping his or her reality and the significance of text in Western culture. In other words, he was promoting what later became the hallmark of the so-called postmodern mindset.

Second, the fictional dialogues between the “indigenous man” Don Juan, whom Castaneda portrayed as a vessel of wisdom, and Castaneda, a “stupid Western man,” contained another message: remove your Western blinders and learn from the non-Western world. Such privileging of non-Western “wisdom” resonated very well with Western intellectuals who felt justified frustration about the hegemony of positivism and Western knowledge in general and who looked for an intellectual antidote to that dominance. By the 1990s, this attitude mutated into what Slavoj Zizek neatly labelled the “multiculturalist’s basic ideological operation,” which now represents one of the ideological pillars of Western welfare-warfare capitalism.

At the end of the 1970s, several critics tried to debunk Castaneda. They were able to prove that his books were the product of creative imagination and of intensive readings in anthropological and travel literature. These critics correctly pointed out that Castaneda misrepresented particular indigenous cultures and landscapes. They also stressed that his books were not written in a scientific manner. Ironically, this latter criticism did not find a receptive audience, precisely because social scholarship was moving away from positivism. Moreover, one of these critics, anthropologist Jay Fikes, who wrote a book exposing Castaneda’s hoax, became persona non grata in American anthropology. Nobody wanted to write him a reference, and he had to move to Turkey to find an academic position.

What critics like Fikes could not grasp was the fact that the Castaneda texts perfectly fit the emerging post-modernist thinking that was winning over the minds of many Western intellectuals who sought to break away from dominant positivism, rationalism, and grand all-explaining paradigms. To them, an antidote to this was a shift toward the subjective, individual, and spontaneous. The idealization and celebration of non-Western knowledge and non-Western cultures in general, which currently represents a powerful ideological trend in Western Europe and North America, became an important part of this intellectual revolt against the modern world. I am sure all of you know that anthropology authorities such as Clifford Geertz (until recently one of the major gurus of Western humanities), Victor Turner, and Claude Lévi-Strauss were inviting others to view any cultural knowledge as valid and eventually erased the border between literature and science. They also showed that scholarship can be constructed as art. Castaneda critics could not see that his texts only reflected what was already in the air.


The person who most heavily affected the “production” of the first Don Juan book, which was Castaneda’s revised Master’s thesis, was the above-mentioned sociologist Garfinkel. As early as the 1950s, Garfinkel came up with ideas that contributed to the formation of the postmodern mind. I am talking here about his ethnomethodology. This school of thought did not see the social world as an objective reality but as something that individuals build and rebuild in their thoughts and actions. Garfinkel argued that what we call truth was individually constructed. Sometimes he also called this approach “people’s sociology.” He stressed that a scholar should set aside traditional scientific tools and simply narrate human experiences as they were, in all their detail and spontaneity. Again, for many today this line of thinking is conventional wisdom, but in the 1950s and 1960s it was revolutionary. Incidentally, it took Castaneda time to figure out what Garfinkel wanted from him before he rid his text of the vestiges of “positive science.” To be exact, Castaneda could not completely get rid of this “science” in his first bestselling book. In addition to the free-flowing and easy-to-read spontaneous dialogues with Don Juan, Castaneda attached an appendix to the text: a boring, meaningless read that he titled “Structural Analysis.” In his later books, such rudiments of positivism disappeared entirely.

When Castaneda was writing his Master’s thesis, Garfinkel made him revise the text three times. The advisor wanted to make sure that Castaneda would relate his spiritual experiences instead of explaining them. Originally, when Castaneda presented Garfinkel with his paper about a peyote session with Don Juan, the text was formatted as a scientific analysis of his own visions. The professor, as Castaneda remembered, rebuked him: “Don’t explain to me. You are nobody. Just give it to me straight and in detail, the way it happened. The richness of detail is the whole story of membership.” Castaneda spent several years revising his thesis, and then had to revise it again because Garfinkel did not like that the student had slipped into explaining Don Juan psychologically. Trying to be a good student, Castaneda embraced the advice of his senior colleague. So the final product was a beautiful text that was full of dialogues, rich in detail, and, most importantly, came straight from the “field.”

I interviewed some of Castaneda’s classmates and other scholars who became fascinated with his books at the turn of the 1970s. Many of them had no illusions about the authenticity of Don Juan. Still, they argued that the whole message was very much needed at that time. A quote from Douglas Sharon, one of Castaneda’s acquaintances, is illustrative in this regard. In his conversation with me, Sharon stressed:

“In spite of the fact that his work might be a fiction, the approach he was taking—validating the native point of view—was badly needed in anthropology, and, as a matter of fact, I felt it was a helping corrective for the so-called scientific objectivity that we were taking into the field with us.”

I want to mention in conclusion that Castaneda not only promoted the postmodern approach in his novels but also tried to live it. Before the age of Facebook and online forums, Castaneda, with a group of his followers, became involved in an exciting game of identity change. They came to enjoy confusing those around them by blurring and constantly changing their names and life stories. For example, people in his circle shredded their birth certificates and made new ones. They also performed mock wedding ceremonies to make fun of conventional reality. To those who might have had questions about this “post-modernist” game, Castaneda would reply: “We’re all nothing but bags of stories.”

How to think like an individualist

Postmodernism is disposed of incisively. “Just as Western politicians and generals annex foreign lands, postcolonial theorists argue, so Western intellectuals impose their knowledge on the rest of the world,” Malik writes. But Western philosophy does not replicate the ways and methods of Western imperialism. Its criteria and methods, but also its values, are completely different. So is its relationship to the non-European world, which is not one of subjugation and annexation, but of interaction and accommodation. The key concepts of Western secular modernity that are hardest to contest – universalism, democracy and individual liberty – were not, in reality, products of Western imperialism, and are actually not compatible with it. Anti-colonialism in modern times is as much a product of Western philosophy as of non-European thought, or more so. There are also other key Western ideas, such as Marx’s critique of capitalism, that have demonstrated an impressively wide appeal in every part of the globe but remain as much contested today in the West as anywhere else.

Kenan Malik stole all my ideas. I guess I should start applying for insurance salesman positions, eh? Read the rest, by Jonathan Israel. But wait, there’s more.

Any nation that has an official religious establishment faces the problem of “standardizing” the religion to satisfy the demands of the establishment. Note that the law [passed by Austria’s parliament forcing Austrian Muslim organizations to use a German-language Qur’an] doesn’t outright ban competing translations of the Qur’an, but gives the official imprimatur of the Austrian government to an approved translation. It doesn’t seem to have occurred to Austrians to distinguish the rights-protecting and religious-establishment-establishing functions of the state, and to dump the latter over the side. But I suspect it hasn’t occurred to the Austrian Parliament because it hasn’t quite occurred to Austrian Muslims, either. There are perks to be had if you accept government sponsorship of your religion: once you’re enticed by them, it becomes hard not to do a deal with the Devil to keep them in place. I don’t know about the standardized German translation, but my translation of the Qur’an suggests that seduction is the Devil’s AOS.

This is from the infamous Irfan Khawaja over at Policy of Truth. Read it.

Telling the Truth and Tarantino, Liberals, the Secretary of State, and the President

I have a liberal friend with whom I have fairly frequent serious discussions. He thinks of himself as a moderate liberal, even a centrist, because he owns guns and his guns are dear to him. Yet he voted for Obama, and he can give a spirited defense of every aspect of Obama’s policies and actions. That’s a test, in my book.

He told me once, but only once, that the administration’s program of at-a-distance-assassination-of-the-untried was not a problem for him. He does not see how assassinating an American citizen, for example, on the presidential say-so could be a problem, ethical or judicial. He does not discern a slippery slope. That too is a test.

He and I have repeatedly disagreed on two grounds. First, we have different values, of course. Thus, he insists that it’s fine for him to use the vote to take my money by force in order to give it to someone that he, my friend, thinks deserves it more than I do because he, the other guy, does not have health insurance.

I disagree.

Note that this is an actual example of a fundamental value difference, because my liberal buddy does not have to go there to achieve the same results. He could try, for example, to convince me to give up some money on the basis of expediency: it’s unpleasant, even messy, to have the uninsured dying on my front lawn for lack of medical care. (As they do all the time, of course.) Or he could persuade me on fellow-human grounds. He does not feel like doing either because, I think, he has no moral qualms about taking my earnings by force for a cause he judges good. That’s a big difference between us.