Nightcap

  1. Why is the public so obsessed with Nazis? Roundtable, History Today
  2. The forgotten fascist surge in postwar Britain Jerry White, Literary Review
  3. Failed states and failed civilizations Nick Nielsen, The View from Oregon
  4. The use and abuse of ethnic minorities Salvatore Babones, spiked!

Nightcap

  1. How millennial socialists make the case for a kinder politics George Scialabba, New Republic
  2. Affirmative Action: the uniquely American experiment Orlando Patterson, New York Times
  3. Imagining Africa (clash of civilizations?) Clive Gabay, Disorder of Things
  4. Anáhuac and Rome: indigeneity and religion in Mexico Arturo Chang, Age of Revolutions

Nightcap

  1. Sir Roger Scruton (1944-2020) Johnathan Pearce, Samizdata
  2. Sir Roger Scruton and free market economics Chris Dillow, Stumbling & Mumbling
  3. Roger Scruton’s conservatism Bradley Birzer, American Conservative
  4. The problem of defining civilization Nick Nielsen, Grand Strategy Annex

Wiener Moderne and Austrian Economics – A product of times of turmoil

There are certain incredibly rare constellations of time and space that result in one-of-a-kind decades: the peak of Greek civilization from the 5th to the 4th century BC, the Californian Gold Rush of 1848–1855, and the Fin de Siècle from 1890 to 1920. The latter has been of special interest to me for a long time. Some of the world’s most famous painters (Gustav Klimt, Oskar Kokoschka), philosophers (Ludwig Wittgenstein, Karl Popper, Edmund Husserl) and authors (Georg Trakl, Hugo von Hofmannsthal, Arthur Schnitzler) shaped the era. Even more intriguing to me is that Viennese intellectual life happened in very close circles. All of these intellectuals witnessed the downfall of one of the greatest empires of the 19th century, and each discipline coped with this fate in its very own way. Especially if one compares the movements of that time in literature and economics, it becomes clear that the self-imposed demands of the authors and scientists on their disciplines differed considerably.

The Wiener Moderne: Flight into the irrational

Driven by the predictable crumbling of the Austro-Hungarian empire, the anticipated rising tensions in the multi-ethnic empire, and the threat of financial recession, civil society was teetering on the edge of an abyss. Furthermore, Halley’s Comet was predicted to “destroy” the world in 1910, the Titanic sank in 1912, and a European war was lingering just around the corner. Faced with the breakdown of stable order, people sought a way out of the ruins of what had once been a stable authoritarian order. When existential threats become more and more realistic, one would expect cultural life to dry up entirely, or at least diminish considerably. However, the complete opposite was the case.

At first, art merely revolted against the prevailing naturalism. Why would anybody need a detailed, accurate depiction of reality if reality itself is riddled with incomprehension, irrationality and impenetrability? Missing a stable external framework, many writers turned their backs on their environment and focused on the Ego. To express the inner tensions of most of their contemporaries, many authors sought to dive deep into the human consciousness. Inspired by the psychoanalytical insights of Sigmund Freud, who had lively relationships with many important authors such as Arthur Schnitzler, human behaviour and especially human decision making became a topic of increasing interest. New ways of narrating, such as the interior monologue, were thus developed.

Many writers such as Arthur Schnitzler, Hugo von Hofmannsthal and Georg Trakl found in transcendence a necessary counterbalance to an over-rationalized society. Reality and dream blurred into a foggy haze; rational preferences gave way to impulsive needs; time horizons shortened; emotions overcame facts. The individual was portrayed without any responsibility towards society, family or other institutions. In Dream Story (by far my favourite book) by Arthur Schnitzler, the successful doctor Fridolin risks his marriage and his family to pursue subconscious, mysterious sexual needs. If you have the time, check out Stanley Kubrick’s film based on the novella, Eyes Wide Shut, truly a cinematic masterpiece.

Karl Kraus, on the other hand, founded the satirical newspaper The Torch (Die Fackel) in 1899, which offered an often-frequented point of contact for aspiring young writers. Its content was dominated by craggy, harsh satirical observations of everyday life, which sought to convince the public of the predictable mayhem being caused by current politics. Frank Wedekind, Adolf Loos and Else Lasker-Schüler were able to use the Torch as a stepping stone for their further careers.

What they had in common was their understanding of their craftsmanship: it is not the concern of art to save civilization or to convince us to be better humans, but to describe, document and, in a way, aestheticize human behaviour. This by no means implies that the Viennese authors of the early 20th century were not politically or socially engaged: antisemitism (Karl Kraus and Arthur Schnitzler), the free press (Karl Kraus) and sexuality (Frank Wedekind and Arthur Schnitzler) were, for example, recurring themes. However, in most works the protagonist struggles with these problems on an individual level, without the problem being addressed as a social one. The authors also seemed to miss the whole of the puzzle: although many individual pieces were criticized, the obvious final picture was rarely recognized (especially by Schnitzler).

Economics – Role of the scientist in society

Meanwhile, another exciting clash of ideas was taking place in economics: the second wave of Historical School economists, mainly Gustav Schmoller, Karl Bücher and Adolph Wagner, were waging a war against the Austrian School of Economics, mainly Carl Menger. The Historical School sought to identify the patterns in history from which one could deduce certain principles of economics. Individual preferences, they asserted, are not the result of personal desires but rather the sum of social forces acting on the individual depending on space and time. Thus, instead of methodological individualism, methodological collectivism must be used to conduct economic research. To determine the historical-temporal circumstances, one must first collect an enormous amount of empirical material, on the basis of which one could then formulate a theory. Austrian economists, in turn, claimed that individual preferences stem from personal desires. Although the Austrians emphasized the constraints emerging from interpersonal interactions, they rejected the idea that free individuals are confined in their will by culture and norms. Thus, economics is a science of aggregated individual preferences and must be studied through the lens of methodological individualism.

As Erwin Dekker (Dekker 2016) has argued, the works of the Austrian economists must be seen first and foremost as an endeavour to understand society and civilization. One must carefully study human interaction and acknowledge the ridiculously small amount of knowledge we actually possess about the mechanisms of a complex society before one can “cure” the many ills of humankind. With the socialist calculation debate, Austrian economists tried to convince other academics of the impossibility of economic calculation in the absence of prices.

Apart from their academic debates, they were very much concerned with the development of society at large: authoritarian proposals, the constant erosion of the norms underpinning civil society, and increasing overall hostility led them to leave the ivory tower of economics and argue for their ideas in public discourse. The Road to Serfdom is the peak of this development: Hayek impressively explains to the general public the fragility of the liberal democratic order and how far-reaching even well-intended governmental interventions can eventually be. Joined by Karl Popper’s masterpiece The Open Society and Its Enemies, the Austrian economists were now defending the achievements of liberal democracy more vigorously than ever.

Conclusion

It would be an exaggeration to claim that the literary-historical “flight into the irrational” had excessive influence on the economic debate between the Historical School and the Austrian School. Nevertheless, it has been shown that intellectual Viennese life took place in a few closely networked interdisciplinary circles. There is no direct connection between the Viennese literary circles and famous contemporary economic circles such as the Mises-Kreis. However, the intellectual breadth of the contributions and the interwoven relationships of many contributors have become an important subject of study in recent years (see Dekker 2014). Sigmund Freud, especially, could have been a “middle man” between the Austrians (especially Hayek) and the authors of the Wiener Moderne (especially Schnitzler).

What is definitely remarkable is how differently the various scientists and artists reacted to the existential threats of the early 20th century.
Resignation? Internal Exile? Counterattack? There were many options on the table.

The “flight into the irrational” pursued by many, though by far not all, authors of the Wiener Moderne was a return to surreality, irrationality and individualism. The Austrian economists, however, went from individualism to social responsibility. According to them, scientists had an obligation to preserve the kind of liberal democratic system that fosters peaceful human cooperation. To achieve this shared goal, many Austrian economists left the ivory tower of academic debate, where they had also fought for the same purpose, and temporarily became public intellectuals, mounting a much more active defence of liberal democracy.

Perspective and riches

Sometimes working in the arts can be quite disorienting, especially in terms of what comes out of the mouths of colleagues. For example, a close friend was in rehearsal and an ensemble member, having spent the first hour staring at her, suddenly demanded: 

“Are those real diamonds [pointing at a simple crystal strand bought at H&M]?” 

“What?! These?! No.”

“Oh, okay. I was trying to figure out how rich you are.” 

There were so many things wrong with this actual exchange that it is hard to know where to start. The main, collective reaction was: “Who openly admits to sitting there thinking things like that?” The episode embarrassed everyone except the person who asked the offensive question. Aside from its immediate disruptive effect, the incident was indicative of a greater socio-cultural problem, a shameless voyeurism that, while not new, has reached fever pitch today.

While one could easily say that reality TV and Instagram are primary causes, there are plenty of examples which predate these media, most memorably Gustave Flaubert’s Madame Bovary and its prescient view of tabloid and celebrity culture. What is new, though, is the idea that the envious and their curiosity have any legitimacy. We have come from Flaubert’s view that Emma Bovary was a colossal idiot to articles published by the BBC lamenting “invisible poverty.” The BBC writer’s examples of “invisible poverty” were an inability to afford “posh coffee,” a qualifier he declined to define, and neighbors wondering whether a “nice car” was bought on auto loan or owned outright. As with the question about diamonds, not only should such matters be outside the concern of others; to think that they are appropriate, or even a valid source of social strife, is disgusting and disturbing.

In his book Down and Out in Paris and London, George Orwell complained about being sent to Eton, where he spent his school years feeling as though everyone around him had more material wealth. The essence of his lament was that he wished his parents had sent him to a small grammar school where he could have been the richest student. He also claimed, in a wild generalization, that his feelings on the matter were universal throughout the British upper-middle class. Further, he said that it was his time in secondary school, not, as commonly claimed, his time as a civil servant, which fueled his turn toward Marxism, following the traditional logic of grabbers: “they have so much and therefore can spare some for me.”

The most baffling part for Orwell was the way the upper-middle class, which included his family, was willing to move to far-flung corners of the globe and live in conditions the lowest British laborer would not accept, in exchange for educational opportunity for their children and a high-status, reasonably wealthy retirement for themselves. For a comprehensive analysis of the phenomenon of self-sacrifice, its role in the development of capitalism, and why the upper- and upper-middle classes were the only ones willing to make such exchanges, see Niall Ferguson’s Colossus.

It is important today for us to become more critical of complaints about society, and of anecdotes presented as proof of unfair societal mechanisms that prevent social mobility. An example of why we must be careful is a recent article in The Guardian written by a Cambridge undergraduate who, identifying as working class and having many problems along those lines, cited as her biggest complaint the Cambridge Winter Ball. Her problem was not that she hadn’t been able to attend, but that she had had to work for an hour in order to get into the Ball for free. This is a questionable example of social immobility. Her complaint about the Ball was that there were others who could pay the £100 entrance fee upfront. From this, she assumed a level of privilege that might not necessarily exist, i.e. “the other students could part with 100 pounds.”

Another example of failing to understand the availability of resources and extrapolating a false conclusion of social immobility is the Columbia University FLiP (First-generation, Low-income People) Facebook page, which was, through 2018, the group’s primary platform. In response to Columbia University’s study of its first-generation, low-income population, many of the complaints related to books and the libraries. FLiP students didn’t know that books were available in the library, and so they had purchased study materials while their “wealthier” peers simply borrowed the library copy or spent the necessary number of hours working in the library. Ironically, this complaint does not hold up if you also consider that Columbia runs an immersive orientation in which new students are taken into the libraries and shown the basics of the book search system, card operations, checkout procedure, etc. In response to the publicity surrounding the FLiP drive[1] the university opened a special library for these students where there is no official checkout; all loans are on the honor system. On a hilarious side note, in the Middle Ages libraries would chain books to lecterns to keep students from walking away with them.

While we may have moved away from a society that encouraged living modestly to avoid arousing the envy of one’s neighbors, we now live in a culture in which our neighbors’ jealousy is too easily aroused. Chaos is the natural resting state of existence, but people have lost the ability to construct order for themselves out of it. It is possible to argue that modern people have not been taught to do so; after all, no one comes into the world knowing the underlying skills at issue in the “invisible poor” complaints, e.g. social interactions, sartorial taste, self-sacrifice, etc. To tell the truth, mankind’s natural state is closer to that of the savages of the Middle Ages whose covetous inclinations necessitated the chaining of library books. On the one hand, we have progressed tremendously past such behavior and in doing so created order from chaos; on the other hand, the external signs of progress are now under fire as symbols of privilege. Chillingly, the anti-civilization narrative, because that is ultimately what it is, is being incorporated into an anti-capitalist agenda through the conflation of “civilized” with “privileged,” which in turn is conflated with “rich.”


[1] It is also revealing that the sign-off for these people while the drive lasted was FLiP [school name]. Yes, one must wonder if even the acronym was picked for its stunning vulgarity.

Nightcap

  1. The Dark Forest theory: why aliens haven’t contacted us yet Scotty Hendricks, Big Think
  2. China’s “Dark Forest” answer to Star Wars optimism Jeremy Hsu, Lovesick Cyborg
  3. On the tradition of “Chinese unity” in geopolitical thought Nick Nielsen, The View from Oregon
  4. Rice Peter Miller, Views of the Kamakura

Nightcap

  1. Espionage and the Catholic Church Aaron Bateman, War on the Rocks
  2. Can globalization be reversed? John Quiggin, Crooked Timber
  3. Getting rich is glorious (scroll down) Pierre Lemieux, Regulation
  4. Africa’s lost kingdoms Howard French, New York Review of Books

Nightcap

  1. The stoic grief of the Gold Star Mothers John McKay, American Conservative
  2. “My body, my choice” Ilya Somin, Volokh Conspiracy
  3. “Frontier” history has gotten much better, no thanks to David McCullough Rebecca Onion, Slate
  4. The loss of a symbol of civilization Nick Nielsen, The View from Oregon

Nightcap

  1. Why weakly enforced rules? Robin Hanson, Overcoming Bias
  2. What’s changed since Salman Rushdie’s notorious novel? Bruce Fudge, Aeon
  3. Spacefaring civilization Nick Nielsen, Grand Strategy Annex
  4. Is the universe pro-life? Bobby Azarian, Quartz

Nightcap

  1. What cafés did for liberalism Adam Gopnik, New Yorker
  2. How the Catholic Church created our liberal world Tanner Greer, American Conservative
  3. How meritocracy and populism reinforce each other’s faults Ross Douthat, New York Times
  4. Extraterrestrial preservation of terrestrial heritage Nick Nielsen, Grand Strategy Annex

No Country for Creative Destruction

Imagine a country whose inhabitants reject every unpleasant byproduct of innovation and competition.

This country would be Frédéric Bastiat’s worst nightmare: in order to avoid the slightest maladies expected to emerge from creative destruction, all of its advantages would remain unseen forever.

Nevertheless, that impossibility of acknowledging the unintended favourable consequences of competition is not conditioned by any type of censorship, but by a sort of self-imposed moral blindness: the metaphysical belief that “being” is good and “becoming” is bad. A whole people inspired by W. B. Yeats, they want to be gathered into the artifice of eternity.

In this imaginary country, which would deserve a place in “A Universal History of Infamy” by J. L. Borges, people cultivate a curious strain of meritocracy, an Orwellian one: they praise stagnation for its stability and derogate growth out of the stubborn and incorruptible conviction that life in society is a zero-sum game.

Since growth is an unintended consequence of creative destruction, they reason additionally, there must be no moral merit to be recognised in such dumb luck. Stagnation, on the other hand, is the unequivocal signal of good deeds toward the unlucky, who might otherwise suffer the obvious losses coming from every innovation.

In this fantastic country, Friedrich Nietzsche and his successors are well read: everybody knows that, in the Eternal Return, the whole of chance is played out at each throw of the dice. So, they conclude, “if John Rawls asked us to choose between growth and stagnation, we would shout at him: Stagnation!!!”

But the majority of the inhabitants of “Stagnantland” are not the only ones to blame for their devotion to quietness. The few and exceptional proponents of creative destruction who live in Stagnantland are mostly keen on the second term of the concept. That is why some love to say, from time to time, “we are all stagnationists” – the few contrarians are just Kalki’s devotees.

These imaginary people love to spend their vacations abroad, particularly on a legendary island named “Revolution.” Paradoxically, on Revolution Island the Revolutionary government found a way to avoid any kind of counter-revolutionary innovation. It is hardly necessary to mention that Revolution Island is, by far, Stagnantlanders’ favourite holiday destination.

They show photos from their last vacation on Revolution Island and proudly stress: “Look: they left the buildings as they were back in 1950!!! Awesome!!!” If you dare to point out that the picture resembles a city at war, that the 1950 buildings lack any maintenance or refurbishment, they will not get irritated. They will simply smile at you and reply smugly: “but they are happy!”

Actually, for Stagnantlanders, as for many others, ignorance is bliss, but their governments do not need to resort to such rudimentary devices as censorship and spying to prevent people from learning about the innovations and discoveries occurring in other countries, as Revolution Island’s rulers sadly do. Stagnantlanders simply reject any innovation as an article of faith!

Notwithstanding this, they allow themselves some guilty pleasures: they love to use smartphones brought in by small-scale smuggling and to watch contemporary foreign films which, despite being realistic, look to them like a dystopian future.

As everything has deteriorated, progress is always a going back to an ancient and glorious time. In Stagnantland, things are not created, but restored. As with Parmenides, they do not believe in movement, but if there has to be an arrow of time, you had better point it toward the past.

Moreover, Stagnantland is an imaginary country because it lacks not only duration but territory as well. As a matter of fact, no man inhabits Stagnantland; it is rather stagnation that inhabits the hearts of Stagnantlanders. That is how, from dusk to dawn, any territory could be fully conquered by this sympathy for stagnation.

Nevertheless, if we scrutinise the question with due diligence, we will discover that stagnation is not an ineluctable future, but our common past. Human beings appeared long before civilisation. So all those generations must have been doing something before agriculture, commerce, and institutions.

Before the concept of creative destruction was formulated by Joseph Schumpeter, a prior conception of how people are conditioned by institutions was needed: Bernard Mandeville pointed out how private vices might turn into public benefits, if politicians arranged the correct set of incentives. The main issue, thus, should be the process of discovering such institutions.

That is why this aversion to competition and innovation is hardly a problem of a misguided sense of justice, but mostly a matter of what we could coin “bounded imagination”: the difficulty reason has in dealing with complex phenomena. Don’t you think so, Horatio?

Nightcap

  1. Libertarian populism is still relevant in the Age of Trump Kevin Boyd, American Conservative
  2. What others have said about America James Poulos, Law & Liberty
  3. In praise of Viktor Orbán Lee Congdon, Modern Age
  4. Beyond the SETI paradigm Nick Nielsen, Grand Strategy Annex

Nightcap

  1. A history of true civilisation is not one of monuments David Wengrow, Aeon
  2. Feminism in Saudi Arabia Lindsey Hilsum, New York Review of Books
  3. The artwork of proto-Surrealist JJ Grandville Patricia Mainardi, Public Domain Review
  4. The other protest: Gazans against Hamas Shlomi Eldar, Al-Monitor

Nightcap

  1. Roosevelt, Taft, and the Nasty 1912 GOP Convention Rick Brownell, Historiat
  2. In praise of (occasional) bad manners Freya Johnston, Prospect
  3. What Should America Expect from a More Originalist Supreme Court? David French, National Review
  4. Why ultra-nationalists exceeded expectations in Turkey’s elections Pinar Tremblay, Al-Monitor

The death of reason

“In so far as their only recourse to that world is through what they see and do, we may want to say that after a revolution scientists are responding to a different world.”

Thomas Kuhn, The Structure of Scientific Revolutions, p. 111

I can remember arguing with my cousin right after Michael Brown was shot. “It’s still unclear what happened,” I said, “based solely on testimony” — at that point, we were still waiting on the federal autopsy report by the Department of Justice. He said that in the video, you can clearly see Brown, back to the officer and with his hands up, as he is shot up to eight times.

My cousin doesn’t like police. I’m more ambivalent, but I’ve studied criminal justice for a few years now, and I thought that if both of us watched this video (no such video actually existed), it was probably I who would have the more nuanced grasp of what happened. So I said: “Well, I will look up this video, try and get a less biased take and get back to you.” He replied, sarcastically, “You can’t watch it without bias. We all have biases.”

And that seems to be the sentiment of the times: bias encompasses the human experience; it subsumes all judgments and perceptions. Biases are so rampant, in fact, that no objective analysis is possible. These biases may be cognitive, like confirmation bias, emotional fallacies or the phenomenon of constructive memory; or inductive, like selectivity or ignoring base probability; or, as it has become common to think, ingrained in experience itself.

The thing about biases is that they are open to psychological evaluation. There are precedents for eliminating them. For instance, one common explanation of racism is that familiarity breeds acceptance and unfamiliarity breeds intolerance (as Reason points out, people further from fracking sites have more negative opinions on the practice than people closer to them). So to curb racism (a sort of bias), children should interact with people outside their own ethnic group. More clinical methodology seeks to transform mental functions that are automatic into controlled ones, and thereby introduce reflective measures into perception, reducing bias. Apart from these, there is the ancient Greek practice of reasoning, wherein patterns and evidence are used to generate logical conclusions.

If it were true that human bias is all-encompassing and essentially insurmountable, the whole concept of critical thinking would go out the window. Not only do we lose the critical-rationalist, Popperian mode of discovery, but also the Socratic dialectic, as “higher truths” essentially disappear from the human lexicon.

The belief that biases are intrinsic to human judgment ignores psychological and philosophical methods to counter prejudice, because it posits that objectivity itself is impossible. This viewpoint has been associated with “postmodern” schools of philosophy, such as those Dr. Rosi commented on (e.g., those of Derrida, Lacan, Foucault, Butler), although it’s worth pointing out that the analytic tradition, with its origins in Frege, Russell and Moore, represents a far greater break from the previous, modern tradition of Descartes and Kant, and often reached similar conclusions to the Continentals.

Although theorists of the “postmodern” clique produced diverse claims about knowledge, society, and politics, the most famous figures are almost always associated with or incorporated into the political left. To make a useful simplification of viewpoints: it would seem that progressives have generally accepted Butlerian non-essentialism about gender and Foucauldian terminology (discourse and institutions). Derrida’s poststructuralist critique noted dichotomies and also claimed that the philosophical search for Logos has been patriarchal, almost neoreactionary. (The month before Donald Trump’s victory, the word patriarchy hit an all-time high as a Google search term.) It is not a far-right conspiracy that European philosophers with strange theories have influenced and sought to influence American society; it is patent in the new political language.

Some people think of the postmodernists as all social constructivists, holding the theory that many of the categories and identifications we use in the world are social constructs without a human-independent nature (e.g., not natural kinds). Disciplines like anthropology and sociology long ago dipped their toes into these waters, and the broader academic community, too, now holds that things like gender and race are social constructs. But the ideas can and do go further: “facts” themselves are open to interpretation on this view; to even assert a “fact” is just to affirm power of some sort. This worldview subsequently degrades the status of science into an extended apparatus for confirmation bias, filling out the details of a committed ideology rather than providing us with new facts about the world. There can be no objectivity outside of a worldview.

Even though philosophy took a naturalistic turn with the philosopher W. V. O. Quine, seeing itself as integrating with and working alongside science, the criticisms of science as an establishment that emerged in the 1950s and 60s (and earlier) often disturbed its unique epistemic privilege in society: ideas that theory is underdetermined by evidence, that scientific progress is nonrational, that unconfirmed auxiliary hypotheses are required to conduct experiments and form theories, and that social norms play a large role in the process of justification all damaged the mythos of science as an exemplar of human rationality.

But once we have dismantled Science, what do we do next? Some critics have held up Nazi German eugenics and phrenology as examples of the damage that science can do to society (never mind that we now consider them pseudoscience). Yet Lysenkoism and the history of astronomy and cosmology indicate that suppressing scientific discovery can be just as deleterious. The Austrian physicist and philosopher Paul Feyerabend instead wanted a free society — one where science had equal power with older, more spiritual forms of knowledge. He thought the model of rational science exemplified by Sir Karl Popper was inapplicable to the real machinery of scientific discovery, and that the only methodological rule we could impose on science was: “anything goes.”

Feyerabend’s views are almost a caricature of postmodernism, although he denied the label “relativist,” opting instead for philosophical Dadaist. In his pluralism, there is no hierarchy of knowledge, and state power can even be introduced when necessary to break up scientific monopoly. Feyerabend, contra scientists like Richard Dawkins, thought that science was like an organized religion and therefore supported a separation of church and state as well as a separation of state and science. Here is a move forward for a society that has started distrusting the scientific method… but if this is what we should do post-science, it’s still unclear how to proceed. There are still queries for anyone who loathes the hegemony of science in the Western world.

For example, how does the investigation of crimes proceed without strict adherence to the latest scientific protocol? Presumably, Feyerabend didn’t want to privatize law enforcement, but science and the state are very intricately connected. In 2005, Congress authorized the National Academy of Sciences to form a committee and conduct a comprehensive study of contemporary forensic science to identify community needs, evaluating laboratory executives, medical examiners, coroners, anthropologists, entomologists, odontologists, and various legal experts. Forensic science — scientific procedure applied to the field of law — exists for two practical goals: exoneration and prosecution. However, the Forensic Science Committee revealed that severe issues riddle forensics (e.g., bite mark analysis), and at the top of its list of recommendations is establishing an independent federal entity to devise consistent standards and enforce regular practice.

For top scientists, this sort of centralized authority seems necessary to produce reliable work, and it disagrees entirely with Feyerabend’s emphasis on methodological pluralism. Barack Obama formed the National Commission on Forensic Science in 2013 to further investigate problems in the field, and only recently Attorney General Jeff Sessions said the Department of Justice would not renew the committee. It’s unclear now what forensic science will do to resolve its ongoing problems, but what is clear is that the American court system would fall apart without the possibility of appealing to scientific consensus (especially in forensics), and that the only foreseeable way to solve the existing issues is through stricter methodology. (Just as with McDonald’s, there are enforced standards so that the product is consistent wherever one orders.) More on this later.

So it doesn’t seem to be in the interest of things like due process to abandon science or completely separate it from state power. (It does make sense, however, to move forensic laboratories out from under direct administrative control, as the NAS report notes in Recommendation 4; this is specifically to reduce bias.) In a culture where science is viewed as irrational, Eurocentric, ad hoc, and polluted with ideological motivations — or where Reason itself is seen as a particular hegemonic, imperial device to suppress different cultures — not only do we not know what to do, but when we try to do things we lose elements of our civilization that everyone agrees are valuable.

Although Aristotle separated pathos, ethos and logos (adding that all informed each other), later philosophers like Feyerabend thought of reason as a sort of “practice,” with history and connotations like any other human activity, falling far short of sublime. One could no more justify reason outside of its European cosmology than the sacrificial rituals of the Aztecs outside of theirs. To communicate across paradigms, participants have to understand each other on a deep level, even becoming entirely new persons. When debates happen, they must happen on a principle of mutual respect and curiosity.

From this one can detect a bold argument for tolerance. Indeed, Feyerabend was heavily influenced by John Stuart Mill’s On Liberty. Maybe, in a world disillusioned with scientism and objective standards, the next cultural move is multilateral acceptance and tolerance for each others’ ideas.

This has not been the result of postmodern revelations, though. The 2016 election featured the victory of one psychopath over another, from two camps utterly consumed with vitriol for each other. Between Bernie Sanders, Donald Trump and Hillary Clinton, Americans drifted toward radicalization as the only establishment candidate seemed to offer the same noxious, warmongering mess of the previous few decades of administration. Politics has only polarized further since the inauguration. The alt-right, a nearly perfect symbol of cultural intolerance, is regular news for the mainstream media. Trump acolytes physically brawl with black bloc Antifa in the same city that hosted the 1960s Free Speech Movement. It seems to be worst at universities. Analytic feminist philosophers asked for the retraction of a controversial paper, seemingly without reading it. Professors even get involved in student disputes, at Berkeley and more recently at Evergreen. The names each side uses to attack the other (“fascist,” most prominently) — sometimes accurate, usually not — display a political divide between groups that increasingly refuse to argue their own side and prefer silencing their opposition.

There is no longer a tolerant left or a tolerant right in the mainstream. We are witnessing only shades of authoritarianism, eager to destroy each other. And what is obvious is that the theories and tools of the postmodernists (post-structuralism, social constructivism, deconstruction, critical theory, relativism) are as useful for reactionary praxis as they are in their usual role in left-wing circles. Says Casey Williams in the New York Times: “Trump’s playbook should be familiar to any student of critical theory and philosophy. It often feels like Trump has stolen our ideas and weaponized them.” The idea of the “post-truth” world originated in postmodern academia. It is the monster turning against Doctor Frankenstein.

Moral (cultural) relativism in particular promises only the rejection of our shared humanity. It paralyzes our judgment on female genital mutilation, flogging, stoning, human and animal sacrifice, honor killings, caste, and the underground sex trade. The afterbirth of Protagoras, cruelly resurrected once again, does not promise trials at Nuremberg, where the Allied powers appealed to something above and beyond written law to exact judgment on mass murderers. It does not promise justice for the ethnic cleansers of Srebrenica, as the United Nations is helpless to impose a tribunal from outside Bosnia-Herzegovina. Today, this moral pessimism laughs at the phrase “humanitarian crisis,” and at Western efforts to change the material conditions of fleeing Iraqis, Afghans, Libyans, Syrians, Venezuelans, North Koreans…

In the absence of universal morality, and the introduction of subjective reality, the vacuum will be filled with something much more awful. And we should be afraid of this because tolerance has not emerged as a replacement. When Harry Potter first encounters Voldemort face-to-scalp, the Dark Lord tells the boy “There is no good and evil. There is only power… and those too weak to seek it.” With the breakdown of concrete moral categories, Feyerabend’s motto — anything goes — is perverted. Voldemort has been compared to Plato’s archetype of the tyrant from the Republic: “It will commit any foul murder, and there is no food it refuses to eat. In a word, it omits no act of folly or shamelessness” … “he is purged of self-discipline and is filled with self-imposed madness.”

Voldemort is the Platonic appetite in the same way he is the psychoanalytic id. Freud’s das Es is able to admit of contradictions, to violate Aristotle’s fundamental laws of logic. It is so base, and removed from the ordinary world of reason, that it follows its own rules we would find utterly abhorrent or impossible. But it is not difficult to imagine that the murder of evidence-based reasoning will result in Death Eater politics. The ego is our rational faculty, adapted to deal with reality; with the death of reason, all that exists is vicious criticism and unfettered libertinism.

Plato predicts Voldemort with the image of the tyrant, and also with one of his primary interlocutors, Thrasymachus, when the sophist opens with “justice is nothing other than the advantage of the stronger.” The one thing Voldemort admires about The Boy Who Lived is his bravery, the trait they share in common. This trait is missing in his Death Eaters. In the fourth novel the Dark Lord is cruel to his reunited followers for abandoning him and losing faith; their cowardice reveals the fundamental logic of his power: his disciples are not true devotees, but opportunists, weak on their own merit and drawn like moths to every Avada Kedavra. Likewise students flock to postmodern relativism to justify their own beliefs when the evidence is an obstacle.

Relativism gives us moral paralysis, letting darkness in. Another possible move after relativism is supremacy. One look at Richard Spencer’s Twitter demonstrates the incorrigible tenet of the alt-right: the alleged incompatibility of cultures, ethnicities, and races, i.e. that different groups of humans simply cannot get along. The Final Solution is not about extermination anymore but about segregated nationalism. Spencer’s audience is almost entirely men who loathe the current state of things, who share far-reaching conspiracy theories, and who despise globalism.

The left, too, creates conspiracies, imagining a bourgeois corporate conglomerate that enlists economists and brainwashes through history books to normalize capitalism; for this reason they despise globalism as well, saying it impoverishes other countries or destroys cultural autonomy. For the alt-right, it is the Jews, and George Soros, who control us; for the burgeoning socialist left, it is the elites, the one-percent. Our minds are not free; fortunately, they will happily supply Übermenschen, in the form of statesmen or critical theorists, to save us from our degeneracy or our false consciousness.

Without the commitment to reasoned debate, tribalism has deepened the polarization and the erosion of humility. Each side also accepts science selectively, when it does not question its very justification. The privileged status that the “scientific method” maintains in polite society is denied when convenient; whether it is climate science, evolutionary psychology, sociology, genetics, biology, anatomy or, especially, economics, one side or the other rejects it outright, without studying the material enough to immerse themselves in what could be promising knowledge (as Feyerabend urged, and as the breakdown of rationality could have encouraged). And ultimately, equal protection, one tenet of individualist thought that allows for multiplicity, is entirely rejected by both: we should be treated differently as humans, often because of the color of our skin.

Relativism and carelessness about standards and communication have given us supremacy and tribalism. They have divided rather than united. Voldemort’s chaotic violence is one possible outcome of rejecting reason as an institution, and it beckons to either political alliance. Are there any examples in Harry Potter of the alternative, Feyerabendian tolerance? Not quite. However, Hermione Granger serves as the Dark Lord’s foil, and gives us a model of reason that is not as archaic as the enemies of rationality would like to suggest. In Against Method (1975), Feyerabend compares different ways rationality has been interpreted alongside practice: in an idealist way, in which reason “completely governs” research, or a naturalist way, in which reason is “completely determined by” research. Taking elements of each, he arrives at an intersection in which one can change the other, both “parts of a single dialectical process.”

“The suggestion can be illustrated by the relation between a map and the adventures of a person using it or by the relation between an artisan and his instruments. Originally maps were constructed as images of and guides to reality and so, presumably, was reason. But maps, like reason, contain idealizations (Hecataeus of Miletus, for example, imposed the general outlines of Anaximander’s cosmology on his account of the occupied world and represented continents by geometrical figures). The wanderer uses the map to find his way but he also corrects it as he proceeds, removing old idealizations and introducing new ones. Using the map no matter what will soon get him into trouble. But it is better to have maps than to proceed without them. In the same way, the example says, reason without the guidance of a practice will lead us astray while a practice is vastly improved by the addition of reason.” p. 233

Christopher Hitchens pointed out that Granger sounds like Bertrand Russell at times, as in this quote about the Resurrection Stone: “You can claim that anything is real if the only basis for believing in it is that nobody has proven it doesn’t exist.” Granger is often the embodiment of anemic analytic philosophy, the institution of order, a disciple for the Ministry of Magic. However, though initially law-abiding, she quickly learns with Potter and Weasley the pleasures of rule-breaking. From the first book onward, she is constantly at odds with the de facto norms of the school, becoming more rebellious as time goes on. It is her levelheaded foundation, combined with her ability to transgress rules, that gives her an astute semi-deontological, semi-utilitarian calculus capable of saving the lives of her friends from the dark arts, and helping to defeat the tyranny of Voldemort foretold by Socrates.

Granger presents a model of reason like Feyerabend’s map analogy. Although pure reason gives us an outline of how to think about things, it is not a static or complete blueprint, and it must be fleshed out with experience, risk-taking, discovery, failure, loss, trauma, pleasure, offense, criticism, and occasional transgressions past the foreseeable limits. Adding these addenda to our heuristics means that we explore a more diverse account of thinking about things and moving around in the world.

When reason is increasingly seen as patriarchal, Western, and imperialist, the only thing consistently offered as a replacement is something like lived experience. Some form of this idea is at least a century old, going back to Husserl, and still modest by reason’s Greco-Roman standards. Yet lived experience has always been pivotal to reason; we only need adjust our popular model. And we can see that we need not reject one or the other entirely. Another critique of reason says it is foolhardy, limiting, antiquated; this is a perversion of its abilities, and serves to justify the first criticism. We can see that there is room within reason for other pursuits and virtues, picked up along the way.

The emphasis on lived experience, which predominantly comes from the political left, is also antithetical to the cause of “social progress.” Those sympathetic to social theory, particularly the cultural leakage of the strong programme, are constantly torn between claiming (a) that science is irrational, and can thus be countered by lived experience (or whatnot), or (b) that science may be rational but reason itself is a tool of patriarchy and white supremacy and cannot be universal. (If you haven’t seen either of these claims very frequently, and think them a strawman, you have not been following university protests and editorials. Or radical Twitter: ex., ex., ex., ex.) Of course, as in Freud, this is an example of kettle logic: the signal of a very strong resistance. We see, though, that we need not accept or deny these claims and lose anything. Reason need not be stagnant nor all-pervasive, and indeed we’ve been critiquing its limits since 1781.

Outright denying the process of science — whether the model is conjectures and refutations or something less stale — ignores that there is no single uniform body of science. Denial also dismisses the most powerful tool for making difficult empirical decisions. Michael Brown’s death was instantly a political affair, with implications for broader social life. The event has completely changed the face of American social issues. The first autopsy report, from St. Louis County, indicated that Brown was shot at close range in the hand, during an encounter with Officer Darren Wilson. The second independent report commissioned by the family concluded the first shot had not in fact been at close range. After the disagreement with my cousin, the Department of Justice released the final investigation report, and determined that material in the hand wound was consistent with gun residue from an up-close encounter.

Prior to the report, the best evidence available as to what happened in Missouri on August 9, 2014, was the ground footage after the shooting and testimonies from the officer and Ferguson residents at the scene. There are two ways to approach the incident: reason or lived experience. The latter route will lead to ambiguities. Brown’s friend Dorian Johnson and another witness reported that Officer Wilson fired his weapon first at range, under no threat, then pursued Brown out of his vehicle, until Brown turned with his hands in the air to surrender. However, in the St. Louis grand jury half a dozen (African-American) eyewitnesses corroborated Wilson’s account: that Brown did not have his hands raised and was moving toward Wilson. In which direction does “lived experience” tell us to go, then? A new moral maxim — the duty to believe people — will lead to no non-arbitrary conclusion. (And a duty to “always believe x,” where x is a closed group, e.g. victims, will put the cart before the horse.) It appears that, in a case like this, treating evidence as objective is the only solution.

Introducing ad hoc hypotheses, e.g., the Justice Department and the county examiner are corrupt, shifts the approach into one that uses induction, and leaves behind lived experience (and also ignores how forensic anthropology is actually done). This is the introduction of, indeed, scientific standards. (By looking at incentives for lying it might also employ findings from public choice theory, psychology, behavioral economics, etc.) So the personal experience method creates unresolvable ambiguities, and presumably will eventually grant some allowance to scientific procedure.

If we don’t posit a baseline-rationality — Hermione Granger pre-Hogwarts — our ability to critique things at all disappears. Utterly rejecting science and reason, denying objective analysis in the presumption of overriding biases, breaking down naïve universalism into naïve relativism — these are paths to paralysis on their own. More than that, they are hysterical symptoms, because they often create problems out of thin air. Recently, a philosopher and mathematician submitted a hoax paper, Sokal-style, to a peer-reviewed gender studies journal in an attempt to demonstrate what they see as a problem “at the heart of academic fields like gender studies.” The idea was to write a nonsensical, postmodernish essay, and if the journal accepted it, that would indicate the field is intellectually bankrupt. Andrew Smart at Psychology Today instead wrote of the prank: “In many ways this academic hoax validates many of postmodernism’s main arguments.” And although Smart makes some informed points about problems in scientific rigor as a whole, he doesn’t hint at what the validation of postmodernism entails: should we abandon standards in journalism and scholarly integrity? Is the whole process of peer-review functionally untenable? Should we start embracing papers written without any intention of making sense, to look at knowledge concealed below the surface of jargon? The paper, “The conceptual penis,” doesn’t necessarily condemn the whole of gender studies; but, against Smart’s reasoning, we do in fact know that counterintuitive or highly heterodox theory is considered perfectly average.

There were other attacks on the hoax, from Slate, Salon and elsewhere. The criticisms, often valid for the particular essay, typically didn’t move the conversation far enough. There is much more to this discussion. A 2006 paper from the International Journal of Evidence Based Healthcare, “Deconstructing the evidence-based discourse in health sciences,” called the use of scientific evidence “fascist.” In the abstract the authors state their allegiance to the work of Deleuze and Guattari. Real Peer Review, a Twitter account that collects abstracts from scholarly articles, regularly features essays from departments of women’s and gender studies, including a recent one from a Ph.D. student wherein the author identifies as a hippopotamus. Sure, the recent hoax paper doesn’t really say anything, but it intensifies this much-needed debate. It brings out these two currents — reason and the rejection of reason — and demands a solution. And we know that lived experience is often going to be inconclusive.

Opening up lines of communication is a solution. One valid complaint is that gender studies seems too insulated, in a way in which chemistry, for instance, is not. Critiquing a whole field does ask us to genuinely immerse ourselves first, and this is a step toward tolerance: it is a step past the death of reason and the denial of science. It is a step that requires opening the bubble.

The modern infatuation with human biases, as well as Feyerabend’s epistemological anarchism, upsets our faith in prevailing theories and in the idea that our policies and opinions should be guided by the latest discoveries from an anonymous laboratory. Putting politics first and assuming subjectivity is all-encompassing, we move past objective measures for comparing belief systems and theories. However, isn’t the whole operation of modern science designed to work within our means? Kant’s system set limits on human rationality, and most science is aligned with an acceptance of fallibility. As Harvard cognitive scientist Steven Pinker says, “to understand the world, we must cultivate work-arounds for our cognitive limitations, including skepticism, open debate, formal precision, and empirical tests, often requiring feats of ingenuity.”

Pinker goes so far as to advocate for scientism. Others need not; but we must understand an academic field before utterly rejecting it. We must think we can understand each other, and live with each other. We must think there is a baseline framework that allows permanent cross-cultural correspondence — a shared form of life which means a Ukrainian can interpret a Russian and a Cuban an American. The rejection of Homo sapiens commensurability, championed by people like Richard Spencer and those in identity politics, is a path to segregation and supremacy. We must reject Gorgian nihilism about communication, and the Presocratic relativism that camps our moral judgments in inert subjectivity. From one Weltanschauung to the next, our common humanity — which endures class, ethnicity, sex, gender — allows open debate across paradigms.

In the face of relativism, there is room for a nuanced middleground between Pinker’s scientism and the rising anti-science, anti-reason philosophy; Paul Feyerabend has sketched out a basic blueprint. Rather than condemning reason as a Hellenic germ of Western cultural supremacy, we need only adjust the theoretical model to incorporate the “new America of knowledge” into our critical faculty. It is the raison d’être of philosophers to present complicated things in a more digestible form; to “put everything before us,” so says Wittgenstein. Hopefully, people can reach their own conclusions, and embrace the communal human spirit as they do.

However, this may not be so convincing. It might be true that we have a competition of cosmologies: one that believes in reason and objectivity, and one that thinks reason is callow and all things are subjective. These two perspectives may well be incommensurable. If I try to defend reason, I invariably must appeal to reasons, and thus argue circularly. If I try to claim “everything is subjective,” I make a universal statement, and simultaneously contradict myself. Between begging the question and contradicting oneself, there is not much indication of where to go. Perhaps we just have to look at history, note the results of either course when it has been applied, and take that as a rhetorical indication of which path to choose.