Where are our manners?

“Manners Makyth Man.” William of Wykeham said that back in a distant past when the letter “y” was at peak popularity. I thought of that quote today as I read about the shrill outrage over Karen Pence’s unremarkable job at a Christian school. There’s a great speech expounding on William of Wykeham’s quote, delivered about a century ago in London by Lord John Fletcher Moulton. He entitled his speech “Law and Manners,” and its message could use another go-round.

Lord Moulton’s speech begins by dividing human action into three domains: the domain of positive law, the domain of absolute choice, and the domain of what he calls “manners.” This last domain is his essential topic, which he defines as “obedience to the unenforceable.”

Manners, by which he means something akin to duty or morality but encompassing more than both, are sandwiched between the worlds of positive law and absolute choice. This realm of manners is where we may act as we choose but nonetheless face constraints outside the force of law. His basic premise is that the larger the middle domain, the healthier the society. He says, “The true test is the extent to which individuals composing the nation can be trusted to obey self-imposed law.” Encroachment from the realms of positive law and absolute choice poses a danger.

Lord Moulton does not suggest that the two outer domains are bad. They are vital. But if either expands too far into the middle, trouble awaits. If positive law expands too far, it stifles the freedom necessary for a flourishing society. On the other hand, if people feel completely unrestrained in their exercise of freedom, civil society begins to sag, and the danger that positive law will sweep in to pick up a perceived slack increases. As one religious leader put it, “We would not accept the yoke of Christ; so now we must tremble at the yoke of Caesar.”

Given these threats to the middle domain, Lord Moulton feared that “the worst tyranny will be found in democracies.” Minority interests will get chewed up by the voracious appetite of a positive law driven by a majority.  The representatives of the majority “think that the power and the will to legislate amount to a justification for that legislation. Such a principle would be death to liberty. No part of our life would be secure from interference from without. If I were asked to define tyranny, I would say it was yielding to the lust of governing.”

The maintenance of the middle domain depends on the growth of a robust civil society sheltered from majority dominance. Religion, culture, tradition, diasporas—communities independent of the state must exist with some genuine autonomy for the middle domain to survive and thrive.

And this brings me back to Karen Pence working at a Christian school that (trigger outrage) requires students and teachers to abide by traditional Christian values. Whether those values are correct is not at all the point. Those eager to slap down a law at the first hint of a disagreement need to understand that tolerance for even genuinely illiberal viewpoints is essential to the success of liberal democracy. Organizations must have some power to define themselves apart from the prerogatives of the state in order to establish a framework for obedience to the unenforceable. As the Supreme Court put it, people must have space to organize communities separate from state interference that can serve as competing purveyors of norms. Such groups provide an essential “counterweight . . . to the State’s impulse to hegemony.” Thus, organizations that can establish their own norms apart from majority interference prevent the encroachment of positive law into the middle domain.

I worry that we are seeing simultaneous encroachment from both the realms of positive law and absolute choice. People outraged at Karen Pence’s new job feel convinced that the positive law should thrust its tentacles into group dynamics, thereby swallowing civil society into an all-pervading state orthodoxy. On the other hand, a sneering sense of moral relativity that frowns upon any attempt to speak up for solid norms encroaches from the other end—the perversion of tolerance that believes in no genuine moral structure outside what the law “makyth.” The letter “Y” may be a consonant and a vowel, but that doesn’t mean we can live without unenforced rules. Lord Moulton warned us about this. It’s time we mind our manners.

Law, Judgement, Republicanism

Draft material for a joint conference paper; work in progress on a long-term project

This paper comes out of a long-term project on ideas of liberty in relation to republicanism in political thought, along with issues of law and sovereignty. The paper in question comes out of collaborative work on questions of law, judgement, and republicanism in relation to Turkey’s history and its current politics. Though it draws on that collaborative work, I take sole responsibility for this iteration of draft material towards a joint conference paper, drafted with the needs of a blog with a broad audience in mind.

The starting point is Immanuel Kant’s view of law and judgement. His jurisprudence, mostly to be found in the first part of the Metaphysics of Morals, ‘The Doctrine of Right’, grounds law in morality, and so offers an alternative to legal positivism. The argument here is not to take his explicit jurisprudence as the foundation of legal philosophy; there is another way of looking at Kant’s jurisprudence, which will be discussed shortly.

What is particularly valuable at this point is that Kant suggests an alternative to legal positivism and to the utilitarian ethics with which it has affinities, particularly in Jeremy Bentham. Legal positivism refers to a position in which laws are commands understood only as commands, without regard to broader principles of justice. It is historically rooted in the idea of the political sovereign as the author of laws. Historically, such a way of thinking about law was embedded in what is known to us as natural law, that is, ideas of universal rules of justice. This began with a highly sacralised view of law as coming from the cosmos and the divine, in which the sovereign is part of divinely ordained law. Over time this conception developed into the idea of law as an autonomous institution resting on sovereign will. Positivism develops from such an idea of legal sovereignty, leaving no impediment to the sovereign will.

Kant’s understanding of morality roots law in the ideas of rationality, universality, human community, autonomy, and individual ends that are central to his moral philosophy. The critique of legal positivism is necessary for understanding law in relation to politics and citizenship in ways which do not leave a sovereign will with unlimited power over law. Kant’s view of judgement suggests a way of taking his morality and jurisprudence out of the idealist abstraction he tends towards. His philosophy of judgement can be found in the Critique of the Power of Judgement, divided into parts on aesthetic judgements of beauty and teleological judgements of nature.

The important aspect here is aesthetic judgement, given political significance through the interpretation of Hannah Arendt. From Arendt we can take an understanding of Kant’s attempts at a moral basis for law, one that treats political judgement as an autonomous, though related, area. On this basis it can be said that the judgement necessary for legal process, which brings particular cases under a universal rule through a non-deterministic and subjective activity on the model of Kant’s aesthetic judgement, is at the root of politics.

Politics is a process of public judgement about particular cases in relation to the moral principles at the basis of politics. The making of laws is at the centre of the political process, and the application of law in court should also have a public aspect. We can see a model of a kind in antiquity in the citizen juries of ancient Athens, selected by lottery to serve in the law courts. It is Roman law that tends to impose a state-oriented view of law, in which the will of the sovereign is applied in a very absolutist way, so that in the end the Emperor is the highest law-maker and the highest judge of the laws.

As Michel Foucault argues, and Montesquieu before him, the Germanic tribes which took over Roman lands had more communal and less rigidly defined forms of court judgement, and were more concerned with negotiating social peace than with applying laws rigidly to cases. Foucault showed how law always has some political significance with regard to the ways in which sovereignty works and power is felt. That is, the law and the work of the courts are a demonstration of sovereignty, while punishment is concerned with the ways that sovereignty is embedded in power, and with how that power is exercised on the body to form a kind of model subjugation to sovereignty. The Foucauldian perspective should not, however, be one in which everything to do with the laws, the courts, and methods of punishment is an expression of politics narrowly understood.

The point is to understand sovereignty as a whole, including the inseparability of the institutions of justice from the political state. The accountability of the state and the accountability of justice must be taken together. Both should work in the context of public accessibility and public discussion. The ways in which laws, courts, and judges can be accountable to ideas of autonomy must be declared and debated. Courts should be understood as ways of addressing social harms and finding reconciliation rather than as the imposition of state-centric declarations of law.

The death of reason

“In so far as their only recourse to that world is through what they see and do, we may want to say that after a revolution scientists are responding to a different world.”

Thomas Kuhn, The Structure of Scientific Revolutions p. 111

I can remember arguing with my cousin right after Michael Brown was shot. “It’s still unclear what happened,” I said, “based solely on testimony” — at that point, we were still waiting on the federal autopsy report by the Department of Justice. He said that in the video, you can clearly see Brown, back to the officer and with his hands up, as he is shot up to eight times.

My cousin doesn’t like police. I’m more ambivalent, but I’ve studied criminal justice for a few years now, and I thought that if both of us watched this video (no such video actually existed), it was probably I who would have the more nuanced grasp of what happened. So I said: “Well, I will look up this video, try and get a less biased take and get back to you.” He replied, sarcastically, “You can’t watch it without bias. We all have biases.”

And that seems to be the sentiment of the times: bias encompasses the human experience; it subsumes all judgments and perceptions. Biases are so rampant, in fact, that no objective analysis is possible. These biases may be cognitive, like confirmation bias, emotional fallacies, or the phenomenon of constructive memory; or inductive, like selectivity or ignoring base probability; or, as has become common to think, ingrained into experience itself.

The thing about biases is that they are open to psychological evaluation, and there are precedents for eliminating them. For instance, one common explanation of racism is that familiarity breeds acceptance and unfamiliarity breeds intolerance (as Reason points out, people further from fracking sites have more negative opinions of the practice than people closer to them). So to curb racism (a sort of bias), children should interact with people outside their own ethnic group. More clinical methodology seeks to transform mental functions from automatic to controlled, and thereby introduce reflective measures into perception, reducing bias. Apart from these, there is the ancient Greek practice of reasoning, wherein patterns and evidence are used to generate logical conclusions.

If it were true that human bias is all-encompassing and essentially insurmountable, the whole concept of critical thinking would go out the window. Not only would we lose the critical-rationalist, Popperian mode of discovery, but also the Socratic dialectic, as “higher truths” essentially disappear from the human lexicon.

The belief that biases are intrinsic to human judgment ignores psychological and philosophical methods to counter prejudice, because it posits that objectivity itself is impossible. This viewpoint has been associated with “postmodern” schools of philosophy, such as those Dr. Rosi commented on (e.g., those of Derrida, Lacan, Foucault, Butler), although it’s worth pointing out that the analytic tradition, with its origins in Frege, Russell, and Moore, represents a far greater break from the previous, modern tradition of Descartes and Kant, and often reached similar conclusions to the Continentals.

Although theorists of the “postmodern” clique produced diverse claims about knowledge, society, and politics, the most famous figures are almost always associated with or incorporated into the political left. To make a useful simplification of viewpoints: it would seem that progressives have generally accepted Butlerian non-essentialism about gender and Foucauldian terminology (discourse and institutions). Derrida’s poststructuralist critique noted dichotomies and also claimed that the philosophical search for Logos has been patriarchal, almost neoreactionary. (The month before Donald Trump’s victory, the word patriarchy hit an all-time high in Google searches.) It is not a far-right conspiracy that European philosophers with strange theories have influenced and sought to influence American society; it is patent in the new political language.

Some people think of the postmodernists as all social constructivists, holding the theory that many of the categories and identifications we use in the world are social constructs without a human-independent nature (e.g., not natural kinds). Disciplines like anthropology and sociology have long since dipped their toes, and the broader academic community, too, accepts that things like gender and race are social constructs. But the ideas can and do go further: “facts” themselves are open to interpretation on this view; to even assert a “fact” is just to affirm power of some sort. This worldview subsequently degrades the status of science into an extended apparatus for confirmation bias, filling out the details of a committed ideology rather than providing us with new facts about the world. There can be no objectivity outside of a worldview.

Even though philosophy took a naturalistic turn with the philosopher W. V. O. Quine, seeing itself as integrating with and working alongside science, the criticisms of science as an establishment that emerged in the 1950s and 60s (and earlier) often disturbed its unique epistemic privilege in society: ideas that theory is underdetermined by evidence, that scientific progress is nonrational, that unconfirmed auxiliary hypotheses are required to conduct experiments and form theories, and that social norms play a large role in the process of justification all damaged the mythos of science as an exemplar of human rationality.

But once we have dismantled Science, what do we do next? Some critics have held up Nazi eugenics and phrenology as examples of the damage that science can do to society (never mind that we now consider them pseudoscience). Yet Lysenkoism and the history of astronomy and cosmology indicate that suppressing scientific discovery can be deleterious too. The Austrian physicist and philosopher Paul Feyerabend instead wanted a free society — one where science held power equal to older, more spiritual forms of knowledge. He thought the model of rational science exemplified by Sir Karl Popper was inapplicable to the real machinery of scientific discovery, and that the only methodological rule we could impose on science was: “anything goes.”

Feyerabend’s views are almost a caricature of postmodernism, although he denied the label “relativist,” opting instead for “philosophical Dadaist.” In his pluralism, there is no hierarchy of knowledge, and state power can even be introduced when necessary to break up scientific monopoly. Feyerabend, contra scientists like Richard Dawkins, thought that science was like an organized religion, and therefore supported a separation of state and science to match the separation of church and state. Here is a way forward for a society that has started distrusting the scientific method… but if this is what we should do post-science, it’s still unclear how to proceed. There are still queries for anyone who loathes the hegemony of science in the Western world.

For example, how does the investigation of crimes proceed without strict adherence to the latest scientific protocol? Presumably, Feyerabend didn’t want to privatize law enforcement, but science and the state are very intricately connected. In 2005, Congress authorized the National Academy of Sciences to form a committee and conduct a comprehensive study on contemporary forensic science to identify community needs, evaluating laboratory executives, medical examiners, coroners, anthropologists, entomologists, odontologists, and various legal experts. Forensic science — scientific procedure applied to the field of law — exists for two practical goals: exoneration and prosecution. However, the Forensic Science Committee revealed that severe issues riddle forensics (e.g., bite mark analysis), and at the top of its list of recommendations is establishing an independent federal entity to devise consistent standards and enforce regular practice.

For top scientists, this sort of centralized authority seems necessary to produce reliable work, and it conflicts entirely with Feyerabend’s emphasis on methodological pluralism. Barack Obama formed the National Commission on Forensic Science in 2013 to further investigate problems in the field, and only recently Attorney General Jeff Sessions said the Department of Justice will not renew the commission. It’s unclear now what forensic science will do to resolve its ongoing problems, but what is clear is that the American court system would fall apart without the possibility of appealing to scientific consensus (especially in forensics), and that the only foreseeable way to solve the existing issues is through stricter methodology. (Just as with McDonald’s, standards are enforced so that the product is consistent wherever one orders.) More on this later.

So it doesn’t seem to be in the interest of things like due process to abandon science or completely separate it from state power. (It does make sense, however, to move forensic laboratories out from under direct administrative control, as the NAS report notes in Recommendation 4, though specifically to reduce bias.) In a culture where science is viewed as irrational, Eurocentric, ad hoc, and polluted with ideological motivations — or where Reason itself is seen as a particular hegemonic, imperial device to suppress different cultures — not only do we not know what to do; when we try to do things, we lose elements of our civilization that everyone agrees are valuable.

Although Aristotle separated pathos, ethos, and logos (adding that all inform each other), later philosophers like Feyerabend thought of reason as a sort of “practice,” with a history and connotations like any other human activity, falling far short of the sublime. One could no more justify reason outside of its European cosmology than the sacrificial rituals of the Aztecs outside of theirs. To communicate across paradigms, participants have to understand each other on a deep level, even becoming entirely new persons. When debates happen, they must happen on a principle of mutual respect and curiosity.

From this one can detect a bold argument for tolerance. Indeed, Feyerabend was heavily influenced by John Stuart Mill’s On Liberty. Maybe, in a world disillusioned with scientism and objective standards, the next cultural move is multilateral acceptance of and tolerance for each other’s ideas.

This has not been the result of postmodern revelations, though. The 2016 election featured the victory of one psychopath over another, from two camps utterly consumed with vitriol for each other. Between Bernie Sanders, Donald Trump, and Hillary Clinton, Americans drifted toward radicalization as the only establishment candidate seemed to offer the same noxious, warmongering mess of the previous few decades of administration. Politics has only polarized further since the inauguration. The alt-right, a nearly perfect symbol of cultural intolerance, is regular news for the mainstream media. Trump acolytes physically brawl with black bloc Antifa in the same city that hosted the 1960s Free Speech Movement. It seems to be worst at universities. Analytic feminist philosophers asked for the retraction of a controversial paper, seemingly without reading it. Professors even get involved in student disputes, at Berkeley and more recently at Evergreen. The names each side uses to attack the other (“fascist,” most prominently) — sometimes accurate, usually not — display a political divide between groups that increasingly refuse to argue their own side and prefer silencing their opposition.

There is no tolerant left or tolerant right any longer, in the mainstream. We are witnessing only shades of authoritarianism, eager to destroy each other. And what is obvious is that the theories and tools of the postmodernists (post-structuralism, social constructivism, deconstruction, critical theory, relativism) are as useful for reactionary praxis as they are in their usual left-wing role. Says Casey Williams in the New York Times: “Trump’s playbook should be familiar to any student of critical theory and philosophy. It often feels like Trump has stolen our ideas and weaponized them.” The idea of the “post-truth” world originated in postmodern academia. It is the monster turning against Doctor Frankenstein.

Moral (cultural) relativism in particular promises only the rejection of our shared humanity. It paralyzes our judgment on female genital mutilation, flogging, stoning, human and animal sacrifice, honor killing, caste, the underground sex trade. The afterbirth of Protagoras, cruelly resurrected once again, does not promise trials at Nuremberg, where the Allied powers appealed to something above and beyond written law to exact judgment on mass murderers. It does not promise justice for the ethnic cleansers of Srebrenica, as the United Nations is helpless to impose a tribunal from outside Bosnia-Herzegovina. Today, this moral pessimism laughs at the phrase “humanitarian crisis” and at Western efforts to change the material conditions of fleeing Iraqis, Afghans, Libyans, Syrians, Venezuelans, North Koreans…

In the absence of universal morality, and the introduction of subjective reality, the vacuum will be filled with something much more awful. And we should be afraid of this because tolerance has not emerged as a replacement. When Harry Potter first encounters Voldemort face-to-scalp, the Dark Lord tells the boy “There is no good and evil. There is only power… and those too weak to seek it.” With the breakdown of concrete moral categories, Feyerabend’s motto — anything goes — is perverted. Voldemort has been compared to Plato’s archetype of the tyrant from the Republic: “It will commit any foul murder, and there is no food it refuses to eat. In a word, it omits no act of folly or shamelessness” … “he is purged of self-discipline and is filled with self-imposed madness.”

Voldemort is the Platonic appetite in the same way he is the psychoanalytic id. Freud’s das Es is able to admit of contradictions, to violate Aristotle’s fundamental laws of logic. It is so base, and removed from the ordinary world of reason, that it follows its own rules we would find utterly abhorrent or impossible. But it is not difficult to imagine that the murder of evidence-based reasoning will result in Death Eater politics. The ego is our rational faculty, adapted to deal with reality; with the death of reason, all that exists is vicious criticism and unfettered libertinism.

Plato predicts Voldemort with the image of the tyrant, and also with one of his primary interlocutors, Thrasymachus, when the sophist opens with “justice is nothing other than the advantage of the stronger.” The one thing Voldemort admires about The Boy Who Lived is his bravery, the one trait they share. This trait is missing in his Death Eaters. In the fourth novel the Dark Lord is cruel to his reunited followers for abandoning him and losing faith; their cowardice reveals the fundamental logic of his power: his disciples are not true devotees but opportunists, weak on their own merit and drawn like moths to every Avada Kedavra. Likewise, students flock to postmodern relativism to justify their own beliefs when the evidence is an obstacle.

Relativism gives us moral paralysis, letting in darkness. Another possible move after relativism is supremacy. One look at Richard Spencer’s Twitter demonstrates the incorrigible tenet of the alt-right: the alleged incompatibility of cultures, ethnicities, and races, the idea that different groups of humans simply cannot get along together. The Final Solution is no longer about extermination but about segregated nationalism. Spencer’s audience is almost entirely men who loathe the current state of things, who share far-reaching conspiracy theories, and who despise globalism.

The left, too, creates conspiracy theories, imagining a bourgeois corporate conglomerate that enlists economists and brainwashes through history books to normalize capitalism; for this reason it despises globalism as well, saying it impoverishes other countries or destroys cultural autonomy. For the alt-right, it is the Jews and George Soros who control us; for the burgeoning socialist left, it is the elites, the one percent. Our minds are not free; fortunately, each side will happily supply Übermenschen, in the form of statesmen or critical theorists, to save us from our degeneracy or our false consciousness.

Without a commitment to reasoned debate, tribalism has deepened the polarization and the loss of humility. Each side also accepts science selectively, when it is not questioning its very justification. The privileged status that the “scientific method” maintains in polite society is denied when convenient: whether it is climate science, evolutionary psychology, sociology, genetics, biology, anatomy, or, especially, economics, one side or the other rejects it outright, without studying the material enough to immerse itself in what could be promising knowledge (as Feyerabend urged, and as the breakdown of rationality could have encouraged). And ultimately, equal protection, one tenet of individualist thought that allows for multiplicity, is entirely rejected by both: we should be treated differently as humans, often because of the color of our skin.

Relativism and carelessness about standards and communication have given us supremacy and tribalism. They have divided rather than united. Voldemort’s chaotic violence is one possible outcome of rejecting reason as an institution, and it beckons to either political camp. Are there any examples in Harry Potter of the alternative, Feyerabendian tolerance? Not quite. However, Hermione Granger serves as the Dark Lord’s foil, and gives us a model of reason that is not as archaic as the enemies of rationality would like to suggest. In Against Method (1975), Feyerabend compares different ways rationality has been interpreted alongside practice: in an idealist way, in which reason “completely governs” research, or a naturalist way, in which reason is “completely determined by” research. Taking elements of each, he arrives at an intersection in which each can change the other, both “parts of a single dialectical process.”

“The suggestion can be illustrated by the relation between a map and the adventures of a person using it or by the relation between an artisan and his instruments. Originally maps were constructed as images of and guides to reality and so, presumably, was reason. But maps, like reason, contain idealizations (Hecataeus of Miletus, for example, imposed the general outlines of Anaximander’s cosmology on his account of the occupied world and represented continents by geometrical figures). The wanderer uses the map to find his way but he also corrects it as he proceeds, removing old idealizations and introducing new ones. Using the map no matter what will soon get him into trouble. But it is better to have maps than to proceed without them. In the same way, the example says, reason without the guidance of a practice will lead us astray while a practice is vastly improved by the addition of reason.” p. 233

Christopher Hitchens pointed out that Granger sounds like Bertrand Russell at times, as in this quote about the Resurrection Stone: “You can claim that anything is real if the only basis for believing in it is that nobody has proven it doesn’t exist.” Granger is often the embodiment of anemic analytic philosophy, the institution of order, a disciple of the Ministry of Magic. However, though initially law-abiding, she quickly learns with Potter and Weasley the pleasures of rule-breaking. From the first book onward, she is constantly at odds with the de facto norms of the school, becoming more rebellious as time goes on. It is her levelheaded foundation, combined with her ability to transgress rules, that gives her an astute semi-deontological, semi-utilitarian calculus capable of saving the lives of her friends from the dark arts, and of helping to defeat the tyranny of Voldemort foretold by Socrates.

Granger presents a model of reason like Feyerabend’s map analogy. Although pure reason gives us an outline of how to think about things, it is not a static or complete blueprint; it must be fleshed out with experience, risk-taking, discovery, failure, loss, trauma, pleasure, offense, criticism, and occasional transgressions past the foreseeable limits. Folding these into our heuristics gives us a more diverse account of thinking about things and of moving around in the world.

When reason is increasingly seen as patriarchal, Western, and imperialist, the only thing consistently offered as a replacement is something like lived experience. Some form of this idea is at least a century old, going back to Husserl, and still modest by reason’s Greco-Roman standards. Yet lived experience has always been pivotal to reason; we only need adjust our popular model, and we can see that we need not reject one or the other entirely. Another critique says reason is foolhardy, limiting, antiquated; this is a perversion of its abilities, and serves to justify the first criticism. We can see that there is room within reason for other pursuits and virtues, picked up along the way.

The emphasis on lived experience, which predominantly comes from the political left, is also antithetical to the cause of “social progress.” Those sympathetic to social theory, particularly the cultural leakage of the strong programme, are constantly torn between claiming (a) that science is irrational, and can thus be countered by lived experience (or whatnot), or (b) that science may be rational but reason itself is a tool of patriarchy and white supremacy and cannot be universal. (If you haven’t seen either of these claims very frequently, and think them a straw man, you have not been following university protests and editorials. Or radical Twitter: ex., ex., ex., ex.) Of course, as in Freud, this is an example of kettle logic: the signal of a very strong resistance. We see, though, that we lose nothing by refusing both claims. Reason need not be stagnant nor all-pervasive, and indeed we’ve been critiquing its limits since 1781.

Outright denying the process of science — whether the model is conjectures and refutations or something less stale — ignores that there is no single uniform body of science. Denial also dismisses the most powerful tool for making difficult empirical decisions. Michael Brown’s death was instantly a political affair, with implications for broader social life. The event has completely changed the face of American social issues. The first autopsy report, from St. Louis County, indicated that Brown was shot at close range in the hand, during an encounter with Officer Darren Wilson. The second independent report commissioned by the family concluded the first shot had not in fact been at close range. After the disagreement with my cousin, the Department of Justice released the final investigation report, and determined that material in the hand wound was consistent with gun residue from an up-close encounter.

Prior to the report, the best evidence available as to what happened in Missouri on August 9, 2014, was the ground footage after the shooting and the testimony of the officer and of Ferguson residents at the scene. There are two ways to approach the incident: reason or lived experience. The latter route leads to ambiguities. Brown’s friend Dorian Johnson and another witness reported that Officer Wilson fired his weapon first at range, under no threat, then pursued Brown out of his vehicle, until Brown turned with his hands in the air to surrender. However, before the St. Louis County grand jury, half a dozen (African-American) eyewitnesses corroborated Wilson’s account: that Brown did not have his hands raised and was moving toward Wilson. In which direction does “lived experience” tell us to go, then? A new moral maxim — the duty to believe people — will lead to no non-arbitrary conclusion. (And a duty to “always believe x,” where x is a closed group, e.g., victims, will put the cart before the horse.) It appears that, in a case like this, treating evidence as objective is the only solution.

Introducing ad hoc hypotheses, e.g., the Justice Department and the county examiner are corrupt, shifts the approach into one that uses induction, and leaves behind lived experience (and also ignores how forensic anthropology is actually done). This is the introduction of, indeed, scientific standards. (By looking at incentives for lying it might also employ findings from public choice theory, psychology, behavioral economics, etc.) So the personal experience method creates unresolvable ambiguities, and presumably will eventually grant some allowance to scientific procedure.

If we don’t posit a baseline-rationality — Hermione Granger pre-Hogwarts — our ability to critique things at all disappears. Utterly rejecting science and reason, denying objective analysis in the presumption of overriding biases, breaking down naïve universalism into naïve relativism — these are paths to paralysis on their own. More than that, they are hysterical symptoms, because they often create problems out of thin air. Recently, a philosopher and mathematician submitted a hoax paper, Sokal-style, to a peer-reviewed gender studies journal in an attempt to demonstrate what they see as a problem “at the heart of academic fields like gender studies.” The idea was to write a nonsensical, postmodernish essay, and if the journal accepted it, that would indicate the field is intellectually bankrupt. Andrew Smart at Psychology Today instead wrote of the prank: “In many ways this academic hoax validates many of postmodernism’s main arguments.” And although Smart makes some informed points about problems in scientific rigor as a whole, he doesn’t hint at what the validation of postmodernism entails: should we abandon standards in journalism and scholarly integrity? Is the whole process of peer-review functionally untenable? Should we start embracing papers written without any intention of making sense, to look at knowledge concealed below the surface of jargon? The paper, “The conceptual penis,” doesn’t necessarily condemn the whole of gender studies; but, against Smart’s reasoning, we do in fact know that counterintuitive or highly heterodox theory is considered perfectly average.

There were other attacks on the hoax, from Slate, Salon, and elsewhere. The criticisms, often valid for the particular essay, typically didn’t move the conversation far enough. There is much more to this discussion. A 2006 paper from the International Journal of Evidence Based Healthcare, “Deconstructing the evidence-based discourse in health sciences,” called the use of scientific evidence “fascist.” In the abstract the authors state their allegiance to the work of Deleuze and Guattari. Real Peer Review, a Twitter account that collects abstracts from scholarly articles, regularly features essays from departments of women’s and gender studies, including a recent one from a Ph.D. student wherein the author identifies as a hippopotamus. Sure, the recent hoax paper doesn’t really say anything, but it intensifies this much-needed debate. It brings out these two currents — reason and the rejection of reason — and demands a solution. And we know that lived experience is often going to be inconclusive.

Opening up lines of communication is a solution. One valid complaint is that gender studies seems too insulated, in a way in which chemistry, for instance, is not. Critiquing a whole field does ask us to genuinely immerse ourselves first, and this is a step toward tolerance: it is a step past the death of reason and the denial of science. It is a step that requires opening the bubble.

The modern infatuation with human biases, like Feyerabend’s epistemological anarchism, upsets our faith in prevailing theories and in the idea that our policies and opinions should be guided by the latest discoveries from an anonymous laboratory. Putting politics first and assuming subjectivity is all-encompassing, we move past objective measures for comparing belief systems and theories. However, isn’t the whole operation of modern science designed to work within our means? Kant’s system set limits on human rationality, and most science is aligned with an acceptance of fallibility. As Harvard cognitive scientist Steven Pinker says, “to understand the world, we must cultivate work-arounds for our cognitive limitations, including skepticism, open debate, formal precision, and empirical tests, often requiring feats of ingenuity.”

Pinker goes so far as to advocate for scientism. Others need not; but we must understand an academic field before utterly rejecting it. We must think we can understand each other, and live with each other. We must think there is a baseline framework that allows permanent cross-cultural correspondence — a shared form of life which means a Ukrainian can interpret a Russian, and a Cuban an American. The rejection of Homo sapiens commensurability, championed by people like Richard Spencer and those in identity politics, is a path to segregation and supremacy. We must reject Gorgian nihilism about communication, and the Presocratic relativism that confines our moral judgments to inert subjectivity. From one Weltanschauung to the next, our common humanity — which endures across class, ethnicity, sex, and gender — allows open debate across paradigms.

In the face of relativism, there is room for a nuanced middle ground between Pinker’s scientism and the rising anti-science, anti-reason philosophy; Paul Feyerabend has sketched out a basic blueprint. Rather than condemning reason as a Hellenic germ of Western cultural supremacy, we need only adjust the theoretical model to incorporate the “new America of knowledge” into our critical faculty. It is the raison d’être of philosophers to present complicated things in a more digestible form; to “put everything before us,” as Wittgenstein says. Hopefully, people can reach their own conclusions, and embrace the communal human spirit as they do.

However, this may not be so convincing. It might be true that we have a competition of cosmologies: one that believes in reason and objectivity, and one that thinks reason is callow and all things are subjective. These two perspectives may well be incommensurable. If I try to defend reason, I invariably must appeal to reasons, and thus argue circularly. If I try to claim “everything is subjective,” I make a universal statement, and simultaneously contradict myself. Between begging the question and contradicting oneself, there is not much indication of where to go. Perhaps we just have to look at history, note the results of either course when it has been applied, and take that record as a rhetorical indication of which path to choose.

“We’re all nothing but bags of stories”: Carlos Castaneda as a Countercultural Icon and Budding Post-Modernist

Exploring the countercultural 1960s and the origin of the Western New Age, one cannot bypass Carlos Castaneda. He became a celebrity writer with his bestselling book The Teachings of Don Juan: A Yaqui Way of Knowledge, published by the University of California Press in 1968. The book was written as a series of free-style dialogues between a Native American shaman named Don Juan Matus and Castaneda himself, who claimed to have worked with Don Juan for many years. The Teachings describes how Castaneda learned to use three hallucinogenic plants: peyote, jimson weed, and psychedelic mushrooms. After ingesting these substances, Castaneda went through mind transformations and learned that there were other realities besides the ordinary one. Later, it was revealed that he had made up the whole experience, but this never affected his popularity.


Of course, a book like this was well-tuned to the then-popular hallucinogenic subculture, and the link between Castaneda’s text and the psychedelic ’60s is the most common explanation of his popularity. Yet I want to argue that this is a very narrow view, one which does not explain why Castaneda’s follow-up books, which had nothing to do with psychedelics, continued to enjoy popularity well into the 1990s. In fact, by the early 1980s, Castaneda had become so paranoid about hallucinogens that he forced his girlfriend to undergo drug tests before allowing her to sleep with him. I also argue that viewing Castaneda exclusively as one of the spearheads of the New Age does not explain much either. The appeal of his texts went far beyond the New Age: in the 1970s and the 1980s, for example, his books were frequently assigned as conventional course readings in anthropology, philosophy, sociology, religious studies, and humanities classes.

Let me start with some biographical details. Castaneda was born Carlos Arana in Peru to a middle-class family and moved to the United States in 1951. He tried to enter the world of art but failed. Then, for a while, he worked as a salesman while taking classes in creative writing, before eventually enrolling in the anthropology graduate program at UCLA.

Originally Castaneda did not care about hallucinogens and the emerging hippie culture, but eventually UCLA (and the broader California environment), which was saturated at that time with various counterculture and unchurched spirituality projects, made him choose a sexy topic: the use of psychedelics in a tribal setting. The book which made him famous, The Teachings of Don Juan, originated in a course paper on “power plants” and in his follow-up Master’s thesis. I want to stress that both papers were essentially attempts to find a short-cut to satisfy the requirements of his professors. His first professor, an anthropologist, invited those students who wanted an automatic “A” to find and interview an authentic Indian. Despite a few random contacts, Castaneda could not produce any consistent narrative, and had to invent his interview. This was the origin of his Don Juan character. He then followed the requirements of his advisor, Harold Garfinkel, a big name in sociology at that time and one of the forerunners of postmodernism. Garfinkel made it explicitly clear to Castaneda that he did not want him to classify and analyze his experiences with Don Juan scientifically.

What Garfinkel wanted was a free-style and detailed description of his work with the indigenous shaman as it was, without any interpretation. Thus it was through collective efforts that Castaneda produced a text that by chance caught the attention of the university press as a potential bestseller. Essentially, Castaneda took to the extreme the incentives provided to him by his professors and by the surrounding subculture. He internalized these incentives by composing a fictional text, which he peddled as authentic anthropological research. It is interesting to note that in 1998, just before he died, Castaneda made the following mischievous remark in his introduction to the last anniversary edition of The Teachings of Don Juan: “I dove into my field work so deeply that I am sure that in the end, I disappointed the very people who were sponsoring me.”

The popularity of the first book gave rise to the whole Don Juan sequel, which made Castaneda an anthropology and counterculture star. The combined print run of his books, translated into 17 languages, reached 28 million copies. And, as I mentioned above, despite the revelations that his Don Juan was a completely fabricated character, the popularity of his books kept increasing throughout the 1970s. In fact, to this day, libraries frequently catalogue his books as non-fiction.

It seems that Castaneda’s appeal had something to do with overall trends in Western culture, which made his texts resonate so well with millions of readers. For this reason, I want to highlight the general ideological relevance of Castaneda’s books for the Western zeitgeist (spirit of the time) at its critical juncture in the 1960s and the 1970s. The various authors who wrote about Castaneda never mentioned this obvious fact, including the French writer Christophe Bourseiller, author of his most complete biography, Carlos Castaneda: La vérité du mensonge (2005). So exploring the ideological relevance of the Don Juan books will be my small contribution to Castanediana.

To be specific, I want to point to two themes that run through all his books. First, he hammered into the minds of his readers the message of radical subjectivism, which in our day is considered by some to be conventional wisdom: what we call truth is always socially constructed. Don Juan, who in later books began speaking like a philosophy professor, repeatedly instructed Carlos that so-called reality was a fiction and a projection of our own cultural and individual experiences, and that instead of so-called objective reality, we need to talk about multiple realities. In an interview for Time magazine, Castaneda stressed that the key lesson Don Juan taught him was “to understand that the world of common-sense reality is a product of social consensus.” Castaneda also stressed the role of the observer in shaping his or her reality and the significance of text in Western culture. In other words, he was promoting what later became the hallmark of the so-called postmodern mindset.

Second, the fictional dialogues between the “indigenous man” Don Juan, whom Castaneda portrayed as a vessel of wisdom, and Castaneda, the “stupid Western man,” contained another message: remove your Western blinders and learn from the non-Western world. Such privileging of non-Western “wisdom” resonated very well with Western intellectuals who felt a justified frustration with the hegemony of positivism and Western knowledge in general, and who looked for an intellectual antidote to that dominance. By the 1990s, this attitude had mutated into what Slavoj Zizek neatly labelled the “multiculturalist’s basic ideological operation,” which now represents one of the ideological pillars of Western welfare-warfare capitalism.

At the end of the 1970s, several critics tried to debunk Castaneda. They were able to prove that his books were the product of a creative imagination and intensive readings of anthropological and travel literature. These critics correctly pointed out that Castaneda misrepresented particular indigenous cultures and landscapes. They also stressed that his books were not written in a scientific manner. Ironically, this latter criticism did not find any responsive audience, precisely because social scholarship was moving away from positivism. Moreover, one of these critics, the anthropologist Jay Fikes, who wrote a book exposing Castaneda’s hoax, became a persona non grata in the anthropology field within the United States. Nobody wanted to write a reference for him, and he had to move to Turkey to find an academic position.

What critics like Fikes could not grasp was the fact that the Castaneda texts perfectly fit the emerging post-modernist thinking that was winning over the minds of many Western intellectuals who sought to break away from dominant positivism, rationalism, and grand all-explaining paradigms. To them, the antidote was a shift toward the subjective, the individual, and the spontaneous. The idealization and celebration of non-Western knowledge and non-Western cultures in general, which currently represents a powerful ideological trend in Western Europe and North America, became an important part of this intellectual revolt against the modern world. I am sure all of you know that anthropology authorities such as Clifford Geertz (until recently one of the major gurus of Western humanities), Victor Turner, and Claude Lévi-Strauss were inviting others to view any cultural knowledge as valid, and eventually erased the border between literature and science. They also showed that scholarship can be constructed as art. Castaneda’s critics could not see that his texts only reflected what was already in the air.


The person who most heavily affected the “production” of the first Don Juan book, which was Castaneda’s revised Master’s thesis, was the above-mentioned sociologist Garfinkel. As early as the 1950s, Garfinkel had come up with ideas that contributed to the formation of the post-modern mind. I am talking here about his ethnomethodology. This school of thought did not see the social world as an objective reality but as something that individuals build and rebuild in their thoughts and actions. Garfinkel argued that what we call truth was individually constructed. Sometimes he also called this approach “people’s sociology.” He stressed that a scholar should set aside traditional scientific tools and simply narrate human experiences as they were, in all their detail and spontaneity. Again, today, for many, this line of thinking is conventional wisdom, but in the 1950s and the 1960s it was revolutionary. Incidentally, it took Castaneda time to figure out what Garfinkel needed from him before he rid his text of the vestiges of “positive science.” To be exact, Castaneda could not completely get rid of this “science” in his first bestselling book. In addition to the free-flowing and easy-to-read spontaneous dialogues with Don Juan, Castaneda attached to the text an appendix, a boring and meaningless read that he titled “Structural Analysis.” In his later books, such rudiments of positivism totally disappeared.

When Castaneda was writing his Master’s thesis, Garfinkel made him revise the text three times. The advisor wanted to make sure that Castaneda would relate his spiritual experiences instead of explaining them. Originally, when Castaneda presented to Garfinkel his paper about a peyote session with Don Juan, the text was formatted as a scientific analysis of his own visions. The professor, as Castaneda remembered, rebuked him, “Don’t explain to me. You are nobody. Just give it to me straight and in detail, the way it happened. The richness of detail is the whole story of membership.” Castaneda spent several years revising his thesis and then had to revise it again because Garfinkel did not like that the student slipped into explaining Don Juan psychologically. Trying to be a good student, Castaneda embraced the advice of his senior colleague. So the final product was a beautiful text that was full of dialogues, rich in detail, and, most importantly, came straight from the “field.”

I interviewed some of Castaneda’s classmates and other scholars who became fascinated with his books at the turn of the 1970s. Many of them had no illusions about the authenticity of Don Juan. Still, they argued that the whole message was very much needed at that time. A quote from Douglas Sharon, one of Castaneda’s acquaintances, is illustrative in this regard. In his conversation with me, Sharon stressed:

“In spite of the fact that his work might be a fiction, the approach he was taking—validating the native point of view—was badly needed in anthropology, and, as a matter of fact, I felt it was a helping corrective for the so-called scientific objectivity that we were taking into the field with us.”

I want to mention in conclusion that Castaneda not only promoted the postmodern approach in his novels but also tried to live it. Before the age of Facebook and online forums, Castaneda, with a group of his followers, became involved in an exciting game of identity change. They came to enjoy confusing those around them by blurring and constantly changing their names and life stories. For example, people in his circle shredded their birth certificates and made new ones. They also performed mock wedding ceremonies to make fun of conventional reality. To those who might have had questions about this “post-modernist” game, Castaneda offered a reminder: “We’re all nothing but bags of stories.”

A (very) Quick Primer on Natural Rights

by Adam Magoon

The first step in understanding natural rights theory is to ask a simple but profound question: Do you own yourself?

Well, let’s start with the definition of ownership. Dictionary.com gives us “the act, state, or right of possessing something.” Digging deeper, we find the definition of possession as “the state of having, owning, or controlling something.” The last part of that definition is key: controlling. There is a modicum of truth in the old adage that possession is nine-tenths of the law: nine times out of ten, to own something is to control it.

Now, getting back to our original question: Do you own yourself? Well, do you control your own body and mind? We do not need to delve into psychology to answer this question. I alone can move my arms up and down; I can choose to stand, walk, eat, think, write, create, or do nothing at all. I alone am in control of my body. This is an indisputable fact. The very act of questioning it proves it true; for if you did not have control over your own thoughts and actions, how could you possibly disagree?

Self-ownership is the cornerstone of libertarian natural rights philosophy, and it is what the libertarian means when he uses the term “natural rights.”

To quote Murray Rothbard: “The fundamental axiom of libertarian theory is that each person must be a self-owner, and that no one has the right to interfere with such self-ownership.”

Under this philosophy of self-ownership there are two important subcategories, which I will just touch on here and elaborate at another time.

The non-aggression principle is an ethical stance which asserts that “aggression” is inherently illegitimate, where “aggression” is defined as the “initiation” of physical force against persons or property, the threat of such, or fraud upon persons or their property.

This is why the threat of violence cannot be used to negate the concept of self-ownership.  Holding a gun to my head and telling me to raise my arm does not mean you own the right to raise my arm any more than a thief owns the jewelry he stole.  Ownership cannot be transferred through violent means.

And the second is the concept of homesteading, which is best explained by John Locke:

“[E]very man has a property in his own person. This nobody has any right to but himself. The labour of his body and the work of his hands, we may say, are properly his. Whatsoever then he removes out of the state that nature hath provided, and left it in, he hath mixed his labour with, and joined to it something that is his own, and thereby makes it his property. It being by him removed from the common state nature placed it in, it hath by this labour something annexed to it that excludes the common right of other men. For this labour being the unquestionable property of the labourer, no man but he can have a right to what that is once joined to. . . .

He that is nourished by the acorns he picked up under an oak, or the apples he gathered from the trees in the wood, has certainly appropriated them to himself. Nobody can deny but the nourishment is his. I ask then when did they begin to be his? . . . And ‘tis plain, if the first gathering made them not his, nothing else could. That labour put a distinction between them and common. That added something to them more than nature, the common mother of all, had done: and so they become his private right. And will any one say he had no right to those acorns or apples he thus appropriated, because he had not the consent of all mankind to make them his? . . . If such a consent as that was necessary, man had starved, notwithstanding the plenty God had given him. We see in commons, which remain so by compact, that ‘tis the taking part of what is common, and removing it out of the state Nature leaves it in, which begins the property; without which the common is of no use”

Very quickly I will also mention a couple of the more common arguments that arise when natural rights are discussed.

First, natural rights do not extend from god or any other supernatural or theological force. They are based on rational and philosophical thought; the case for them is what is known as an “a priori” argument. To put it simply, natural rights are a logical deduction from a number of easily recognized facts, primarily the concept of self-ownership.

Second, governments do not, and indeed cannot, grant any rights that natural rights have not already granted. Let’s look at a current event that everyone always seems to think about backwards: the legalization of drugs for personal consumption. Because of the right of self-ownership, each and every individual already has the right to do whatever they choose with their own body, as long as they do so with their own property and do not violently harm others in the process. Even if the U.S. government “legalized” the use of drugs tomorrow, it would not be granting anyone the right to do drugs; it would merely be removing its own restrictions on something that is already a right. The idea that law comes from the state is known as ‘legal positivism’, and its proponents are hard pressed to defend actions such as slavery and extermination that were made legal by many nations throughout the course of human history.

 

Recommended Reading:

http://mises.org/rothbard/ethics/ethics.asp