Why the Nineties rocked (UnHerd)
AV Dicey as Legal Theorist (The Modern Law Review)
A twisted adaptation of the classic example of economic externalities: a golf club instead of serene houses, a home day care in place of a noisy industrial unit.
2020 is turning into quite the publishing year.
Perhaps every year is like this and I just haven’t been paying attention before. Now, as I actively scan publisher sites and newsletters for upcoming books, there seems to be an abundance of super-interesting new stuff: how is anybody – even someone like me who does this for a living – supposed to keep up?
#1: The year began at full (or stagnating…?) speed with University of Houston professor Dietrich Vollrath's Fully Grown: Why a Stagnant Economy is a Sign of Success. With praise from Tyler Cowen, reviews in The Economist and the Wall Street Journal – and actually a lot of good discussion on Twitter – I'm sad that I haven't taken the time to read it. Later, perhaps, on the off-chance that nothing else on this incredible list gets in the way.
#2: Next up was Diane Coyle‘s Markets, State, and People. Coyle, the endlessly interesting public intellectual/economist and newly(-ish) appointed Professor of Public Policy at Cambridge, is someone we all should read: she manages to be controversial and still balanced, provocative but still interesting. This book, however, seems to be in line with all the other “Third Way” books of last year: Acemoglu and Robinson’s The Narrow Corridor; Raghuram Rajan’s The Third Pillar; Branko Milanovic’s Capitalism, Alone. Crowded field. As I haven’t even gotten around to her previous book on GDP yet, I imagine I’ll read that one first whenever I carve out some time for Coyle.
The curse of modernity is quickly adding up.
#3: Changing gears somewhat – at least in terms of topics – I have started reading Charles Murray‘s Human Diversity: The Biology of Gender, Race, and Class and it’s exactly as provocative as you might think. Delivered, however, with the seriousness of scientific investigation and a massive chip on his shoulder. Still, exactly the kind of antidote to madness that fuels a lot of my priors. I’ll write up a comment or two whenever I finish this 528-page tome.
#4: In a similar vein is the Dutch writer and historian Rutger Bregman's Humankind: A Hopeful History, scheduled to be released in June. As Bregman isn't somebody I usually agree with, I'm very excited to read this take of his, which is hopefully a mix of Paul Bloom's Against Empathy, Ruth DeFries' The Big Ratchet and Paul Seabright's The Company of Strangers. Sort of like Yuval Harari's Sapiens, but better (and no, I'm not on Team Harari – despite this excellent long-read in The New Yorker).
#5: Going back a little bit, to what I think is chronologically the next book to be released (on Tuesday March 10 in the U.S., but not until April in the U.K.), is Robert Bryce's A Question of Power: Electricity and the Wealth of Nations. Having recently written a piece on electricity generation, and being deep in the weeds on climate change and emissions, I'm very curious about this take on electricity as a critical source of our prosperity. I hope it reads a little like an improved version of Zubrin's best chapters in Merchants of Despair.
#6: March is also the month for Angus Deaton and Anne Case's Deaths of Despair and the Future of Capitalism (Amazon says it's already out in the U.K.). Their hugely successful and highly relevant pet project for the last few years, Deaton and Case's case(!) – that rising mortality rates indicate a collapse of the fabric of society – is a pretty standard one by now: globalization, economic inequality, the hollowing-out of tight-knit communities, and the various other forces that may have fueled all this.
The reviews are already popping up left and right (WSJ, Financial Times), and their session was the most exciting and most talked-about at the ASSA meeting in San Diego. As I understand it, the latest finding is that American life expectancy – that pesky ever-increasing number that fell in recent years, in no small part due to overdoses and opioids – has recovered and is now on the uptick again. Maybe Deaton and Case's book will be a chronicle of an odd historical event rather than a foreshadowing of "The Future of Capitalism" (also, what's up with shoving 'Future of Capitalism' into your titles?!).
#7: On a similar topic, Robert Putnam – yes, the Harvard professor famous for Bowling Alone and the idea of social capital – is back with another sweeping analysis of what's gone wrong with American society. The Upswing: How America Came Together a Century Ago and How We Can Do It Again, coming out in June, is bound to make a lot of waves and receive a lot of attention from social commentators.
#8: Officially published just yesterday is John Kay and former Bank of England Governor Mervyn King's Radical Uncertainty: Decision-Making for an Unknowable Future. Admittedly, this is the book I'm least excited about on this list. Reviewing King's 2016 The End of Alchemy – where King discussed his experiences of the financial crisis and the global banking system – for the Financial Times, John Kay covered exactly this ground; the review's title? "The Enduring Certainty of Radical Uncertainty." Somebody please press the snooze button. Paul Krugman's 4,000-word review of The End of Alchemy ought to be enough; I'd be surprised if Kay and King bring something new to the table in this poorly titled release (though, of course, the fringe already loves it).
While the above eight titles are surely worth at least some of your time, the next five are worth all of it.
#9: I’ll begin with my two biggest hypes: Matt Ridley‘s How Innovation Works: And Why It Flourishes in Freedom, coming out May 14th in the U.K. and May 19th in the U.S. The author of The Rational Optimist and The Evolution of Everything is back with another 400-page rundown of a deep-seated and hyper-relevant topic: how do societies innovate and progress? What conditions assist it, and which obstacles prevent it?
#10: The second hype: William Quinn and John Turner's Boom and Bust: A Global History of Financial Bubbles. Since John first told me about this book over a year and a half ago, I've been super excited – I'm a big fan of his work – and I'm looking forward to receiving my review copy in the next couple of weeks. Publication date: August.
#11: For somebody who writes about bubbles and financial markets more than most people think healthy, I'm gonna get a warm-up with MIT professor Thomas Levenson's Money for Nothing: The South Sea Bubble & The Invention of Modern Capitalism. What's with all these books on historical financial bubbles? Yes, you're right: 2020 marks the three-hundredth anniversary of the South Sea Bubble. That iconic period – John Law's schemes in France and the similar government funding scheme in England – will surely receive a lot of attention this year.
#12: Some environmental stuff at last: Bjørn Lomborg, the outspoken author and voice of reason in the climate change space, announced that his False Alarm: How Climate Change Panic Costs Us Trillions, Hurts the Poor, and Fails to Fix the Planet will be published in June this year! While the book may be the least boring on this list, the title receives the lowest possible marks. What overworked publisher decided that this page-long subtitle was a good idea?!
#13: Also, Alex Epstein of the Center for Industrial Progress and host of Power Hour (one of my all-time favorite podcasts) has been working on an update to his hugely popular The Moral Case for Fossil Fuels. As far as I understand, we're to receive an updated and revised version in August – The Moral Case for Fossil Fuels 2.0!
So. The next six months have at least thirteen pretty interesting books coming up. I imagine there are a bunch more for the rest of the year – and a few I have completely overlooked.
Also, after this burst of links, Amazon should probably offer Notes On Liberty an affiliate program.
In sum: you can see my fields of interest overlapping here: (1) financial history and financial markets; (2) environment, climate change, and its solutions; (3) Big Picture society stories, preferably by interesting or quantitatively savvy authors. Not enough on the fourth big interest of mine: (4) money and monetary economics – particularly in historical contexts. Perhaps that's fine, though, as David Birch's Before Babylon, Beyond Bitcoin is on my desk, and I'm currently re-reading William Goetzmann's Money Changes Everything – both first released in 2017.
Also: the absence or underrepresentation of women (or ethnic minorities, or any other trait you care a lot about) might disturb you: 2 out of 17 authors are women (4 out of 27 authors mentioned). Needless to say, it must be because I'm sexist.
Post-script: Ha! As I just heard about Stephanie Kelton's upcoming book The Deficit Myth: Modern Monetary Theory and the Birth of the People's Economy, I'm gonna quickly add it to the list and address both of my qualms above: not enough women (now: 3/18 authors!) and not enough monetary economics. Splendid!
Happy reading, everyone!
“In so far as their only recourse to that world is through what they see and do, we may want to say that after a revolution scientists are responding to a different world.”
Thomas Kuhn, The Structure of Scientific Revolutions p. 111
I can remember arguing with my cousin right after Michael Brown was shot. "It's still unclear what happened," I said, "based solely on testimony" — at that point, we were still waiting on the federal autopsy report by the Department of Justice. He said that in the video, you can clearly see Brown, back to the officer and with his hands up, as he is shot up to eight times.
My cousin doesn’t like police. I’m more ambivalent, but I’ve studied criminal justice for a few years now, and I thought that if both of us watched this video (no such video actually existed), it was probably I who would have the more nuanced grasp of what happened. So I said: “Well, I will look up this video, try and get a less biased take and get back to you.” He replied, sarcastically, “You can’t watch it without bias. We all have biases.”
And that seems to be the sentiment of the times: bias encompasses the human experience, it subsumes all judgments and perceptions. Biases are so rampant, in fact, that no objective analysis is possible. These biases may be cognitive, like confirmation bias, emotional fallacies or that phenomenon of constructive memory; or inductive, like selectivity or ignoring base probability; or, as has been common to think, ingrained into experience itself.
The thing about biases is that they are open to psychological evaluation. There are precedents for eliminating them. For instance, one common explanation of racism is that familiarity breeds acceptance, and unfamiliarity breeds intolerance (as Reason points out, people further from fracking sites have more negative opinions on the practice than people closer to them). So to curb racism (a sort of bias), children should interact with people outside of their singular ethnic group. More clinical methodology seeks to transform mental functions from automatic to controlled, thereby entering reflective measures into perception and reducing bias. Apart from these, there is that ancient Greek practice of reasoning, wherein patterns and evidence are used to generate logical conclusions.
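One of those inductive biases, ignoring base probability (base-rate neglect), shows concretely how a bias can be both diagnosed and corrected by deliberate reasoning. Here is a standard textbook exercise (the numbers are hypothetical, chosen only to keep the arithmetic simple): suppose a condition affects 1% of a population, and a test catches 90% of true cases while falsely flagging 9% of healthy people. Asked how likely a positive result is to indicate illness, most people answer something near 90%. Bayes' theorem, which forces the base rate back into the calculation, says otherwise:

\[
P(\text{ill}\mid +) = \frac{P(+\mid \text{ill})\,P(\text{ill})}{P(+\mid \text{ill})\,P(\text{ill}) + P(+\mid \text{healthy})\,P(\text{healthy})} = \frac{0.90 \times 0.01}{0.90 \times 0.01 + 0.09 \times 0.99} \approx 0.09.
\]

Roughly a 9% chance, not 90%: the bias is real, but it is also identifiable and correctable by exactly the sort of explicit reasoning the Greeks practiced.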
If it were true that human bias is all-encompassing, and essentially insurmountable, the whole concept of critical thinking goes out the window. Not only do we lose the critical-rationalist, Popperian mode of discovery, but also the Socratic dialectic, as essentially "higher truths" disappear from the human lexicon.
The belief that biases are intrinsic to human judgment ignores psychological or philosophical methods to counter prejudice, because it posits that objectivity itself is impossible. This viewpoint has been associated with "postmodern" schools of philosophy, such as those Dr. Rosi commented on (e.g., those of Derrida, Lacan, Foucault, Butler), although it's worth pointing out that the analytic tradition, with its origins in Frege, Russell and Moore, represents a far greater break from the previous, modern tradition of Descartes and Kant, and often reached conclusions similar to the Continentals'.
Although theorists of the "postmodern" clique produced diverse claims about knowledge, society, and politics, the most famous figures are nearly always associated with, or incorporated into, the political left. To make a useful simplification of viewpoints: it would seem that progressives have generally accepted Butlerian non-essentialism about gender and Foucauldian terminology (discourse and institutions). Derrida's poststructuralist critique noted dichotomies and also claimed that the philosophical search for Logos has been patriarchal, almost neoreactionary. (The month before Donald Trump's victory, the word patriarchy hit an all-time high in Google searches.) It is not a far-right conspiracy that European philosophers with strange theories have influenced and sought to influence American society; it is patent in the new political language.
Some people think of the postmodernists as all social constructivists, holding the theory that many of the categories and identifications we use in the world are social constructs without a human-independent nature (e.g., not natural kinds). Disciplines like anthropology and sociology have long since dipped their toes in, and the broader academic community, too, now holds that things like gender and race are social constructs. But the ideas can and do go further: "facts" themselves are open to interpretation on this view; to even assert a "fact" is just to affirm power of some sort. This worldview subsequently degrades the status of science into an extended apparatus for confirmation bias, filling out the details of a committed ideology rather than providing us with new facts about the world. There can be no objectivity outside of a worldview.
Even though philosophy took a naturalistic turn with the philosopher W. V. O. Quine, seeing itself as integrating with and working alongside science, the criticisms of science as an establishment that emerged in the 1950s and 60s (and earlier) often disturbed its unique epistemic privilege in society: ideas that theory is underdetermined by evidence, that scientific progress is nonrational, that unconfirmed auxiliary hypotheses are required to conduct experiments and form theories, and that social norms play a large role in the process of justification all damaged the mythos of science as an exemplar of human rationality.
But once we have dismantled Science, what do we do next? Some critics have held up Nazi German eugenics and phrenology as examples of the damage that science can do to society (never mind that we now consider them pseudoscience). Yet Lysenkoism and the history of astronomy and cosmology indicate that suppressing scientific discovery can be deleterious too. The Austrian physicist and philosopher Paul Feyerabend instead wanted a free society — one where science had equal power with older, more spiritual forms of knowledge. He thought the model of rational science exemplified by Sir Karl Popper was inapplicable to the real machinery of scientific discovery, and that the only methodological rule we could impose on science was: "anything goes."
Feyerabend's views are almost a caricature of postmodernism, although he denied the label "relativist," opting instead for philosophical Dadaist. In his pluralism, there is no hierarchy of knowledge, and state power can even be introduced when necessary to break up scientific monopoly. Feyerabend, contra scientists like Richard Dawkins, thought that science was like an organized religion, and therefore supported a separation of church and state as well as a separation of state and science. Here is a way forward for a society that has started distrusting the scientific method… but if this is what we should do post-science, it's still unclear how to proceed. There are still questions for anyone who loathes the hegemony of science in the Western world.
For example, how does the investigation of crimes proceed without strict adherence to the latest scientific protocol? Presumably, Feyerabend didn't want to privatize law enforcement, but science and the state are very intricately connected. In 2005, Congress authorized the National Academy of Sciences to form a committee and conduct a comprehensive study on contemporary forensic science to identify community needs, evaluating laboratory executives, medical examiners, coroners, anthropologists, entomologists, odontologists, and various legal experts. Forensic science — scientific procedure applied to the field of law — exists for two practical goals: exoneration and prosecution. However, the Forensic Science Committee revealed that severe issues riddle forensics (e.g., bite mark analysis), and at the top of its list of recommendations is establishing an independent federal entity to devise consistent standards and enforce regular practice.
For top scientists, this sort of centralized authority seems necessary to produce reliable work, and it disagrees entirely with Feyerabend's emphasis on methodological pluralism. Barack Obama formed the National Commission on Forensic Science in 2013 to further investigate problems in the field, and only recently Attorney General Jeff Sessions said the Department of Justice will not renew the commission. It's unclear now what forensic science will do to resolve its ongoing problems, but what is clear is that the American court system would fall apart without the possibility of appealing to scientific consensus (especially in forensics), and that the only foreseeable way to solve the existing issues is through stricter methodology. (Just as with McDonald's, there are enforced standards so that the product is consistent wherever one orders.) More on this later.
So it doesn't seem to be in the interest of things like due process to abandon science or completely separate it from state power. (It does, however, make sense to move forensic laboratories out from under direct administrative control, as the NAS report notes in Recommendation 4. This is, however, specifically to reduce bias.) In a culture where science is viewed as irrational, Eurocentric, ad hoc, and polluted with ideological motivations — or where Reason itself is seen as a particular hegemonic, imperial device to suppress different cultures — not only do we not know what to do; when we try to do things, we lose elements of our civilization that everyone agrees are valuable.
Although Aristotle separated pathos, ethos and logos (adding that all informed each other), later philosophers like Feyerabend thought of reason as a sort of “practice,” with history and connotations like any other human activity, falling far short of sublime. One could no more justify reason outside of its European cosmology than the sacrificial rituals of the Aztecs outside of theirs. To communicate across paradigms, participants have to understand each other on a deep level, even becoming entirely new persons. When debates happen, they must happen on a principle of mutual respect and curiosity.
From this one can detect a bold argument for tolerance. Indeed, Feyerabend was heavily influenced by John Stuart Mill's On Liberty. Maybe, in a world disillusioned with scientism and objective standards, the next cultural move is multilateral acceptance of, and tolerance for, each other's ideas.
This has not been the result of postmodern revelations, though. The 2016 election featured the victory of one psychopath over another, from two camps utterly consumed with vitriol for each other. Between Bernie Sanders, Donald Trump and Hillary Clinton, Americans drifted toward radicalization as the only establishment candidate seemed to offer the same noxious, warmongering mess of the previous few decades of administration. Politics has only polarized further since the inauguration. The alt-right, a nearly perfect symbol of cultural intolerance, is regular news for mainstream media. Trump acolytes physically brawl with black bloc Antifa in the same city as the 1960s Free Speech Movement. It seems to be worst at universities. Analytic feminist philosophers asked for the retraction of a controversial paper, seemingly without reading it. Professors even get involved in student disputes, at Berkeley and more recently Evergreen. The names each side uses to attack the other ("fascist," most prominently) — sometimes accurate, usually not — display a political divide between groups that increasingly refuse to argue their own side and prefer silencing their opposition.
There is not a tolerant left or tolerant right any longer, in the mainstream. We are witnessing only shades of authoritarianism, eager to destroy each other. And what is obvious is that the theories and tools of the postmodernists (post-structuralism, social constructivism, deconstruction, critical theory, relativism) are as useful for reactionary praxis as for their usual role in left-wing circles. Says Casey Williams in the New York Times: "Trump's playbook should be familiar to any student of critical theory and philosophy. It often feels like Trump has stolen our ideas and weaponized them." The idea of the "post-truth" world originated in postmodern academia. It is the monster turning against Doctor Frankenstein.
Moral (cultural) relativism in particular promises only the rejection of our shared humanity. It paralyzes our judgment on female genital mutilation, flogging, stoning, human and animal sacrifice, honor killing, caste, the underground sex trade. The afterbirth of Protagoras, cruelly resurrected once again, does not promise trials at Nuremberg, where the Allied powers appealed to something above and beyond written law to exact judgment on mass murderers. It does not promise justice for the ethnic cleansers of Srebrenica, as the United Nations is helpless to impose a tribunal from outside Bosnia-Herzegovina. Today, this moral pessimism laughs at the phrase "humanitarian crisis," and at Western efforts to change the material conditions of fleeing Iraqis, Afghans, Libyans, Syrians, Venezuelans, North Koreans…
In the absence of universal morality, and the introduction of subjective reality, the vacuum will be filled with something much more awful. And we should be afraid of this because tolerance has not emerged as a replacement. When Harry Potter first encounters Voldemort face-to-scalp, the Dark Lord tells the boy “There is no good and evil. There is only power… and those too weak to seek it.” With the breakdown of concrete moral categories, Feyerabend’s motto — anything goes — is perverted. Voldemort has been compared to Plato’s archetype of the tyrant from the Republic: “It will commit any foul murder, and there is no food it refuses to eat. In a word, it omits no act of folly or shamelessness” … “he is purged of self-discipline and is filled with self-imposed madness.”
Voldemort is the Platonic appetite in the same way he is the psychoanalytic id. Freud’s das Es is able to admit of contradictions, to violate Aristotle’s fundamental laws of logic. It is so base, and removed from the ordinary world of reason, that it follows its own rules we would find utterly abhorrent or impossible. But it is not difficult to imagine that the murder of evidence-based reasoning will result in Death Eater politics. The ego is our rational faculty, adapted to deal with reality; with the death of reason, all that exists is vicious criticism and unfettered libertinism.
Plato predicts Voldemort with the image of the tyrant, and also with one of his primary interlocutors, Thrasymachus, when the sophist opens with "justice is nothing other than the advantage of the stronger." The one thing Voldemort admires about The Boy Who Lived is his bravery, the one trait they share. This trait is missing in his Death Eaters. In the fourth novel the Dark Lord is cruel to his reunited followers for abandoning him and losing faith; their cowardice reveals the fundamental logic of his power: his disciples are not true devotees, but opportunists, weak on their own merit and drawn like moths to every Avada Kedavra. Likewise, students flock to postmodern relativism to justify their own beliefs when the evidence is an obstacle.
Relativism gives us moral paralysis, allowing in darkness. Another possible move after relativism is supremacy. One look at Richard Spencer's Twitter demonstrates the incorrigible tenet of the alt-right: the alleged incompatibility of cultures, ethnicities, races — that different groups of humans simply cannot get along together. The Final Solution is not about extermination anymore but segregated nationalism. Spencer's audience is almost entirely men who loathe the current state of things, who share far-reaching conspiracy theories, and who despise globalism.
The left, too, creates conspiracies, imagining a bourgeois corporate conglomerate that enlists economists and brainwashes the public through history books to normalize capitalism; for this reason the left despises globalism as well, saying it impoverishes other countries or destroys cultural autonomy. For the alt-right, it is the Jews, and George Soros, who control us; for the burgeoning socialist left, it is the elites, the one percent. Our minds are not free; fortunately, each side will happily supply Übermenschen, in the form of statesmen or critical theorists, to save us from our degeneracy or our false consciousness.
Without the commitment to reasoned debate, tribalism has continued the polarization and the loss of humility. Each side also accepts science selectively, if it does not question science's very justification. The privileged status that the "scientific method" maintains in polite society is denied when convenient: whether it is climate science, evolutionary psychology, sociology, genetics, biology, anatomy or, especially, economics, one side or the other outright rejects it, without studying the material enough to immerse themselves in what could be promising knowledge (as Feyerabend urged, and as the breakdown of rationality could have encouraged). And ultimately, equal protection — one tenet of individualist thought that allows for multiplicity — is entirely rejected by both: we should be treated differently as humans, often because of the color of our skin.
Relativism and carelessness about standards and communication have given us supremacy and tribalism. They have divided rather than united. Voldemort's chaotic violence is one possible outcome of rejecting reason as an institution, and it beckons to either political alliance. Are there any examples in Harry Potter of the alternative, Feyerabendian tolerance? Not quite. However, Hermione Granger serves as the Dark Lord's foil, and gives us a model of reason that is not as archaic as the enemies of rationality would like to suggest. In Against Method (1975), Feyerabend compares different ways rationality has been interpreted alongside practice: in an idealist way, in which reason "completely governs" research, or a naturalist way, in which reason is "completely determined by" research. Taking elements of each, he arrives at an intersection in which one can change the other, both "parts of a single dialectical process."
“The suggestion can be illustrated by the relation between a map and the adventures of a person using it or by the relation between an artisan and his instruments. Originally maps were constructed as images of and guides to reality and so, presumably, was reason. But maps, like reason, contain idealizations (Hecataeus of Miletus, for examples, imposed the general outlines of Anaximander’s cosmology on his account of the occupied world and represented continents by geometrical figures). The wanderer uses the map to find his way but he also corrects it as he proceeds, removing old idealizations and introducing new ones. Using the map no matter what will soon get him into trouble. But it is better to have maps than to proceed without them. In the same way, the example says, reason without the guidance of a practice will lead us astray while a practice is vastly improved by the addition of reason.” p. 233
Christopher Hitchens pointed out that Granger sounds like Bertrand Russell at times, as in this quote about the Resurrection Stone: "You can claim that anything is real if the only basis for believing in it is that nobody has proven it doesn't exist." Granger is often the embodiment of anemic analytic philosophy, the institution of order, a disciple for the Ministry of Magic. However, though initially law-abiding, she quickly learns with Potter and Weasley the pleasures of rule-breaking. From the first book onward, she is constantly at odds with the de facto norms of the school, becoming more rebellious as time goes on. It is her levelheaded foundation, combined with her ability to transgress rules, that gives her an astute semi-deontological, semi-utilitarian calculus capable of saving the lives of her friends from the dark arts, and of helping to defeat the tyranny of Voldemort foretold by Socrates.
Granger presents a model of reason like Feyerabend's map analogy. Although pure reason gives us an outline of how to think about things, it is not a static or complete blueprint, and it must be fleshed out with experience, risk-taking, discovery, failure, loss, trauma, pleasure, offense, criticism, and occasional transgressions past the foreseeable limits. Adding these to our heuristics means that we explore a more diverse account of thinking about, and moving around in, the world.
When reason is increasingly seen as patriarchal, Western, and imperialist, the only thing consistently offered as a replacement is something like lived experience. Some form of this idea is at least a century old, going back to Husserl, and still modest by reason's Greco-Roman standards. Yet lived experience has always been pivotal to reason; we need only adjust our popular model. And we can see that we need not reject one or the other entirely. Another critique says reason is foolhardy, limiting, antiquated; this is a perversion of its abilities, and plays into the first criticism. We can see that there is room within reason for other pursuits and virtues, picked up along the way.
The emphasis on lived experience, which predominantly comes from the political left, is also antithetical to the cause of "social progress." Those sympathetic to social theory, particularly the cultural leakage of the strong programme, are constantly torn between claiming (a) that science is irrational, and can thus be countered by lived experience (or whatnot), or (b) that science may be rational but reason itself is a tool of patriarchy and white supremacy and cannot be universal. (If you haven't seen either of these claims very frequently, and think them a strawman, you have not been following university protests and editorials. Or radical Twitter: ex., ex., ex., ex.) Of course, as in Freud, this is an example of kettle logic: the signal of a very strong resistance. We see, though, that we need accept neither claim, and lose nothing in declining them. Reason need not be stagnant nor all-pervasive, and indeed we've been critiquing its limits since 1781.
Outright denying the process of science — whether the model is conjectures and refutations or something less stale — ignores that there is no single uniform body of science. Denial also dismisses the most powerful tool for making difficult empirical decisions. Michael Brown’s death was instantly a political affair, with implications for broader social life. The event has completely changed the face of American social issues. The first autopsy report, from St. Louis County, indicated that Brown was shot at close range in the hand, during an encounter with Officer Darren Wilson. The second independent report commissioned by the family concluded the first shot had not in fact been at close range. After the disagreement with my cousin, the Department of Justice released the final investigation report, and determined that material in the hand wound was consistent with gun residue from an up-close encounter.
Prior to the report, the best evidence available as to what happened in Missouri on August 9, 2014, was the ground footage after the shooting and testimonies from the officer and Ferguson residents at the scene. There are two ways to approach the incident: reason or lived experience. The latter route leads to ambiguities. Brown's friend Dorian Johnson and another witness reported that Officer Wilson fired his weapon first at range, under no threat, then pursued Brown out of his vehicle, until Brown turned with his hands in the air to surrender. However, before the St. Louis grand jury, half a dozen (African-American) eyewitnesses corroborated Wilson's account: that Brown did not have his hands raised and was moving toward Wilson. In which direction does "lived experience" tell us to go, then? A new moral maxim — the duty to believe people — will lead to no non-arbitrary conclusion. (And a duty to "always believe x," where x is a closed group, e.g. victims, puts the cart before the horse.) It appears that, in a case like this, treating evidence as objective is the only solution.
Introducing ad hoc hypotheses, e.g., the Justice Department and the county examiner are corrupt, shifts the approach into one that uses induction, and leaves behind lived experience (and also ignores how forensic anthropology is actually done). This is the introduction of, indeed, scientific standards. (By looking at incentives for lying it might also employ findings from public choice theory, psychology, behavioral economics, etc.) So the personal experience method creates unresolvable ambiguities, and presumably will eventually grant some allowance to scientific procedure.
If we don’t posit a baseline-rationality — Hermione Granger pre-Hogwarts — our ability to critique things at all disappears. Utterly rejecting science and reason, denying objective analysis in the presumption of overriding biases, breaking down naïve universalism into naïve relativism — these are paths to paralysis on their own. More than that, they are hysterical symptoms, because they often create problems out of thin air. Recently, a philosopher and mathematician submitted a hoax paper, Sokal-style, to a peer-reviewed gender studies journal in an attempt to demonstrate what they see as a problem “at the heart of academic fields like gender studies.” The idea was to write a nonsensical, postmodernish essay, and if the journal accepted it, that would indicate the field is intellectually bankrupt. Andrew Smart at Psychology Today instead wrote of the prank: “In many ways this academic hoax validates many of postmodernism’s main arguments.” And although Smart makes some informed points about problems in scientific rigor as a whole, he doesn’t hint at what the validation of postmodernism entails: should we abandon standards in journalism and scholarly integrity? Is the whole process of peer-review functionally untenable? Should we start embracing papers written without any intention of making sense, to look at knowledge concealed below the surface of jargon? The paper, “The conceptual penis,” doesn’t necessarily condemn the whole of gender studies; but, against Smart’s reasoning, we do in fact know that counterintuitive or highly heterodox theory is considered perfectly average.
There were other attacks on the hoax, from Slate, Salon and elsewhere. The criticisms, often valid for the particular essay, typically didn't move the conversation far enough. There is much more to this discussion. A 2006 paper from the International Journal of Evidence Based Healthcare, "Deconstructing the evidence-based discourse in health sciences," called the use of scientific evidence "fascist." In the abstract the authors state their allegiance to the work of Deleuze and Guattari. Real Peer Review, a Twitter account that collects abstracts from scholarly articles, regularly features essays from departments of women and gender studies, including a recent one from a Ph.D. student wherein the author identifies as a hippopotamus. Sure, the recent hoax paper doesn't really say anything, but it intensifies this much-needed debate. It brings out these two currents — reason and the rejection of reason — and demands a solution. And we know that lived experience is often going to be inconclusive.
Opening up lines of communication is a solution. One valid complaint is that gender studies seems too insulated, in a way in which chemistry, for instance, is not. Critiquing a whole field does ask us to genuinely immerse ourselves first, and this is a step toward tolerance: it is a step past the death of reason and the denial of science. It is a step that requires opening the bubble.
The modern infatuation with human biases, as well as Feyerabend's epistemological anarchism, upsets our faith in prevailing theories and in the idea that our policies and opinions should be guided by the latest discoveries from an anonymous laboratory. Putting politics first and assuming subjectivity is all-encompassing, we move past objective measures to compare belief systems and theories. However, isn't the whole operation of modern science designed to work within our means? The Kantian system set limits on human rationality, and most science is aligned with an acceptance of fallibility. As Harvard cognitive scientist Steven Pinker says, "to understand the world, we must cultivate work-arounds for our cognitive limitations, including skepticism, open debate, formal precision, and empirical tests, often requiring feats of ingenuity."
Pinker goes so far as to advocate for scientism. Others need not; but we must understand an academic field before utterly rejecting it. We must think we can understand each other, and live with each other. We must think there is a baseline framework that allows permanent cross-cultural correspondence — a shared form of life by which a Ukrainian can interpret a Russian, and a Cuban an American. The rejection of Homo sapiens commensurability, championed by people like Richard Spencer and those in identity politics, is a path to segregation and supremacy. We must reject Gorgian nihilism about communication, and the Presocratic relativism that camps our moral judgments in inert subjectivity. From one Weltanschauung to the next, our common humanity — which endures class, ethnicity, sex, gender — allows open debate across paradigms.
In the face of relativism, there is room for a nuanced middleground between Pinker’s scientism and the rising anti-science, anti-reason philosophy; Paul Feyerabend has sketched out a basic blueprint. Rather than condemning reason as a Hellenic germ of Western cultural supremacy, we need only adjust the theoretical model to incorporate the “new America of knowledge” into our critical faculty. It is the raison d’être of philosophers to present complicated things in a more digestible form; to “put everything before us,” so says Wittgenstein. Hopefully, people can reach their own conclusions, and embrace the communal human spirit as they do.
However, this may not be so convincing. It might be true that we have a competition of cosmologies: one that believes in reason and objectivity, one that thinks reason is callow and all things are subjective. These two perspectives may well be incommensurable. If I try to defend reason, I invariably must appeal to reasons, and thus argue circularly. If I try to claim "everything is subjective," I make a universal statement, and simultaneously contradict myself. Between begging the question and contradicting oneself, there is not much indication of where to go. Perhaps we just have to look at history, note the results of either course when it has been applied, and take that as a rhetorical indication of which path to choose.
3 die after attending HARD Summer rave near Fontana (http://lat.ms/2aKrN6q)
I just attended this concert, and lived. There were around 150,000 people. HARD Summer is an annual festival for electronic dance music, ordinarily thrown at the Los Angeles fairgrounds but moved to Fontana this year. Three twenty-year-olds died during the two-day event, presumably from drug overdoses; another two died last year, and eight have died from drug-related causes within LA County since 2006.
The intuition is simple: drugs are so popular at concerts because these are among the very few public places where one can actually engage in use without fearing legal consequences; few people get arrested while hidden in a crowd. Recreational effects are secondary, because recreational considerations apply to all gatherings. It's also a great way to make new friends, it factors into the culture, and so on. The criminalization of drugs means that they are taken covertly instead of publicly, and thus much more dangerously and ignorantly. So concert-goers, to satisfy their adventurousness and recreational fixation, must purchase their drugs on the streets and sneak them through security, instead of buying them safely inside from some reputable dealer. And there are cops on the premises, not medical practitioners and drug safety experts. (Cops that are especially incompetent with public health, as this article suggests.)
And so young adults die at these events, and their parents blame the management, the county, the city – for "failing to protect" the rave's attendees from pushers distributing drugs. A lawsuit was filed last month, citing negligence and wrongful death, in the case of a woman who consumed "what she believed to be pure ecstasy" and died of multiple drug intoxication. The promoter's owner, Live Nation, along with the city, the County Fair Association, and even the security contractor, Staff Pro, face the suit. There could possibly be a measure of protective failure. The management doesn't make promises or guarantee welfare to individual attendees, but the police, also known as public safety officers, were not able to effectively use CPR, according to a witness in the parking lot. In California, law enforcement is required, under Peace Officer Standards and Training, to be accredited to perform CPR. Yet, even if legal responsibility were on the officer, moral responsibility rests on no one.
The risk-taking behavior was entirely in the hands of the attendees. Health, as a consequence of personal risk-taking, is inherently a personal responsibility. When consuming drugs – which are infinitely more dangerous because of criminalization – the consumer also incurs a perceived risk (based on subjective probability), proportionate to several external factors. One of these factors is the hospitality and security of the local environment. If it could be shown that an assurance of protection had been made by nearby staff or officers, resulting in a reasonable estimation of security, a moral duty would have been created. No such guarantee existed, though. On a side note, the staffers even provided free water, which is actually rare at these events and vital for safe drug use. (But that is not an antecedent necessarily resulting in safety, nor even enough to lower the perceived risk substantially, such that otherwise the drugs would not have been consumed.)
The parents of the deceased twenty-year-olds are planning to sue whoever could legally be held accountable, but I think it's easy to see the difficulty in assigning meaningful blame. I know, also, that many people, more reasonable than the parents but not wholly impartial, want to blame the consumers themselves. I don't think it is an altogether correct judgment to blame drug consumers for their deaths, simply for trying to squeeze more pleasure out of a state-suppressed existence. There exists responsibility, but the blame is incalculable and worthless to investigate. Who can rightfully be held accountable? The event organizers, for trying to suppress drugs but inevitably failing at total prohibition? The pushers, in their harsh realism, living dangerously to supply wealthy and risky (but competent) young adults with their demands? The drug "kingpins," for functioning productively in an open market with high demand, with the full consent of all involved parties? The basement scientists, for discovering new chemical arrangements – agents that can be used medicinally as well as recreationally; agents that are inherently neutral to their alteration, route or variety of consumption? The Earth, and Nature itself, for creating the ingredients? I believe this chain of thought concludes with a puritanical condemnation of human nature: human nature as something to be escaped, battled with religion or values; at the very least, something to be vehemently detested by society. This is the conclusion of those who would sue others for their children's use of drugs.
There are those, too, who want simply to change the United States' drug culture: our alcoholism, our designer drug scene – and not necessarily through laws. It's worth pointing out, however, that whenever someone expresses the desire to change a cultural aspect, he or she can only be saying, in veiled language, that their ideology should replace the current one. There is no society; there are only individuals in that society; talking about battling "society" can only mean pushing a new ideology onto others. Society's temperament and exclusive nature can be chalked up solely to psychological states in the brains of its members. When recognized as a useful fiction to describe coordinated groups of people, instead of an emergent quality, cultural attitudes can be criticized. Otherwise, writing polemics about society, and not about individuals in the social sphere, makes clear an authoritarian intent: group all these people together and inflict my rules; empower me with merciless authority; subjugate dissenters to anonymity.
(For a brief aside, this is one of the idealistic problems of progressive movements: their unceasing condemnation of an unreal entity. The great majority of people blame their problems on society. There's a classic idiom, occasionally attributed to Nietzsche, that "God is in the details"; used to stress the significance of detail, it can also be read rather literally to describe man's desperate search for God. In early history, the Western world thought its God lived in the clouds above, e.g., the tower of Babel. After the invention of the telescope, the world moved its God back to outer space. Now, with our advanced technology, we can see billions of light-years into space – the observable universe has a radius of some 45 billion light-years – and still cannot find God. So the theological theories have changed (now God is "all around us," or "in another dimension," and he breaks the laws of physics and logic). The way that people brood on their social problems is similar. Without the ability to accurately pinpoint an antagonist, the invincible figure of Society is summoned to scapegoat problems that may not have any material instances. Thus, "institutional" is really a synonym for "individual.")
It is detestable to enforce, legally or ideologically, a new ideology upon others. But the true moral repugnancy of this entire situation rests not on event administrators, but on those who would sue others – and thus attempt to prevent another 149,997 people from having a good time next year – for a grand payout, because they cannot cope with their children's choices, after they themselves raised them.