- Why we fight over fiction Robin Hanson, Overcoming Bias
- Why money matters Scott Sumner, MoneyIllusion
- Stand up with Aristotle Irfan Khawaja, Policy of Truth
- Never reason from a fatality change Nick Cowen, NOL
Intellectuals You Should Know About
I read a lot. Wide, deep, and across quite a number of different fields. As a self-described ‘writer’ and ‘editor’, reading a lot both satisfies an intellectual desire and is professionally useful practice for familiarizing myself with various styles, voices and topics. A common tip for aspiring writers is to read someone they admire and try to imitate their style; at this, at least, I am somewhat successful, as a friend recently told me that my style reminded him of Deirdre McCloskey. Full of idolized admiration for Deirdre’s work, I couldn’t imagine higher praise.
As readers, the eternal curse of modernity is our laughable inability to keep up with the couple of million books that are published every year, not to mention everything written on blogs, in respectable outlets, or in magazines and journals. As consumers of the written word, we are completely outstripped, utterly defenseless and overwhelmingly inundated.
When I published my discussion of geographer and anthropologist Jared Diamond’s impressive work in September, I got a lot of astonished feedback from friends and family – including the friend who praised me for occasionally (accidentally…?) writing like McCloskey: “Wow,” he said, “I’ve never heard of him before!”
Huh, I thought. I wonder what other household names of public intellectuals are not read as much as they deserve.
My exact reaction of astonishment was more like a gaping “What?!”, betraying my wanna-know-everything attitude, slight elitism and writer lifestyle. Contrary to the belief that our time is one of all talking and no listening (well, all writing and no reading), it takes a vast amount of reading before you can produce anything that others want to read. Sure, anybody with a laptop and an internet connection can start a blog and flesh out their thoughts (I did so for years), but it takes knowledge to say something intelligent and interesting – knowledge acquired by extensive reading.
It also takes a lot of practice to develop a voice of one’s own. Authors with astonishing and recognizable writing styles are made, not born.
What, then, should you read?
In light of this surprise, I decided to make a list of intellectuals I would advise anybody to read. Note that this is not a list of the most important thinkers ever, nor is it a collection of the most profound academic contributions to various disciplines. Instead it’s a gathering of writers whose popular writing (often in addition to their rigorous academic work) is exactly that – popular. That means that a lot of others liked them (and if you’re anything like others, you might too) and, more importantly, that a lot of the smart people you meet are rather likely to refer to these authors or to the ideas contained in their work. Here are 11 authors I would consider household names and whose writing will make you a much smarter and more interesting person.
Jared Diamond
Let’s begin our list with the aforementioned Jared Diamond, whose trilogy on humanity is compulsory reading for pretty much everyone. This year he released Upheaval, which received very mixed responses and which I decided to skip after hearing his pitch on Sam Harris’ Making Sense podcast. Diamond’s publisher maintains that it is the third installment of his “monumental trilogy” on how civilizations rise and fall, but to me that was The World Until Yesterday:
- Guns, Germs and Steel is the book that definitely made Diamond a well-known name, the kind of Big Picture civilizational economic history we have recently seen in the work of Yuval Harari – the author of Sapiens: A Brief History of Humankind, that strangely boring book that everyone seems to be reading these days – or in the less well-known but more captivating The Big Ratchet by Columbia professor Ruth DeFries. If you like, you could describe this Pulitzer Prize-winning book as a well-written geographical account of why the West is rich and the Rest isn’t. If that’s your thing, read away.
- Collapse: How Societies Choose to Fail or Succeed, the book my September piece was mostly concerned with, is a dense story of many different human civilizations falling apart: Easter Islanders, Native Americans in the dry Southwest or Central America and, my favourite chapter, the Greenland Norse. Complementing it with the Fall of Civilizations podcast and Dan Carlin’s recent book The End is Always Near would make you ridiculously interesting to talk to in these hyper-catastrophist times. Upheaval is a natural extension of Collapse, so if you crave more, that one is for you.
- I would rather point to The World Until Yesterday for Diamond’s third gem as it is a deep dive into the lives of traditional societies in general, but in practice mostly New Guinean societies. Somehow, Diamond made anthropology exciting!
Paul Collier
Rapidly moving up in controversy, Paul Collier is an Oxford development economist about whose work most intellectuals have a distinctly firm opinion. His popular claim to fame rests on:
- Exodus, a very cool (and prescient!) take on global migration. Highly recommended.
- The Bottom Billion, for a plunge into global poverty and development economics. It might be slightly outdated (published in 2007), as many of the 60 failing countries he identifies have seen quite a lot of growth in the last decade.
I should also recommend his latest book, The Future of Capitalism, but I wasn’t very impressed with it. Still, in these times of political polarization, populist uprisings, urban-rural divides and worries about AI, it is a relevant read.
Whenever Collier speaks, you want to listen.
The Four Horsemen of Atheism (or “New Atheism”):
Christopher Hitchens, Sam Harris, Richard Dawkins, and Daniel Dennett
to which we should add the “one Horse-woman”, Ayaan Hirsi Ali, whom I’m ashamed to know only as “the wife of Niall Ferguson” (yes, my background is money and history, OK, not politics or religion…).
Together, these five brilliant minds may have helped many out of their religiosity, but their contributions loom much larger than that. As most of the Western world has gradually abandoned faith, its religious inclinations have turned to other areas: environmentalism (Mike Munger’s take on recycling never gets old!), invented hierarchies or social justice. The writings of these five horsemen can be hugely beneficial here too. Some recommended reading includes:
- Dawkins: The God Delusion (update: Outgrowing God: A Beginner’s Guide, just released last month, apparently at first intended for children/teenagers)
- Hitchens: God Is Not Great
- Dennett: Darwin’s Dangerous Idea
- Harris: The End of Faith (but I like The Moral Landscape even more. Disclaimer: I’m a voracious consumer of his Making Sense podcast)
- Hirsi Ali: Infidel: My Life
Speaking of Ferguson, as I’m a big financial history guy, I am shamelessly squeezing in this prolific writer, professor (well, Senior Fellow at the Hoover Institution nowadays) and public intellectual:
- The Ascent of Money, which was my introduction to Ferguson during my second year of uni and is still my favourite book of his
- The Cash Nexus, which I confess to not having read. Shame, I know.
- House of Rothschild (Money’s Prophets + The World’s Banker), the massive two-volume biography of the Rothschilds and an absolute treasure trove for 19th century European finance. Whenever I need some background info on that topic – or I find myself bored around a well-equipped academic library – I browse Ferguson’s diligent archival work.
- Empire: How Britain Made the Modern World, the controversial “maybe it wasn’t all bad…?” take on British imperialism. Predictably, Ferguson generated some outrage over this.
- Civilization, “a book that belongs at the more populist end of the Ferguson oeuvre” which we can also say about:
- The Great Degeneration (which I didn’t mind reading, but wasn’t overly impressed with).
I should also mention his two-volume biography of Henry Kissinger (first volume 2015, next probably finished next year), which I ignored (politics is boring) and his recent book The Square and the Tower, which I heard very bad things about – and so downgraded for now.
Steven Pinker
Ah, this Harvard cognitive scientist and linguist-turned-public-intellectual is a must-read. His top trilogy, which I voraciously consumed last fall, includes:
- The Blank Slate. The best description of this book I ever heard came from Charlotta Stern, a sociologist at Stockholm University: every sound argument against the “Nurture Only” idea that biology doesn’t matter, compiled into a single book. Yes, you want to read it.
- The Better Angels of Our Nature, a Big Picture humanity-scale look at violence, resurrecting Norbert Elias’ Civilizing Process theory to explain why we hurt and kill each other less than at probably any point in human history. Nassim Nicholas Taleb (see below) is decidedly not convinced.
- Enlightenment Now! The Case for Reason, Science, Humanism, and Progress. As if Better Angels wasn’t Big Picture enough, here’s the ultimate case for why humanity is doing pretty well, why doomsday sayers are wrong on every count and why we shouldn’t despair. Many of the topics of Better Angels recur in Enlightenment Now!, but I don’t regret reading both, as Pinker’s prose is easy to follow and his content well-sourced should you require more convincing. Originally a cognitive scientist, he has a ton more books you might wanna check out – The Language Instinct, for instance, ranks pretty high on my Next Up list:
- The Language Instinct
- How the Mind Works
- The Stuff of Thought
Matt Ridley
Speaking of optimistic people taking a Big Picture view of humanity, zoologist and science writer Matt Ridley is a must. Tall (like me!), Oxford-educated (like me!) and techno-optimist (like me!), no wonder I like him.
- Rational Optimist, a book in the same style as Pinker’s Enlightenment Now!, Hans Rosling’s Factfulness, Johan Norberg’s Progress, and Angus Deaton’s The Great Escape, briefly summarised as: Shit is getting better. Accept it.
- The Origins of Virtue, one of his earlier books from the 1990s that I haven’t read yet (together with Genome and The Red Queen), but which I imagine is similar in content to Nicholas Christakis’ 500-page Blueprint from earlier this year (which I have read).
- Ridley’s most recent book is The Evolution of Everything, from 2015, and we’ll blame his House of Lords duties for distracting him from his forthcoming book on innovation that I’ve written about before (How Innovation Works and Why It Flourishes in Freedom).
At last, How Innovation Works is scheduled for May 2020.
Nassim Nicholas Taleb
Oh, boy – here’s a controversial one. Frequently does he get into loud and hostile arguments with other high-profile intellectuals, and rarely does he pull any punches. His popular writing is found in the “Incerto” series – the Latin term for ‘doubt’ or ‘uncertainty’ that captures the core of Taleb’s work. Together, the books are described as “an investigation of luck, uncertainty, probability, opacity, human error, risk, disorder, and decision-making in a world we don’t understand”:
They are intended to push One Big Idea: that we frequently overlook how random the world is, ascribing causality where none belongs and overestimating what we can know from (relatively recent) past events. Black Swans, those proverbial unpredictable events, dominate the social sciences in Taleb’s view. While the 2,000-odd pages of the Incerto series may seem daunting, the books (and even the individual chapters) are designed not to fall very far from each other. The interested reader can, in other words, pick any one of them and work backwards in accordance with whatever is of interest. You wanna read all – or any – of them.
Having read Fooled by Randomness first, I’ve always held it in the highest regard. Be ready for a lot of sarcastic and frequently hostile (but thoughtful) objections to things you took for granted.
In sum: just bloody read more
Any selection of important contemporary intellectuals is arbitrary, highly skewed and super-unfair. There are more, many more, whose fantastic writings deserve attention. As I said, the eternal curse of modernity is our laughable inability to keep up with the avalanche of cool stuff written every year.
As readers, we are overrun – and the only thing you can do to keep up is to read more. Read widely.
Above are some amazing thinkers. Drop me a line or tweet me with readings you would add to a list like this.
The minimum wage induced spur of technological innovation ought not be praised
In a recent article at Reason.com, Christian Britschgi argues that “Government-mandated price hikes do a lot of things. Spurring technological innovation is not one of them”. This is in response to the self-serve kiosks in fast-food restaurants that seem to have appeared everywhere following increases in the minimum wage.
In essence, his argument is that minimum wages do not induce technological innovation. That is an empirical question. I am willing to consider that this is not the most significant of the adjustment margins to large changes in the minimum wage. But the work of Andrew Seltzer on the minimum wage during the Great Depression in the United States suggests that, at the very least, it ought not be discarded. Britschgi does not provide such evidence; he merely cites anecdotal pieces of support. Not that anecdotes are bad, but those cited come from the kiosk industry – hardly a neutral source.
That being said, this is not what I find contentious about the article. It is the implicit presupposition contained within it: that technological innovation is good.
No, technological innovation is not necessarily good. Firms use two inputs (capital and labor) and, given prices and rates of return, there is an optimal allocation of both. If you change the relative prices of the inputs, you change the optimal allocation. However, absent the regulated price change, the production decisions were already optimal. With the regulated price change, the production decisions are merely the best available under the constraint of working within a suboptimal framework. Thus, you are inducing a rate of technological innovation that is too fast relative to the optimal rate.
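To make the allocation argument concrete, here is a minimal sketch in Python. It assumes a Cobb-Douglas production function and invented prices, neither of which comes from Britschgi’s article or from any study cited here; it only illustrates the textbook result that a higher regulated wage shifts the cost-minimizing input mix toward capital.

```python
# Minimal sketch: cost-minimizing input mix for a Cobb-Douglas firm,
# before and after a regulated wage increase.
# The functional form and every number below are illustrative assumptions,
# not figures from the article or from any cited study.

def cost_minimizing_mix(target_output, wage, rental_rate, alpha=0.3):
    """Return (capital, labor, total_cost) producing `target_output` at
    minimum cost for Q = K^alpha * L^(1 - alpha).

    Textbook condition: the cost-minimizing capital/labor ratio equals
    (alpha / (1 - alpha)) * (wage / rental_rate).
    """
    k_over_l = (alpha / (1.0 - alpha)) * (wage / rental_rate)
    # Substitute K = k_over_l * L into Q = K^alpha * L^(1 - alpha): Q = k_over_l^alpha * L.
    labor = target_output / (k_over_l ** alpha)
    capital = k_over_l * labor
    cost = wage * labor + rental_rate * capital
    return capital, labor, cost

if __name__ == "__main__":
    Q, r = 100.0, 1.0
    for w in (10.0, 15.0):  # e.g. a 50% regulated wage hike
        K, L, C = cost_minimizing_mix(Q, w, r)
        print(f"wage={w:5.1f}  capital={K:7.1f}  labor={L:6.1f}  total cost={C:8.1f}")
```

The point is not the particular numbers: the pre-hike mix was already cost-minimizing, so the more capital-intensive mix chosen after the hike is only “optimal” relative to the distorted constraint.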
You may think that this is a little Luddite of me to say, but it is not. It is a complement to the idea that there is “skill-biased” technological change (see notably this article by Daron Acemoglu and this one by Bekman et al.). If the regulated wage change affects a particular segment of the labor force (say the unskilled portion – e.g. those working in fast-food restaurants), it changes the optimal quantity of that labor to hire. Sure, it bumps up demand for certain types of workers (e.g. machine designers and repairmen), but the outcome is still suboptimal. One should not presuppose that, ipso facto, technological change is good. What matters is the “optimal” rate of change. In this case, one can argue that the minimum wage (if pushed up too high) induces a rate of technological change that is too fast and acts to the disfavor of unskilled workers.
As such, yes, the artificial spurring of technological change should not be deemed desirable!
Pinker wrote a nice rejoinder
Steven Pinker, the Harvard professor, recently published Enlightenment Now. The Case for Reason, Science, Humanism, and Progress.
It is a fine book that basically sets out to do what its subtitle promises. It does so covering a wide range of ideas and topics, and discusses and rejects most arguments often used against Enlightenment thought, which Pinker equates with classical liberalism.
Those who know the work of Johan Norberg of the Cato Institute, the late Julian Simon’s writings, Jagdish Bhagwati’s magisterial In Defense of Globalization, or last but not least, Deirdre McCloskey’s Bourgeois Trilogy will be updated on the latest figures, but will not learn much in terms of arguments.
Those new to the debate, or searching for material to defend classical liberal ideas and values, will find this a very helpful book.
The death of reason
“In so far as their only recourse to that world is through what they see and do, we may want to say that after a revolution scientists are responding to a different world.”
Thomas Kuhn, The Structure of Scientific Revolutions p. 111
I can remember arguing with my cousin right after Michael Brown was shot. “It’s still unclear what happened,” I said, “based solely on testimony” — at that point, we were still waiting on the federal autopsy report by the Department of Justice. He said that in the video, you can clearly see Brown, back to the officer and with his hands up, as he is shot up to eight times.
My cousin doesn’t like police. I’m more ambivalent, but I’ve studied criminal justice for a few years now, and I thought that if both of us watched this video (no such video actually existed), it was probably I who would have the more nuanced grasp of what happened. So I said: “Well, I will look up this video, try and get a less biased take and get back to you.” He replied, sarcastically, “You can’t watch it without bias. We all have biases.”
And that seems to be the sentiment of the times: bias encompasses the human experience, it subsumes all judgments and perceptions. Biases are so rampant, in fact, that no objective analysis is possible. These biases may be cognitive, like confirmation bias, emotional fallacies or that phenomenon of constructive memory; or inductive, like selectivity or ignoring base probability; or, as has been common to think, ingrained into experience itself.
The thing about biases is that they are open to psychological evaluation. There are precedents for eliminating them. For instance, one common explanation of racism is that familiarity breeds acceptance, and unfamiliarity breeds intolerance (as Reason points out, people further from fracking sites have more negative opinions on the practice than people closer). So to curb racism (a sort of bias), children should interact with people outside of their singular ethnic group. More clinical methodology seeks to transform mental functions that are automatic into controlled ones, thereby introducing reflective measures into perception and reducing bias. Apart from these, there is that ancient Greek practice of reasoning, wherein patterns and evidence are used to generate logical conclusions.
If it were true that human bias is all-encompassing, and essentially insurmountable, the whole concept of critical thinking goes out the window. Not only do we lose the critical-rationalist, Popperian mode of discovery, but also Socratic dialectic, as essentially “higher truths” disappear from human lexicon.
The belief that biases are intrinsic to human judgment ignores psychological or philosophical methods to counter prejudice because it posits that objectivity itself is impossible. This viewpoint has been associated with “postmodern” schools of philosophy, such as those Dr. Rosi commented on (e.g., those of Derrida, Lacan, Foucault, Butler), although it’s worth pointing out that the analytic tradition, with its origins in Frege, Russell and Moore represents a far greater break from the previous, modern tradition of Descartes and Kant, and often reached similar conclusions as the Continentals.
Although theorists of the “postmodern” clique produced diverse claims about knowledge, society, and politics, the most famous figures are almost always associated with or incorporated into the political left. To make a useful simplification of viewpoints: it would seem that progressives have generally accepted Butlerian non-essentialism about gender and Foucauldian terminology (discourse and institutions). Derrida’s poststructuralist critique noted dichotomies and also claimed that the philosophical search for Logos has been patriarchal, almost neoreactionary. (The month before Donald Trump’s victory, the word patriarchy hit an all-time high on Google search.) It is not a far right conspiracy that European philosophers with strange theories have influenced and sought to influence American society; it is patent in the new political language.
Some people think of the postmodernists as all social constructivists, holding the theory that many of the categories and identifications we use in the world are social constructs without a human-independent nature (e.g., not natural kinds). Disciplines like anthropology and sociology have long since dipped their toes, and the broader academic community, too, relates that things like gender and race are social constructs. But the ideas can and do go further: “facts” themselves are open to interpretation on this view: to even assert a “fact” is just to affirm power of some sort. This worldview subsequently degrades the status of science into an extended apparatus for confirmation-bias, filling out the details of a committed ideology rather than providing us with new facts about the world. There can be no objectivity outside of a worldview.
Even though philosophy took a naturalistic turn with the philosopher W. V. O. Quine, seeing itself as integrating with and working alongside science, the criticisms of science as an establishment that emerged in the 1950s and 60s (and earlier) often disturbed its unique epistemic privilege in society: ideas that theory is underdetermined by evidence, that scientific progress is nonrational, that unconfirmed auxiliary hypotheses are required to conduct experiments and form theories, and that social norms play a large role in the process of justification all damaged the mythos of science as an exemplar of human rationality.
But once we have dismantled Science, what do we do next? Some critics have held up Nazi German eugenics and phrenology as examples of the damage that science can do to society (nevermind that we now consider them pseudoscience). Yet Lysenkoism and the history of astronomy and cosmology indicate that suppressing scientific discovery can too be deleterious. Austrian physicist and philosopher Paul Feyerabend instead wanted a free society — one where science had equal power as older, more spiritual forms of knowledge. He thought the model of rational science exemplified in Sir Karl Popper was inapplicable to the real machinery of scientific discovery, and the only methodological rule we could impose on science was: “anything goes.”
Feyerabend’s views are almost a caricature of postmodernism, although he denied the label “relativist,” opting instead for philosophical Dadaist. In his pluralism, there is no hierarchy of knowledge, and state power can even be introduced when necessary to break up scientific monopoly. Feyerabend, contra scientists like Richard Dawkins, thought that science was like an organized religion and therefore supported a separation of church and state as well as a separation of state and science. Here is a move forward for a society that has started distrusting the scientific method… but if this is what we should do post-science, it’s still unclear how to proceed. There are still queries for anyone who loathes the hegemony of science in the Western world.
For example, how does the investigation of crimes proceed without strict adherence to the latest scientific protocol? Presumably, Feyerabend didn’t want to privatize law enforcement, but science and the state are very intricately connected. In 2005, Congress authorized the National Academy of Sciences to form a committee and conduct a comprehensive study on contemporary legal science to identify community needs, evaluating laboratory executives, medical examiners, coroners, anthropologists, entomologists, odontologists, and various legal experts. Forensic science — scientific procedure applied to the field of law — exists for two practical goals: exoneration and prosecution. However, the Forensic Science Committee revealed that severe issues riddle forensics (e.g., bite mark analysis), and in their list of recommendations the top priority is establishing an independent federal entity to devise consistent standards and enforce regular practice.
For top scientists, this sort of centralized authority seems necessary to produce reliable work, and it entirely disagrees with Feyerabend’s emphasis on methodological pluralism. Barack Obama formed the National Commission on Forensic Science in 2013 to further investigate problems in the field, and only recently Attorney General Jeff Sessions said the Department of Justice will not renew the committee. It’s unclear now what forensic science will do to resolve its ongoing problems, but what is clear is that the American court system would fall apart without the possibility of appealing to scientific consensus (especially forensics), and that the only foreseeable way to solve the existing issues is through stricter methodology. (Just like with McDonalds, there are enforced standards so that the product is consistent wherever one orders.) More on this later.
So it doesn’t seem to be in the interest of things like due process to abandon science or completely separate it from state power. (It does, however, make sense to move forensic laboratories out from under direct administrative control, as the NAS report notes in Recommendation 4. This is, however, specifically to reduce bias.) In a culture where science is viewed as irrational, Eurocentric, ad hoc, and polluted with ideological motivations — or where Reason itself is seen as a particular hegemonic, imperial device to suppress different cultures — not only do we not know what to do, when we try to do things we lose elements of our civilization that everyone agrees are valuable.
Although Aristotle separated pathos, ethos and logos (adding that all informed each other), later philosophers like Feyerabend thought of reason as a sort of “practice,” with history and connotations like any other human activity, falling far short of sublime. One could no more justify reason outside of its European cosmology than the sacrificial rituals of the Aztecs outside of theirs. To communicate across paradigms, participants have to understand each other on a deep level, even becoming entirely new persons. When debates happen, they must happen on a principle of mutual respect and curiosity.
From this one can detect a bold argument for tolerance. Indeed, Feyerabend was heavily influenced by John Stuart Mill’s On Liberty. Maybe, in a world disillusioned with scientism and objective standards, the next cultural move is multilateral acceptance and tolerance for each others’ ideas.
This has not been the result of postmodern revelations, though. The 2016 election featured the victory of one psychopath over another, from two camps utterly consumed with vitriol for each other. Between Bernie Sanders, Donald Trump and Hillary Clinton, Americans drifted toward radicalization as the only establishment candidate seemed to offer the same noxious, warmongering mess of the previous few decades of administration. Politics has only polarized further since the inauguration. The alt-right, a nearly perfect symbol of cultural intolerance, is regular news for mainstream media. Trump acolytes physically brawl with black bloc Antifa in the same city of the 1960s Free Speech Movement. It seems to be the worst at universities. Analytic feminist philosophers asked for the retraction of a controversial paper, seemingly without reading it. Professors even get involved in student disputes, at Berkeley and more recently Evergreen. The names each side uses to attack each other (“fascist,” most prominently) — sometimes accurate, usually not — display a political divide with groups that increasingly refuse to argue their own side and prefer silencing their opposition.
There is not a tolerant left or tolerant right any longer, in the mainstream. We are witnessing only shades of authoritarianism, eager to destroy each other. And what is obvious is that the theories and tools of the postmodernists (post-structuralism, social constructivism, deconstruction, critical theory, relativism) are as useful for reactionary praxis as their usual role in left-wing circles. Says Casey Williams in the New York Times: “Trump’s playbook should be familiar to any student of critical theory and philosophy. It often feels like Trump has stolen our ideas and weaponized them.” The idea of the “post-truth” world originated in postmodern academia. It is the monster turning against Doctor Frankenstein.
Moral (cultural) relativism in particular only promises rejecting our shared humanity. It paralyzes our judgment on female genital mutilation, flogging, stoning, human and animal sacrifice, honor killing, Caste, underground sex trade. The afterbirth of Protagoras, cruelly resurrected once again, does not promise trials at Nuremberg, where the Allied powers appealed to something above and beyond written law to exact judgment on mass murderers. It does not promise justice for the ethnic cleansers in Srebrenica, as the United Nations is helpless to impose a tribunal from outside Bosnia-Herzegovina. Today, this moral pessimism laughs at the phrase “humanitarian crisis,” and Western efforts to change the material conditions of fleeing Iraqis, Afghans, Libyans, Syrians, Venezuelans, North Koreans…
In the absence of universal morality, and the introduction of subjective reality, the vacuum will be filled with something much more awful. And we should be afraid of this because tolerance has not emerged as a replacement. When Harry Potter first encounters Voldemort face-to-scalp, the Dark Lord tells the boy “There is no good and evil. There is only power… and those too weak to seek it.” With the breakdown of concrete moral categories, Feyerabend’s motto — anything goes — is perverted. Voldemort has been compared to Plato’s archetype of the tyrant from the Republic: “It will commit any foul murder, and there is no food it refuses to eat. In a word, it omits no act of folly or shamelessness” … “he is purged of self-discipline and is filled with self-imposed madness.”
Voldemort is the Platonic appetite in the same way he is the psychoanalytic id. Freud’s das Es is able to admit of contradictions, to violate Aristotle’s fundamental laws of logic. It is so base, and removed from the ordinary world of reason, that it follows its own rules we would find utterly abhorrent or impossible. But it is not difficult to imagine that the murder of evidence-based reasoning will result in Death Eater politics. The ego is our rational faculty, adapted to deal with reality; with the death of reason, all that exists is vicious criticism and unfettered libertinism.
Plato predicts Voldemort with the image of the tyrant, and also with one of his primary interlocutors, Thrasymachus, when the sophist opens with “justice is nothing other than the advantage of the stronger.” The one thing Voldemort admires about The Boy Who Lived is his bravery, the trait they share in common. This trait is missing in his Death Eaters. In the fourth novel the Dark Lord is cruel to his reunited followers for abandoning him and losing faith; their cowardice reveals the fundamental logic of his power: his disciples are not true devotees, but opportunists, weak on their own merit and drawn like moths to every Avada Kedavra. Likewise students flock to postmodern relativism to justify their own beliefs when the evidence is an obstacle.
Relativism gives us moral paralysis, allowing in darkness. Another possible move after relativism is supremacy. One look at Richard Spencer’s Twitter demonstrates the incorrigible tenet of the alt-right: the alleged incompatibility of cultures, ethnicities, races: that different groups of humans simply can not get along together. The Final Solution is not about extermination anymore but segregated nationalism. Spencer’s audience is almost entirely men who loathe the current state of things, who share far-reaching conspiracy theories, and despise globalism.
The left, too, creates conspiracies, imagining a bourgeois corporate conglomerate that enlists economists and brainwashes through history books to normalize capitalism; for this reason they despise globalism as well, saying it impoverishes other countries or destroys cultural autonomy. For the alt-right, it is the Jews, and George Soros, who control us; for the burgeoning socialist left, it is the elites, the one-percent. Our minds are not free; fortunately, they will happily supply Übermenschen, in the form of statesmen or critical theorists, to save us from our degeneracy or our false consciousness.
Without the commitment to reasoned debate, tribalism has deepened the polarization and the lack of humility. Each side also accepts science selectively, if it does not question its very justification. The privileged status that the “scientific method” maintains in polite society is denied when convenient; whether it is climate science, evolutionary psychology, sociology, genetics, biology, anatomy or, especially, economics: one side is outright rejecting it, without studying the material enough to immerse oneself in what could be promising knowledge (as Feyerabend urged, and the breakdown of rationality could have encouraged). And ultimately, equal protection, one tenet of individualist thought that allows for multiplicity, is entirely rejected by both: we should be treated differently as humans, often because of the color of our skin.
Relativism and carelessness for standards and communication has given us supremacy and tribalism. It has divided rather than united. Voldemort’s chaotic violence is one possible outcome of rejecting reason as an institution, and it beckons to either political alliance. Are there any examples in Harry Potter of the alternative, Feyerabendian tolerance? Not quite. However, Hermione Granger serves as the Dark Lord’s foil, and gives us a model of reason that is not as archaic as the enemies of rationality would like to suggest. In Against Method (1975), Feyerabend compares different ways rationality has been interpreted alongside practice: in an idealist way, in which reason “completely governs” research, or a naturalist way, in which reason is “completely determined by” research. Taking elements of each, he arrives at an intersection in which one can change the other, both “parts of a single dialectical process.”
“The suggestion can be illustrated by the relation between a map and the adventures of a person using it or by the relation between an artisan and his instruments. Originally maps were constructed as images of and guides to reality and so, presumably, was reason. But maps, like reason, contain idealizations (Hecataeus of Miletus, for examples, imposed the general outlines of Anaximander’s cosmology on his account of the occupied world and represented continents by geometrical figures). The wanderer uses the map to find his way but he also corrects it as he proceeds, removing old idealizations and introducing new ones. Using the map no matter what will soon get him into trouble. But it is better to have maps than to proceed without them. In the same way, the example says, reason without the guidance of a practice will lead us astray while a practice is vastly improved by the addition of reason.” p. 233
Christopher Hitchens pointed out that Granger sounds like Bertrand Russell at times, like this quote about the Resurrection Stone: “You can claim that anything is real if the only basis for believing in it is that nobody has proven it doesn’t exist.” Granger is often the embodiment of anemic analytic philosophy, the institution of order, a disciple for the Ministry of Magic. However, though initially law-abiding, she quickly learns with Potter and Weasley the pleasures of rule-breaking. From the first book onward, she is constantly at odds with the de facto norms of the university, becoming more rebellious as time goes on. It is her levelheaded foundation, but ability to transgress rules, that gives her an astute semi-deontological, semi-utilitarian calculus capable of saving the lives of her friends from the dark arts, and helping to defeat the tyranny of Voldemort foretold by Socrates.
Granger presents a model of reason like Feyerabend’s map analogy. Although pure reason gives us an outline of how to think about things, it is not a static or complete blueprint, and it must be fleshed out with experience, risk-taking, discovery, failure, loss, trauma, pleasure, offense, criticism, and occasional transgressions past the foreseeable limits. Adding these addenda to our heuristics means that we explore a more diverse account of thinking about things and moving around in the world.
When reason is increasingly seen as patriarchal, Western, and imperialist, the only thing consistently offered as a replacement is something like lived experience. Some form of this idea is at least a century old, going back to Husserl, and still modest by reason’s Greco-Roman standards. Yet lived experience has always been pivotal to reason; we only need adjust our popular model. And we can see that we need not reject one or the other entirely. Another critique of reason says it is foolhardy, limiting, antiquated; this is a perversion of its abilities, and plays to justify the first criticism. We can see that there is room within reason for other pursuits and virtues, picked up along the way.
The emphasis on lived experience, which predominantly comes from the political left, is also antithetical to the cause of “social progress.” Those sympathetic to social theory, particularly the cultural leakage of the strong programme, are constantly torn between claiming (a) science is irrational, and can thus be countered by lived experience (or whatnot) or (b) science may be rational but reason itself is a tool of patriarchy and white supremacy and cannot be universal. (If you haven’t seen either of these claims very frequently, and think them a strawman, you have not been following university protests and editorials. Or radical Twitter: ex., ex., ex., ex.) Of course, as in Freud, this is an example of kettle-logic: the signal of a very strong resistance. We see, though, that we need not accept nor deny these claims and lose anything. Reason need not be stagnant nor all-pervasive, and indeed we’ve been critiquing its limits since 1781.
Outright denying the process of science — whether the model is conjectures and refutations or something less stale — ignores that there is no single uniform body of science. Denial also dismisses the most powerful tool for making difficult empirical decisions. Michael Brown’s death was instantly a political affair, with implications for broader social life. The event has completely changed the face of American social issues. The first autopsy report, from St. Louis County, indicated that Brown was shot at close range in the hand, during an encounter with Officer Darren Wilson. The second independent report commissioned by the family concluded the first shot had not in fact been at close range. After the disagreement with my cousin, the Department of Justice released the final investigation report, and determined that material in the hand wound was consistent with gun residue from an up-close encounter.
Prior to the report, the best evidence available as to what happened in Missouri on August 9, 2014, was the ground footage after the shooting and testimonies from the officer and Ferguson residents at the scene. There are two ways to approach the incident: reason or lived experience. The latter route will lead to ambiguities. Brown’s friend Dorian Johnson and another witness reported that Officer Wilson fired his weapon first at range, under no threat, then pursued Brown out of his vehicle, until Brown turned with his hands in the air to surrender. However, in the St. Louis grand jury half a dozen (African-American) eyewitnesses corroborated Wilson’s account: that Brown did not have his hands raised and was moving toward Wilson. In which direction does “lived experience” tell us to go, then? A new moral maxim — the duty to believe people — will lead to no non-arbitrary conclusion. (And a duty to “always believe x,” where x is a closed group, e.g. victims, will put the cart before the horse.) It appears that, in a case like this, treating evidence as objective is the only solution.
Introducing ad hoc hypotheses, e.g., the Justice Department and the county examiner are corrupt, shifts the approach into one that uses induction, and leaves behind lived experience (and also ignores how forensic anthropology is actually done). This is the introduction of, indeed, scientific standards. (By looking at incentives for lying it might also employ findings from public choice theory, psychology, behavioral economics, etc.) So the personal experience method creates unresolvable ambiguities, and presumably will eventually grant some allowance to scientific procedure.
If we don’t posit a baseline-rationality — Hermione Granger pre-Hogwarts — our ability to critique things at all disappears. Utterly rejecting science and reason, denying objective analysis in the presumption of overriding biases, breaking down naïve universalism into naïve relativism — these are paths to paralysis on their own. More than that, they are hysterical symptoms, because they often create problems out of thin air. Recently, a philosopher and mathematician submitted a hoax paper, Sokal-style, to a peer-reviewed gender studies journal in an attempt to demonstrate what they see as a problem “at the heart of academic fields like gender studies.” The idea was to write a nonsensical, postmodernish essay, and if the journal accepted it, that would indicate the field is intellectually bankrupt. Andrew Smart at Psychology Today instead wrote of the prank: “In many ways this academic hoax validates many of postmodernism’s main arguments.” And although Smart makes some informed points about problems in scientific rigor as a whole, he doesn’t hint at what the validation of postmodernism entails: should we abandon standards in journalism and scholarly integrity? Is the whole process of peer-review functionally untenable? Should we start embracing papers written without any intention of making sense, to look at knowledge concealed below the surface of jargon? The paper, “The conceptual penis,” doesn’t necessarily condemn the whole of gender studies; but, against Smart’s reasoning, we do in fact know that counterintuitive or highly heterodox theory is considered perfectly average.
There were other attacks on the hoax, from Slate, Salon and elsewhere. Criticisms, often valid for the particular essay, typically didn’t move the conversation far enough. There is much more to this discussion. A 2006 paper from the International Journal of Evidence Based Healthcare, “Deconstructing the evidence-based discourse in health sciences,” called the use of scientific evidence “fascist.” In the abstract the authors state their allegiance to the work of Deleuze and Guattari. Real Peer Review, a Twitter account that collects abstracts from scholarly articles, regularly features essays from departments of women and gender studies, including a recent one from a Ph.D. student wherein the author identifies as a hippopotamus. Sure, the recent hoax paper doesn’t really say anything, but it intensifies this much-needed debate. It brings out these two currents — reason and the rejection of reason — and demands a solution. And we know that lived experience is going to be often inconclusive.
Opening up lines of communication is a solution. One valid complaint is that gender studies seems too insulated, in a way in which chemistry, for instance, is not. Critiquing a whole field does ask us to genuinely immerse ourselves first, and this is a step toward tolerance: it is a step past the death of reason and the denial of science. It is a step that requires opening the bubble.
The modern infatuation with human biases, as well as Feyerabend’s epistemological anarchism, upsets our faith in prevailing theories and in the idea that our policies and opinions should be guided by the latest discoveries from an anonymous laboratory. Putting politics first and assuming subjectivity is all-encompassing, we move past objective measures to compare belief systems and theories. However, isn’t the whole operation of modern science designed to work within our means? The Kantian system set limits on human rationality, and most science is aligned with an acceptance of fallibility. As Harvard cognitive scientist Steven Pinker says, “to understand the world, we must cultivate work-arounds for our cognitive limitations, including skepticism, open debate, formal precision, and empirical tests, often requiring feats of ingenuity.”
Pinker goes so far as to advocate for scientism. Others need not; but we must understand an academic field before utterly rejecting it. We must think we can understand each other, and live with each other. We must think there is a baseline framework that allows permanent cross-cultural correspondence — a shared form of life which means a Ukrainian can interpret a Russian and a Cuban an American. The rejection of Homo sapiens commensurability, championed by people like Richard Spencer and those in identity politics, is a path to segregation and supremacy. We must reject Gorgian nihilism about communication, and the Presocratic relativism that camps our moral judgments in inert subjectivity. From one Weltanschauung to the next, our common humanity — which endures class, ethnicity, sex, gender — allows open debate across paradigms.
In the face of relativism, there is room for a nuanced middleground between Pinker’s scientism and the rising anti-science, anti-reason philosophy; Paul Feyerabend has sketched out a basic blueprint. Rather than condemning reason as a Hellenic germ of Western cultural supremacy, we need only adjust the theoretical model to incorporate the “new America of knowledge” into our critical faculty. It is the raison d’être of philosophers to present complicated things in a more digestible form; to “put everything before us,” so says Wittgenstein. Hopefully, people can reach their own conclusions, and embrace the communal human spirit as they do.
However, this may not be so convincing. It might be true that we have a competition of cosmologies: one that believes in reason and objectivity, one that thinks reason is callow and all things are subjective. These two perspectives may well be incommensurable. If I try to defend reason, I invariably must appeal to reasons, and thus argue circularly. If I try to claim “everything is subjective,” I make a universal statement, and simultaneously contradict myself. Between begging the question and contradicting oneself, there is not much indication of where to go. Perhaps we just have to look at history, note the results whenever either course has been applied, and take that as a rhetorical indication of which path to follow.
Emotion Trumps Reason in Santa Monica
Santa Monica Community College caused a stir recently when it proposed offering “self-funded classes.” These would be extra sessions offered at $180 per unit, or $540 for a three-unit class, versus the normal $43 per unit.
Like most California Community Colleges, SMCC has drastically cut class sections. Students are frustrated and angry. The new proposal was met with widespread criticism. There were student demonstrations and even a whiff of pepper spray. Why?
If you asked any of the student demonstrators why, their answer would surely be, “it’s not fair!” It’s not fair that rich kids get in while others are left out.
Sad to say, gut reactions based on crude emotions are the best many college students can muster these days. No wonder, when so many of their professors bar nuanced analysis from their classrooms in favor of rants about “activism” or “social justice.”
So let’s apply a little analysis here. How would the proposed new offering affect lower-income students? Assuming the new classes do not divert resources away from the low-priced offerings – instructors, classrooms and such – these students should notice only one difference: less competition for scarce low-priced seats. The very problem they complain about most – lack of access to the classes they need to finish up their degree – would be lessened.
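As a back-of-the-envelope illustration of that point, here is a minimal sketch in Python. The seat and applicant counts are invented for illustration, and the lottery rationing rule is my assumption, not SMCC policy; the sketch only shows that every student who opts into a self-funded section improves the odds for those who stay in the low-priced queue.

```python
# Toy illustration: when some students opt into pricier self-funded sections,
# the odds of landing one of the scarce low-priced seats improve for everyone
# who stays in the low-priced queue. All numbers are invented for illustration,
# and the lottery rationing rule is an assumption, not SMCC policy.

def chance_of_low_priced_seat(low_priced_seats, applicants, opt_outs):
    """Probability that a remaining applicant gets a low-priced seat,
    assuming seats are rationed by lottery among students who do not
    opt into the self-funded sections."""
    remaining = applicants - opt_outs
    return min(1.0, low_priced_seats / remaining)

seats, applicants = 300, 500
for opt_outs in (0, 50, 100):
    p = chance_of_low_priced_seat(seats, applicants, opt_outs)
    print(f"{opt_outs:3d} students pay full cost -> "
          f"chance of a low-priced seat for the rest: {p:.0%}")
```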
It’s the same old story – jealousy trumps reason. Studies have shown that people would prefer to earn $50,000 in a situation where their peers earn $25,000 over earning $75,000 when most people earn $100,000. (But it could be that the subjects of these studies are smart enough to realize that the premise of equal living costs in both situations is unrealistic because with more income, prices will be bid up.)
Critical thinking. Analytical thinking. These are the skills that a good economics department emphasizes – skills that are so valuable and so sadly lacking these days.
Iraq, War, and the Litmus Test of Rationality: Ron Paul Edition
The Republican Presidential debates have been on TV for the past, what, five or six months now, and I am proud to admit that I haven’t watched a single one of them. By my own definition I am a left-leaning libertarian who thinks that free markets, limited government, and a humble foreign policy are the best tools to achieve social harmony, prosperity, and world peace.
So I had basically made up my mind on who I was going to vote for prior to the whole campaign season: Gary Johnson. Now, co-editor Fred Foldvary has some very pertinent critiques of Governor Johnson’s tax policy proposals, but on the whole, I still think he is by far the best man to get my vote.
Because Gary Johnson has no baggage, has a solid record while in office, and has a personality that does not attract the worst of the worst to his message, he was essentially dead on arrival when he announced his Presidential campaign. The media and its horse race would have none of it. So he bolted the Republican Party and is now fighting for the Libertarian Party’s nomination.
I think this is a big mistake. I think he should have stayed in the Republican Party and planned ahead for 2016. Now, he is going to be the next Ron Paul, who also bolted the Republican Party to run as a Libertarian in 1988. That move cost Paul politically, and it is a shame that Johnson was too hot-headed about the national Party apparatus’ dismissal of his campaign.