Is Fundamentalism a problem?

Today when a terrorist attack happens, the press too often avoids naming the perpetrators and instead hides behind noncommittal phrases like “car hits people.” But not long ago, the press usually blamed fundamentalists for terrorist attacks.

The name fundamentalist originated, interestingly enough, in Protestant circles in the US. Only much later was it used to describe other religions, and then mostly Muslims. Among Protestants, the name fundamentalist was used to designate people opposed to theological liberalism. Let me explain. With the Enlightenment, an understanding grew in theological circles that modern man could no longer believe in the supernatural aspects of the Bible. The answer was theological liberalism, a theology that tried to maintain the “historical Jesus” while stripping him of anything science couldn’t explain. Fundamentalism was an answer to this. Fundamentalists believed that some things are, well… fundamental! You can’t have Jesus without the virgin birth, the many miracles, the resurrection, and the ascension. That would not be Jesus at all! In other words, it is a matter of first principles: either science comes first and faith must submit, or faith comes before science.

The great observation made by fundamentalist theologian Cornelius Van Til is that fundamentalist Protestants are not the only fundamentalists! Everybody has fundamentals. Everybody has basic principles that are themselves not negotiable. If you keep asking people “why,” eventually they will answer “because it is so.”

If everybody has starting points that are themselves not open to further explanation, then our problem (and the problem with terrorism) is not fundamentalism per se. Everybody has fundamentals. The question is what kind of fundamentals you have: fundamentals that tell you about the holiness of human life, or fundamentals that tell you that somehow murdering people is OK or even commendable?

Pinker wrote a nice rejoinder

Steven Pinker, the Harvard professor, recently published Enlightenment Now: The Case for Reason, Science, Humanism, and Progress.

It is a fine book that basically sets out to do what its subtitle promises. It does so covering a wide range of ideas and topics, and discusses and rejects most arguments often used against Enlightenment thought, which Pinker equates with classical liberalism.

Those who know the work of Johan Norberg of the Cato Institute, the late Julian Simon’s writings, Jagdish Bhagwati’s magisterial In Defense of Globalization, or last but not least, Deirdre McCloskey’s Bourgeois Trilogy will be updated on the latest figures, but will not learn much in terms of arguments.

Those new to the debate, or searching for material to defend classical liberal ideas and values, will find this a very helpful book.

The death of reason

“In so far as their only recourse to that world is through what they see and do, we may want to say that after a revolution scientists are responding to a different world.”

Thomas Kuhn, The Structure of Scientific Revolutions p. 111

I can remember arguing with my cousin right after Michael Brown was shot. “It’s still unclear what happened,” I said, “based solely on testimony” — at that point, we were still waiting on the federal autopsy report from the Department of Justice. He said that in the video, you can clearly see Brown, back to the officer and with his hands up, as he is shot up to eight times.

My cousin doesn’t like police. I’m more ambivalent, but I’ve studied criminal justice for a few years now, and I thought that if both of us watched this video (no such video actually existed), it was probably I who would have the more nuanced grasp of what happened. So I said: “Well, I will look up this video, try and get a less biased take and get back to you.” He replied, sarcastically, “You can’t watch it without bias. We all have biases.”

And that seems to be the sentiment of the times: bias encompasses the human experience; it subsumes all judgments and perceptions. Biases are so rampant, in fact, that no objective analysis is possible. These biases may be cognitive, like confirmation bias, emotional fallacies, or the phenomenon of constructive memory; or inductive, like selectivity or ignoring base probability; or, as has been common to think, ingrained into experience itself.

The thing about biases is that they are open to psychological evaluation. There are precedents for eliminating them. For instance, one common explanation of racism is that familiarity breeds acceptance, and unfamiliarity breeds intolerance (as Reason points out, people farther from fracking sites have more negative opinions of the practice than people closer to them). So to curb racism (a sort of bias), children should interact with people outside of their own ethnic group. More clinical methodology seeks to transform mental functions from automatic to controlled, and thereby introduce reflective measures into perception, reducing bias. Apart from these, there is that ancient Greek practice of reasoning, wherein patterns and evidence are used to generate logical conclusions.

If it were true that human bias is all-encompassing and essentially insurmountable, the whole concept of critical thinking would go out the window. Not only do we lose the critical-rationalist, Popperian mode of discovery, but also Socratic dialectic, as “higher truths” essentially disappear from the human lexicon.

The belief that biases are intrinsic to human judgment ignores psychological and philosophical methods to counter prejudice because it posits that objectivity itself is impossible. This viewpoint has been associated with “postmodern” schools of philosophy, such as those Dr. Rosi commented on (e.g., those of Derrida, Lacan, Foucault, Butler), although it’s worth pointing out that the analytic tradition, with its origins in Frege, Russell, and Moore, represents a far greater break from the previous, modern tradition of Descartes and Kant, and often reached similar conclusions to the Continentals.

Although theorists of the “postmodern” clique produced diverse claims about knowledge, society, and politics, the most famous figures are almost always associated with or incorporated into the political left. To make a useful simplification of viewpoints: it would seem that progressives have generally accepted Butlerian non-essentialism about gender and Foucauldian terminology (discourse and institutions). Derrida’s poststructuralist critique noted dichotomies and also claimed that the philosophical search for Logos has been patriarchal, almost neoreactionary. (The month before Donald Trump’s victory, the word patriarchy hit an all-time high in Google searches.) It is not a far-right conspiracy theory that European philosophers with strange theories have influenced and sought to influence American society; it is patent in the new political language.

Some people think of the postmodernists as all social constructivists, holding the theory that many of the categories and identifications we use in the world are social constructs without a human-independent nature (e.g., not natural kinds). Disciplines like anthropology and sociology have long since dipped their toes, and the broader academic community, too, accepts that things like gender and race are social constructs. But the ideas can and do go further: “facts” themselves are open to interpretation on this view; to even assert a “fact” is just to affirm power of some sort. This worldview subsequently degrades the status of science into an extended apparatus for confirmation bias, filling out the details of a committed ideology rather than providing us with new facts about the world. There can be no objectivity outside of a worldview.

Even though philosophy took a naturalistic turn with the philosopher W. V. O. Quine, seeing itself as integrating with and working alongside science, the criticisms of science as an establishment that emerged in the 1950s and 60s (and earlier) often disturbed science’s unique epistemic privilege in society: ideas that theory is underdetermined by evidence, that scientific progress is nonrational, that unconfirmed auxiliary hypotheses are required to conduct experiments and form theories, and that social norms play a large role in the process of justification all damaged the mythos of science as an exemplar of human rationality.

But once we have dismantled Science, what do we do next? Some critics have held up Nazi German eugenics and phrenology as examples of the damage that science can do to society (never mind that we now consider them pseudoscience). Yet Lysenkoism and the history of astronomy and cosmology indicate that suppressing scientific discovery can also be deleterious. The Austrian physicist and philosopher Paul Feyerabend instead wanted a free society — one where science had equal power with older, more spiritual forms of knowledge. He thought the model of rational science exemplified by Sir Karl Popper was inapplicable to the real machinery of scientific discovery, and that the only methodological rule we could impose on science was: “anything goes.”

Feyerabend’s views are almost a caricature of postmodernism, although he denied the label “relativist,” opting instead for philosophical Dadaist. In his pluralism, there is no hierarchy of knowledge, and state power can even be introduced when necessary to break up scientific monopoly. Feyerabend, contra scientists like Richard Dawkins, thought that science was like an organized religion and therefore supported a separation of science and state just as there is a separation of church and state. Here is a way forward for a society that has started distrusting the scientific method… but if this is what we should do post-science, it’s still unclear how to proceed. There are still questions for anyone who loathes the hegemony of science in the Western world.

For example, how does the investigation of crimes proceed without strict adherence to the latest scientific protocol? Presumably, Feyerabend didn’t want to privatize law enforcement, but science and the state are very intricately connected. In 2005, Congress authorized the National Academy of Sciences to form a committee and conduct a comprehensive study of contemporary forensic science to identify community needs, consulting laboratory executives, medical examiners, coroners, anthropologists, entomologists, odontologists, and various legal experts. Forensic science — scientific procedure applied to the field of law — exists for two practical goals: exoneration and prosecution. However, the Forensic Science Committee revealed that severe issues riddle forensics (e.g., bite mark analysis), and in its list of recommendations the top priority is establishing an independent federal entity to devise consistent standards and enforce regular practice.

For top scientists, this sort of centralized authority seems necessary to produce reliable work, and it entirely disagrees with Feyerabend’s emphasis on methodological pluralism. Barack Obama formed the National Commission on Forensic Science in 2013 to further investigate problems in the field, and only recently Attorney General Jeff Sessions said the Department of Justice would not renew the commission. It’s unclear now what forensic science will do to resolve its ongoing problems, but what is clear is that the American court system would fall apart without the possibility of appealing to scientific consensus (especially in forensics), and that the only foreseeable way to solve the existing issues is through stricter methodology. (Just as with McDonald’s, there are enforced standards so that the product is consistent wherever one orders.) More on this later.

So it doesn’t seem to be in the interest of things like due process to abandon science or completely separate it from state power. (It does, however, make sense to move forensic laboratories out from under direct administrative control, as the NAS report notes in Recommendation 4. This is specifically to reduce bias.) In a culture where science is viewed as irrational, Eurocentric, ad hoc, and polluted with ideological motivations — or where Reason itself is seen as a particular hegemonic, imperial device to suppress different cultures — not only do we not know what to do, but when we try to do things we lose elements of our civilization that everyone agrees are valuable.

Although Aristotle separated pathos, ethos, and logos (adding that each informs the others), later philosophers like Feyerabend thought of reason as a sort of “practice,” with a history and connotations like any other human activity, falling far short of the sublime. One could no more justify reason outside of its European cosmology than the sacrificial rituals of the Aztecs outside of theirs. To communicate across paradigms, participants have to understand each other on a deep level, even becoming entirely new persons. When debates happen, they must happen on a principle of mutual respect and curiosity.

From this one can detect a bold argument for tolerance. Indeed, Feyerabend was heavily influenced by John Stuart Mill’s On Liberty. Maybe, in a world disillusioned with scientism and objective standards, the next cultural move is multilateral acceptance and tolerance for each other’s ideas.

This has not been the result of postmodern revelations, though. The 2016 election featured the victory of one psychopath over another, from two camps utterly consumed with vitriol for each other. Between Bernie Sanders, Donald Trump, and Hillary Clinton, Americans drifted toward radicalization as the only establishment candidate seemed to offer the same noxious, warmongering mess of the previous few decades of administration. Politics has only polarized further since the inauguration. The alt-right, a nearly perfect symbol of cultural intolerance, is regular news for mainstream media. Trump acolytes physically brawl with black bloc Antifa in the same city that hosted the 1960s Free Speech Movement. It seems to be worst at universities. Analytic feminist philosophers asked for the retraction of a controversial paper, seemingly without reading it. Professors even get involved in student disputes, at Berkeley and more recently Evergreen. The names each side uses to attack the other (“fascist,” most prominently) — sometimes accurate, usually not — display a political divide with groups that increasingly refuse to argue their own side and prefer silencing their opposition.

There is no tolerant left or tolerant right any longer, in the mainstream. We are witnessing only shades of authoritarianism, eager to destroy each other. And what is obvious is that the theories and tools of the postmodernists (post-structuralism, social constructivism, deconstruction, critical theory, relativism) are as useful for reactionary praxis as for their usual role in left-wing circles. Says Casey Williams in the New York Times: “Trump’s playbook should be familiar to any student of critical theory and philosophy. It often feels like Trump has stolen our ideas and weaponized them.” The idea of the “post-truth” world originated in postmodern academia. It is the monster turning against Doctor Frankenstein.

Moral (cultural) relativism in particular promises only a rejection of our shared humanity. It paralyzes our judgment on female genital mutilation, flogging, stoning, human and animal sacrifice, honor killing, caste, the underground sex trade. The afterbirth of Protagoras, cruelly resurrected once again, does not promise trials at Nuremberg, where the Allied powers appealed to something above and beyond written law to exact judgment on mass murderers. It does not promise justice for the ethnic cleansers of Srebrenica, as the United Nations is helpless to impose a tribunal from outside Bosnia-Herzegovina. Today, this moral pessimism laughs at the phrase “humanitarian crisis,” and at Western efforts to change the material conditions of fleeing Iraqis, Afghans, Libyans, Syrians, Venezuelans, North Koreans…

In the absence of universal morality, and the introduction of subjective reality, the vacuum will be filled with something much more awful. And we should be afraid of this because tolerance has not emerged as a replacement. When Harry Potter first encounters Voldemort face-to-scalp, the Dark Lord tells the boy “There is no good and evil. There is only power… and those too weak to seek it.” With the breakdown of concrete moral categories, Feyerabend’s motto — anything goes — is perverted. Voldemort has been compared to Plato’s archetype of the tyrant from the Republic: “It will commit any foul murder, and there is no food it refuses to eat. In a word, it omits no act of folly or shamelessness” … “he is purged of self-discipline and is filled with self-imposed madness.”

Voldemort is the Platonic appetite in the same way he is the psychoanalytic id. Freud’s das Es is able to admit of contradictions, to violate Aristotle’s fundamental laws of logic. It is so base, and removed from the ordinary world of reason, that it follows its own rules we would find utterly abhorrent or impossible. But it is not difficult to imagine that the murder of evidence-based reasoning will result in Death Eater politics. The ego is our rational faculty, adapted to deal with reality; with the death of reason, all that exists is vicious criticism and unfettered libertinism.

Plato predicts Voldemort with the image of the tyrant, and also with one of his primary interlocutors, Thrasymachus, when the sophist opens with “justice is nothing other than the advantage of the stronger.” The one thing Voldemort admires about The Boy Who Lived is his bravery, the trait they share. This trait is missing in his Death Eaters. In the fourth novel the Dark Lord is cruel to his reunited followers for abandoning him and losing faith; their cowardice reveals the fundamental logic of his power: his disciples are not true devotees, but opportunists, weak on their own merit and drawn like moths to every Avada Kedavra. Likewise, students flock to postmodern relativism to justify their own beliefs when the evidence is an obstacle.

Relativism gives us moral paralysis, letting darkness in. Another possible move after relativism is supremacy. One look at Richard Spencer’s Twitter demonstrates the incorrigible tenet of the alt-right: the alleged incompatibility of cultures, ethnicities, and races — that different groups of humans simply cannot get along. The Final Solution is not about extermination anymore but segregated nationalism. Spencer’s audience is almost entirely men who loathe the current state of things, who share far-reaching conspiracy theories, and who despise globalism.

The left, too, creates conspiracies, imagining a bourgeois corporate conglomerate that enlists economists and brainwashes through history books to normalize capitalism; for this reason it despises globalism as well, saying it impoverishes other countries or destroys cultural autonomy. For the alt-right, it is the Jews, and George Soros, who control us; for the burgeoning socialist left, it is the elites, the one percent. Our minds are not free; fortunately, each side will happily supply Übermenschen, in the form of statesmen or critical theorists, to save us from our degeneracy or our false consciousness.

Without the commitment to reasoned debate, tribalism has continued the polarization and the loss of humility. Each side also accepts science selectively, if it does not question its very justification. The privileged status that the “scientific method” maintains in polite society is denied when convenient; whether it is climate science, evolutionary psychology, sociology, genetics, biology, anatomy or, especially, economics: one side or the other rejects it outright, without studying the material enough to immerse itself in what could be promising knowledge (as Feyerabend urged, and as the breakdown of rationality could have encouraged). And ultimately, equal protection, one tenet of individualist thought that allows for multiplicity, is entirely rejected by both: we should be treated differently as humans, often because of the color of our skin.

Relativism and carelessness about standards and communication have given us supremacy and tribalism. They have divided rather than united. Voldemort’s chaotic violence is one possible outcome of rejecting reason as an institution, and it beckons to either political camp. Are there any examples in Harry Potter of the alternative, Feyerabendian tolerance? Not quite. However, Hermione Granger serves as the Dark Lord’s foil, and gives us a model of reason that is not as archaic as the enemies of rationality would like to suggest. In Against Method (1975), Feyerabend compares different ways rationality has been interpreted alongside practice: in an idealist way, in which reason “completely governs” research, or a naturalist way, in which reason is “completely determined by” research. Taking elements of each, he arrives at an interaction in which one can change the other, both “parts of a single dialectical process.”

“The suggestion can be illustrated by the relation between a map and the adventures of a person using it or by the relation between an artisan and his instruments. Originally maps were constructed as images of and guides to reality and so, presumably, was reason. But maps, like reason, contain idealizations (Hecataeus of Miletus, for example, imposed the general outlines of Anaximander’s cosmology on his account of the occupied world and represented continents by geometrical figures). The wanderer uses the map to find his way but he also corrects it as he proceeds, removing old idealizations and introducing new ones. Using the map no matter what will soon get him into trouble. But it is better to have maps than to proceed without them. In the same way, the example says, reason without the guidance of a practice will lead us astray while a practice is vastly improved by the addition of reason.” p. 233

Christopher Hitchens pointed out that Granger sounds like Bertrand Russell at times, as in this quote about the Resurrection Stone: “You can claim that anything is real if the only basis for believing in it is that nobody has proven it doesn’t exist.” Granger is often the embodiment of anemic analytic philosophy, the institution of order, a disciple of the Ministry of Magic. However, though initially law-abiding, she quickly learns with Potter and Weasley the pleasures of rule-breaking. From the first book onward, she is constantly at odds with the de facto norms of the school, becoming more rebellious as time goes on. It is her levelheaded foundation, combined with the ability to transgress rules, that gives her an astute semi-deontological, semi-utilitarian calculus capable of saving the lives of her friends from the dark arts, and helping to defeat the tyranny of Voldemort foretold by Socrates.

Granger presents a model of reason like Feyerabend’s map analogy. Although pure reason gives us an outline of how to think about things, it is not a static or complete blueprint, and it must be fleshed out with experience, risk-taking, discovery, failure, loss, trauma, pleasure, offense, criticism, and occasional transgressions past the foreseeable limits. Adding these elements to our heuristics means that we explore a more diverse account of thinking about things and moving around in the world.

When reason is increasingly seen as patriarchal, Western, and imperialist, the only thing consistently offered as a replacement is something like lived experience. Some form of this idea is at least a century old, going back to Husserl, and still modest by reason’s Greco-Roman standards. Yet lived experience has always been pivotal to reason; we only need adjust our popular model. And we can see that we need not reject one or the other entirely. Another critique of reason says it is foolhardy, limiting, antiquated; this is a perversion of its abilities, and plays into the first criticism. We can see that there is room within reason for other pursuits and virtues, picked up along the way.

The emphasis on lived experience, which predominantly comes from the political left, is also antithetical to the cause of “social progress.” Those sympathetic to social theory, particularly the cultural leakage of the strong programme, are constantly torn between claiming (a) that science is irrational, and can thus be countered by lived experience (or whatnot), or (b) that science may be rational but reason itself is a tool of patriarchy and white supremacy and cannot be universal. (If you haven’t seen either of these claims very frequently, and think them a strawman, you have not been following university protests and editorials. Or radical Twitter: ex., ex., ex., ex.) Of course, as in Freud, this is an example of kettle logic: the signal of a very strong resistance. We see, though, that we need not accept these claims, and that we lose nothing by denying them. Reason need not be stagnant nor all-pervasive, and indeed we’ve been critiquing its limits since 1781.

Outright denying the process of science — whether the model is conjectures and refutations or something less stale — ignores that there is no single uniform body of science. Denial also dismisses the most powerful tool for making difficult empirical decisions. Michael Brown’s death was instantly a political affair, with implications for broader social life. The event has completely changed the face of American social issues. The first autopsy report, from St. Louis County, indicated that Brown was shot at close range in the hand, during an encounter with Officer Darren Wilson. A second, independent report commissioned by the family concluded the first shot had not in fact been at close range. After the disagreement with my cousin, the Department of Justice released the final investigation report, which determined that material in the hand wound was consistent with gun residue from an up-close encounter.

Prior to the report, the best evidence available as to what happened in Missouri on August 9, 2014, was the ground footage after the shooting and the testimonies of the officer and Ferguson residents at the scene. There are two ways to approach the incident: reason or lived experience. The latter route leads to ambiguities. Brown’s friend Dorian Johnson and another witness reported that Officer Wilson fired his weapon first at range, under no threat, then pursued Brown out of his vehicle, until Brown turned with his hands in the air to surrender. However, before the St. Louis County grand jury, half a dozen (African-American) eyewitnesses corroborated Wilson’s account: that Brown did not have his hands raised and was moving toward Wilson. In which direction does “lived experience” tell us to go, then? A new moral maxim — the duty to believe people — will lead to no non-arbitrary conclusion. (And a duty to “always believe x,” where x is a closed group, e.g. victims, puts the cart before the horse.) It appears that, in a case like this, treating evidence as objective is the only solution.

Introducing ad hoc hypotheses, e.g., the Justice Department and the county examiner are corrupt, shifts the approach into one that uses induction, and leaves behind lived experience (and also ignores how forensic anthropology is actually done). This is the introduction of, indeed, scientific standards. (By looking at incentives for lying it might also employ findings from public choice theory, psychology, behavioral economics, etc.) So the personal experience method creates unresolvable ambiguities, and presumably will eventually grant some allowance to scientific procedure.

If we don’t posit a baseline rationality — Hermione Granger pre-Hogwarts — our ability to critique things at all disappears. Utterly rejecting science and reason, denying objective analysis on the presumption of overriding biases, breaking down naïve universalism into naïve relativism — these are paths to paralysis on their own. More than that, they are hysterical symptoms, because they often create problems out of thin air. Recently, a philosopher and a mathematician submitted a hoax paper, Sokal-style, to a peer-reviewed gender studies journal in an attempt to demonstrate what they see as a problem “at the heart of academic fields like gender studies.” The idea was to write a nonsensical, postmodernish essay, and if the journal accepted it, that would indicate the field is intellectually bankrupt. Andrew Smart at Psychology Today instead wrote of the prank: “In many ways this academic hoax validates many of postmodernism’s main arguments.” And although Smart makes some informed points about problems in scientific rigor as a whole, he doesn’t hint at what the validation of postmodernism entails: should we abandon standards in journalism and scholarly integrity? Is the whole process of peer review functionally untenable? Should we start embracing papers written without any intention of making sense, to look at knowledge concealed below the surface of jargon? The paper, “The conceptual penis,” doesn’t necessarily condemn the whole of gender studies; but, against Smart’s reasoning, we do in fact know that counterintuitive or highly heterodox theory is considered perfectly average there.

There were other attacks on the hoax, from Slate, Salon, and elsewhere. The criticisms, often valid for the particular essay, typically didn’t move the conversation far enough. There is much more to this discussion. A 2006 paper from the International Journal of Evidence Based Healthcare, “Deconstructing the evidence-based discourse in health sciences,” called the use of scientific evidence “fascist.” In the abstract the authors state their allegiance to the work of Deleuze and Guattari. Real Peer Review, a Twitter account that collects abstracts from scholarly articles, regularly features essays from departments of women’s and gender studies, including a recent one from a Ph.D. student wherein the author identifies as a hippopotamus. Sure, the recent hoax paper doesn’t really say anything, but it intensifies this much-needed debate. It brings out these two currents — reason and the rejection of reason — and demands a solution. And we know that lived experience is often going to be inconclusive.

Opening up lines of communication is a solution. One valid complaint is that gender studies seems too insulated, in a way in which chemistry, for instance, is not. Critiquing a whole field does ask us to genuinely immerse ourselves first, and this is a step toward tolerance: it is a step past the death of reason and the denial of science. It is a step that requires opening the bubble.

The modern infatuation with human biases, as well as Feyerabend’s epistemological anarchism, upsets our faith in prevailing theories and in the idea that our policies and opinions should be guided by the latest discoveries from an anonymous laboratory. Putting politics first and assuming subjectivity is all-encompassing, we move past objective measures to compare belief systems and theories. However, isn’t the whole operation of modern science designed to work within our means? Kant’s system set limits on human rationality, and most science is aligned with an acceptance of fallibility. As Harvard cognitive scientist Steven Pinker says, “to understand the world, we must cultivate work-arounds for our cognitive limitations, including skepticism, open debate, formal precision, and empirical tests, often requiring feats of ingenuity.”

Pinker goes so far as to advocate for scientism. Others need not; but we must understand an academic field before utterly rejecting it. We must think we can understand each other, and live with each other. We must think there is a baseline framework that allows permanent cross-cultural correspondence — a shared form of life which means a Ukrainian can interpret a Russian and a Cuban an American. The rejection of Homo sapiens commensurability, championed by people like Richard Spencer and those in identity politics, is a path to segregation and supremacy. We must reject Gorgian nihilism about communication, and the Presocratic relativism that traps our moral judgments in inert subjectivity. From one Weltanschauung to the next, our common humanity — which endures across class, ethnicity, sex, and gender — allows open debate across paradigms.

In the face of relativism, there is room for a nuanced middleground between Pinker’s scientism and the rising anti-science, anti-reason philosophy; Paul Feyerabend has sketched out a basic blueprint. Rather than condemning reason as a Hellenic germ of Western cultural supremacy, we need only adjust the theoretical model to incorporate the “new America of knowledge” into our critical faculty. It is the raison d’être of philosophers to present complicated things in a more digestible form; to “put everything before us,” so says Wittgenstein. Hopefully, people can reach their own conclusions, and embrace the communal human spirit as they do.

However, this may not be so convincing. It might be true that we have a competition of cosmologies: one that believes in reason and objectivity, one that thinks reason is callow and all things are subjective. These two perspectives may well be incommensurable. If I try to defend reason, I invariably must appeal to reasons, and thus argue circularly. If I try to claim “everything is subjective,” I make a universal statement and simultaneously contradict myself. Between begging the question and contradicting oneself, there is not much indication of where to go. Perhaps we just have to look at history, note the results of either course when it has been applied, and take that as a rhetorical indication of which path to choose.

Federalizing the Social Sciences

A few days ago I asked whether the social sciences could benefit from being unified. The post was not meant to make an argument in favor of or against unification, although I myself favor a form of unification. The post was merely me thinking out loud and asking for feedback from others. In this follow-up post I argue that the social sciences are already in the process of unification and that a better question is what type of unification this will be.


What is a social science?

First, though, allow me to define my terms, as commentator Irfan Khawaja suggested. By social sciences I mean those fields whose subjects are acting individuals. For the time being the social sciences deal with human beings, but I see no particular reason why artificial intelligence (e.g. robots in the mold of Isaac Asimov’s fiction) or other sentient beings (e.g. extraterrestrials) could not be studied under the social sciences.

The chief social sciences are:

Economics: The study of acting individuals in the marketplace.

Sociology: The study of acting individuals and the wider society they make up.

Anthropology: The study of the human race in particular.

Political Science: The study of acting individuals in political organizations.

There are of course other social sciences (e.g. Demography, Geography, Criminology) but I believe the above four are those with the strongest traditions and distinctive methodologies. Commentators are more than encouraged to propose their own listings.

In review, the social sciences study acting individuals. A social science (in the singular) is an intellectual tradition that has a differentiating methodology. Arguably the different social sciences are not sciences as much as they are different intellectual schools.


Why do I believe the social sciences will be unified? 

On paper the social sciences have boundaries among themselves.

In practice though, the boundaries between the social sciences blur quickly. Economists in particular are infamous for crossing the line, so much so that the term ‘economics imperialism’ has been coined to refer to the application of economic theory to non-market subjects. This imperialism has arguably been successful, with Economists winning the Nobel prize for applying their theory to sociology (Gary Becker), history (Douglass North, Robert Fogel), law (Ronald Coase), and political science (James M. Buchanan). The social sciences are in the process of being unified via economic imperialism.

Imperialism is a surprisingly proper term to describe the phenomenon taking place. Economists are applying their tools to subjects outside the marketplace, but little exchange is occurring in the other direction. As the “Superiority of Economists” paper discusses, the other social sciences read and cite economics journals, but the economics profession itself is very insular. The other social sciences are being treated as imperial subjects who must be taught by Economists how to conduct research in their own domains.

To an extent this reflects the fact that the economics profession managed to build a rigorous methodology that can be exported abroad and, with minimal changes, be used in new applications. I think the world is richer insofar as public choice theory has been exported to political science or price theory introduced to sociology. The problem lies in the fact that this exchange has been so unequal that the other social sciences are not taken seriously by Economists.

Sociologists, Political Scientists, and Anthropologists might have good ideas that economics could benefit from, but it is only through great difficulty that these ideas are even heard. It is harder still for these ideas to be adopted.


Towards Federalizing the Social Sciences

My answer to economic imperialism is to propose ‘federalizing’ the social sciences, that is to say, giving the social sciences a common set of methodologies so that they can better communicate with one another as equals while still specializing in their respective domains.

In practice this would mean reforming undergraduate education so that social science students take, at minimum, principles courses in each other’s fields before taking upper-division courses in their specializations. These classes would serve the dual purpose of providing a common language for communication and encouraging social interaction among the students. Hopefully social interaction with one another will cause students to respect the work of their peers and discourage any one field from creating a barrier around itself. A common language (in the sense of methodology) meanwhile should better allow students to read each other’s work without the barriers that jargon and other technical tools create. It is awful when a debate devolves into a semantics fight.

Supplementary methodologies will no doubt be introduced in upper-division and graduate study, reflecting the different needs that arise from specialization, but the common methodology learned early on should still form the basis.

The unification of the social sciences need not mean the elimination of specialization. I do however fear that unless some attempt is made at ‘federalizing’ the social sciences, we will see economics swallow up its sister sciences through imperialism.

As always I more than encourage thoughts from others and am all too happy to defer to better constructed opinions.

Could the social sciences benefit from being synthesized?

This past month a paper by Marion Fourcade, Etienne Ollion, and Yann Algan on the ‘Superiority of Economists‘ has made the rounds around the web. Our own Brandon has made note of it before. I have given the paper some thought and cannot help but wonder if the social sciences could not benefit from being synthesized into a unified discipline.

Some background: I have been studying economics for a little under half a decade now. By all means I’m a new-born chicken, but I have been around long enough to have grown a distaste for certain elements of the dismal science. In particular I am disturbed by the insular nature of economists; relatively few seem interested in dropping by the History or Political Science departments next door to see what they’re working on. I cannot help but feel this insularity will be economics’ undoing.

It should be no surprise that I hope to enter Caltech’s Social Science program for my PhD studies. The university is famed for its interdisciplinary nature and its social science program is no different. Its students are steeped in a core composed of microeconomics, statistics, and the other social sciences. For a while the New School in New York City offered a similar program.

I am sure there would be those who would object to synthesizing the social sciences into a unified discipline. Sociology and Economics might be more easily combined (as they were by folks such as Gary Becker) than Economics and Anthropology.

I am eager to hear others’ thoughts on this. Is the gap between the social sciences too large for them to be unified? Is unification even desirable? Should we content ourselves with an annual holiday dinner where we make fun of our common enemy?

Global Warming and Scholarly Conspiracies, etc. Part Two

In Part One of Scholarly Conspiracies, Scholarly Corruption and Global Warming, I drew on my own experience as a scholar to describe how the scientific enterprise can easily become corrupted for anodyne, innocent reasons, for reasons that are not especially cynical. I argued, of course, that this can especially happen in connection with such big, societal issues as climate change. I concluded that the findings of scientists do not, as a matter of principle, merit the quasi-religious status they are often granted. It follows from this that the Left’s attempt to stop any debate on the ground that science has spoken is grotesque.

I should have added in Part One that at different times in my career, I may have benefited from the kind of corruption I describe as well as having been hurt by it. Of course, one thing does not compensate for the other. Corruption is corruption; it constitutes more or less wide steps away from the truth whether I profit by it or whether it harms me. These things just add up; they don’t balance each other out.

Once you open your eyes, it’s not difficult to find gross derailments of the scientific enterprise. To be more precise, the transformation of limited scientific results into policy often gives rise to abuses. Sometimes, they are gross abuses verging on the criminal.

A recent book describes in detail how the slim results of 1950s studies that were obviously flawed both in their design and with respect to data collection were adopted by the American scientific establishment as policy. They resulted in a couple of generations of Americans being intellectually terrorized into adopting a restrictive, sad, unenjoyable diet that may even have undermined their health. The book is The Big Fat Surprise by Nina Teicholz.

For most of my adult life, I limited my own intake of meats because saturated fats were supposed to give me cardiac illness and, ultimately, heart attacks. I often thought something was fishy about the American Heart Association’s severity concerning saturated fats because of my frequent stays in France. There, I contemplated men of all ages feasting on pork chops fried in butter followed by five different kinds of cheese also eaten with butter. Then, they would have a post-prandial cigarette or two, of course. None of the men I knew exercised beyond walking to shop for pâtés, sausages, and croissants sweating butter (of course). Every time I checked – often – Frenchmen had a longer life expectancy than American men (right now, it’s two and a half years longer).

Yet, such was the strength of my confidence (of our confidence) in the official medical-scientific establishment that I bravely followed my stern semi-macrobiotic diet even while in France. In my fifties, I developed Type II diabetes. None of my four siblings who lived and ate in France did. I understand well the weakness of such anecdotal evidence. And I know I could have been the one of five who hit the wrong number in the genetic lottery. (That would have been the inheritance from my grandfather who died at 26 of a worse illness than diabetes – a German bullet, in his case.) Yet, if there are quite a few cases like mine where siblings constitute a natural control for genetic factors, it would seem worth investigating the possibility that a diet high in carbohydrates is an actual cause of what is often described as an “epidemic” of Type II diabetes. If there are many more cases than there were before the anti-fat campaign, controlling for age, something must have changed in American society. The diet low in saturated fats pretty much forced on us since the fifties could be that societal change.

I am not saying that it is. I am saying it’s worth investigating, with proper design and normal rules of data selection. I am not holding my breath. I think the scientific establishment will not turn itself around until its biggest honchos of the relevant period pass away. Teicholz’s book may turn out to have many defects because she is more a journalist than a scientist. I am awaiting with great attention the rebuttals from the scientific establishment, or – you never know – their apologies.

And then, there is the old story of how it took twenty years for the American Medical Association to change its recommendation on how to treat the common duodenal ulcer after an obscure Australian researcher showed that it was almost always caused by a bacterium. (The story was told about twenty years ago, in the Atlantic Monthly, I think I remember. You look it up.)

The de facto scientific establishment is not infallible but it usually wants to pretend that it is. It’s aided in its stubbornness by the religiously inspired passivity of ordinary people who were raised with misplaced all-around reverence for science and anything that appears, rightly or wrongly, “scientific.”

The climate change lobby, wrapped in a pseudo-scientific mantle, still thrives in several policy areas in spite of most Americans’ relative indifference to the issue. Two of its main assets are these: first, it is well served by irresponsible repetition of a simplified form of its message that amounts to constant, uncritical amplification; second, even well-educated people usually don’t pay a lot of attention to detail and don’t read critically because they are busy.

Now, I am not going to spend any time denouncing the myriad airheads with short skirts who add their own sage climate change commentary to their presentations of ordinary weather reports. (I am a man of vast culture; I listen to the same tripe in three different languages!) As I keep saying, I don’t beat on kindergartners. Let’s take National Geographic, instead, that justifiably respected monument to good information since 1888.

The October 2013 issue presents another striking photographic documentary intended to illustrate fast climate change. One of the photographic essays in the issue concerns, predictably, the alleged abnormal melting of glaciers. The talented photographer, James Balog, contributes his own completely superfluous, judgmental written commentary:

We know the climate is changing…. I never expected to see such huge changes in such a short period of time.

The guy is a photographer, for God’s sake! He has an undergraduate degree in communications. His credentials to pronounce on long-term climate change are…? Even the National Geographic, generally so careful about its assertions, couldn’t resist, couldn’t bring itself to tell him, “This is outside your area of competence, STFU!” Why not let the janitor also give his judgment in the pages of National Geographic? This is a free country after all. Most people simply don’t have the energy to notice thousands of such violations of good scientific practice.

Now to inattention, still with the venerated National Geographic. The September 2013 issue, entitled “Rising Seas,” presents a truly apocalyptic future in case global warming is not controlled. As is usually the case with N.G., the article is chock-full of facts from studies. The article is also tightly argued. N.G. is normally careful about what it asserts. To make things even clearer, it offers a graph on pp. 40–41 purporting to demonstrate a disastrous future for the earth starting very soon.

Being a leisurely retired man endowed with an unusually contrary personality, being furthermore well schooled in elementary data handling, I did the obvious with the graph, the obvious not one educated person in 10,000 would think of doing, or care to do. I took my desk ruler to the graph itself. Here is what I discovered:

Between 1880 and 2013, there was less than a one-foot rise in ocean levels, according to National Geographic. Of course, those 133 years cover the period of most rapid rise in the emission of alleged greenhouse gases. Imagine if National Geographic had run an article entitled:

“Less Than Foot-Rise in Ocean in Spite of More than 120 Years of Greenhouse Emissions”

Many citizens would respond by thinking that maybe, possibly there is global warming but it’s not an urgent problem. Let’s take our time looking into the phenomenon more carefully, they would say. Let’s try and eliminate alternative explanations to greenhouse gases if we find that there is indeed abnormal warming. After all, how much of a rush would I be in even if I were convinced that water rises in my basement by almost one tenth of an inch each year on the average?
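As a quick back-of-the-envelope check (a minimal sketch, assuming the full 1880–2013 span and taking “less than one foot” at face value), the implied average rate is:

```latex
\[
\frac{12~\text{inches}}{(2013 - 1880)~\text{years}}
  \;=\; \frac{12~\text{inches}}{133~\text{years}}
  \;\approx\; 0.09~\text{inches per year}
  \;\approx\; 2.3~\text{millimeters per year}.
\]
```

That is indeed just under the one tenth of an inch per year mentioned above.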

This is not an absurd mental exercise. The business of science is to try to falsify and falsify again. When you get interesting results, the scientific establishment (if not the individual scientist who authored the findings) is supposed to jump on them with both feet to see if they stand up. Instead, in connection with global warming, scientists have allowed the policy establishment and those in their midst who influence it to do exactly the reverse: if you see anything you like in a scientific study, try hard for more of the same. If you find something that contradicts your cause, bury it if you can, ignore it otherwise. You will get plenty of help in doing either.

Scientists have collectively become complicit in a massive anti-scientific endeavor with many religious features.

I am finally proofing the print copy of my book:

I Used to Be French: an Immature Autobiography.

Scholarly Conspiracies, Scholarly Corruption and Global Warming: Part One

97% of scientists, blah, blah…. Ridiculous, pathetic.

Thus challenged, some people I actually like throw reading assignments at me. Some are assignments in scholarly journals; some, sort of. Apparently, I have to keep my mouth shut until I reach a high degree of technical competence in climate science (or something). I don’t need to do these absurd assignments. I am not blind and I am not deaf. I see what I see; I hear what I hear; it all sounds familiar. Been there, done it!

A long time ago, I accepted a good job in France in urban planning after receiving my little BA in sociology from Stanford. I was a slightly older graduate and I had no illusions that I knew much of anything then. I had some clear concepts in my mind and I had learned the basics of the logic of scientific inquiry from old Prof. Joseph Berger and from Prof. Bernard Cohen. I had also done some reading in the “excerpts” department, including the trilogy of Max Weber, Emile Durkheim, and Karl Marx. Only a couple of weeks after I took my job, my boss sent me to a conference of urban sociologists in Paris. Having been intellectually spoiled by several years in the US and conscious of my limited knowledge of urban planning, I asked many questions, of course.

In the weeks following the meeting, I became aware of a rumor circulating that presented me as an impostor. This guy coming out of nowhere – the USA – cannot possibly have studied sociology because he does not know anything, French sociologists thought. I had to ask how the rumor started. I was aware that I knew little, but I did not think it was exactly “nothing.” Besides, most of my questions at the conference had not been answered in an intelligible manner, so I was not convinced that my comparison set – French sociologists working in city planning – knew much more than I did.

Soon afterward, I wrote a “white paper.” It was about the eastern region for which I had been tasked, as part of a multidisciplinary team, to plan the future until 2005 (the year was 1967). The white paper gave a list of social issues city planners had to face at this point, the starting point of the planning endeavor. As young men will do, I had allowed myself short flights of speculation in the white paper, flights I would not have indulged in a few years later. My direct supervisor, an older French woman who was supposed to be a sociologist, read the whole ambitious product, or said she had, and made no comments except one. She took exception to one of my speculative flights in which I made reference to the idea that much societal culture rises up from the street. It was almost an off-hand remark. Had that part been left out, the white paper would have been pretty much the same. The supervisor insisted I had to remove that comment because, she said exactly, “Marx asserts clearly that culture comes from the ruling class.” She told me she would not allow the white paper to be presented until I extirpated the offending statement.

In summary: the woman had nothing to say about the many parts of the report that were instrumental to the endeavor that our team was supposed to complete, about that for which she and I were explicitly being paid. She had nothing to say about the likely mistakes I exhibited in the report because of my short experience. Her self-defined role was strictly to protect what she took to be Marxist orthodoxy, even when it was irrelevant. There was a double irony there. First, the government that employed us was explicitly not in sympathy with any form of Marxism. The woman was engaging in petty sedition. Second, Karl Marx himself was no lover of orthodoxies. He would have abhorred her role. (Marx is said to have declared before his death, “I am not a Marxist”!)

In any event, I was soon rid of the ideological harridan and I was able to do my job after a fashion. For those who like closure: I went back to the US to attend graduate school, at Stanford again. Two years later, my old boss called me back. He had come up in the world. He was in charge of a big Paris metropolitan area urban research institute. He begged me, begged me on the phone, to go back to France and take charge of the institute’s sociology cell. He said that he understood not a word of what the “sociologists” there said to him. He added that I was the only sociologist he had ever understood. I yielded to his entreaties and I promised him a single year of my life. I interrupted my graduate studies and flew to Paris. In the event, I gave the sociologists at the institute one month’s warning. Then, I summoned each one of them to explain to me orally how his work contributed to Paris city and regional planning. (“What will it change about the way this is currently being done?” I asked.) They did not respond to my satisfaction and I fired all six of them. I replaced them with people who could keep their Marxism under control. My boss was grateful. I could have had a great career in France. I chose to return to my studies instead.

Three years later, having completed my doctorate, I found myself at a critical juncture common to all those who go that course. You have to turn your doctoral thesis into papers published in double-blind refereed journals. (Here is what this means: “What’s Peer Review and Why It Matters“)

That’s a lot like leaving kindergarten: no more cozy relationships, no more friends assuring you that your work is just wonderful; the real world hits you in the face. The review process in good journals is often downright brutal. Anyone who does not feel a little vulnerable at that point is probably also a little silly. To make matters worse, the more respected the journal, the harder it is to get in and the better your academic career. As a rule, if you have not achieved publication in a first-rate journal in the first three or four years after completing your doctorate, you will be consigned forever to second-tier universities or worse.

Be patient, I am just setting the stage for what’s coming.

Much of my early scholarly work happened to take place within a school of research dominated by “neo-Marxists.” It was not my choice. I was interested in problems of economic development that happened to be largely in the hands of those people. My choice was between abandoning my interests or buckling up and taking my chances. I buckled up, of course. My first article to be published was innovative but a little esoteric. (Delacroix, Jacques. “The permeability of information boundaries and economic growth: a cross-national study.” Studies in Comparative International Development 12(1): 3-28. 1977.) I submitted it to a specialized journal, and therefore not one that could be called “first tier.” It happened to contain nothing that would offend the neo-Marxists. It took less than six months to have it accepted for publication.

The second published paper out of my dissertation struck at the heart of neo-Marxist convictions. It demonstrated – using their methods – that the parlous condition of the Third World – allegedly caused by capitalist exploitation – could be remedied through one aspect of ordinary good governance. I submitted it to one of the two most respected journals (the American Sociological Review). All the reviewers who had the technical skills to review my submission were also neo-Marxists or sympathetic to their doctrine. The paper reported on a study conducted according to methods that were by then common. Having the paper accepted for publication took more than three years. It also took a rare personal intervention by the journal’s editor, whom I somehow managed to convince that the reviewers he had chosen were acting unreasonably. (The paper: Delacroix, Jacques. “The export of raw materials and economic growth: a cross-national study.” American Sociological Review 42: 795-808. 1977.) No need to read either paper.

Am I telling you here a story of conspiracy or a story of academic corruption? Yes, I faced a conspiracy, but it was not a conspiracy against me personally and it was mostly not conscious. The only people – besides me – who had the skills to pass judgment on my paper were not numerous. They were a small group that shared a common understanding of the reality of the world. It was not a cold, cerebral understanding. Those people formed a community of sentiment. They believed their work would contribute to the righting of a worldwide injustice, a “global” injustice committed against the defenseless people of underdeveloped countries. Is it possible that their ethical faith influenced their judgment? To ask the question is to answer it, I think. Did their faith induce them to close their eyes when others from their own camp cut some research corners here and there? On the contrary, were their eyes wide open when they were reviewing for a journal a submission whose conclusion impaired their representation of the world? In that situation, did they overreact to an uncrossed “t” or an undotted “i” in a paper that undermined their beliefs? Might be. Could be. Probably was. Other things being equal, they may have just thought, it would be better if these annoying Delacroix findings were not publicized in a prime journal. Delacroix could always try elsewhere anyway.

So, yes, I faced corruption. It was not conscious, above-board corruption. It was not cynical. It was a corruption of blindness, much of it deliberate blindness. The blindness was all the more sturdy because it was seldom called into question. Those who would have cared did not understand the relevant techniques. Those who knew them shared in the blindness. This is a long way from cynical, deliberate lying. It’s just as destructive though. And it’s not only destructive for the lives of the likes of me who don’t belong to the relevant tribe. It’s destructive of what ordinary people think of as the truth. That is so because – however unlikely that sounds – the productions of elite and abstruse journals usually find their way into textbooks, even if it takes twenty years.

Are the all-powerful editors of important journals part of the conspiracy? Mine were not, but they tended to adhere to imperfect rules of behavior that made them objective accomplices of conspiracies. Here is the proof that the editor of the particular journal tried to be impartial. Only a month after he accepted my dissenting paper, the editor assigned me to review a submission from the same neo-Marxist school of thought that trumpeted another empirical finding proving that, blah, blah…. After one reading of the paper, my intuition smelled a rat. I spent days in the basement of the university library, literally days, taking apart the empirical foundation of the paper. I found the rat deep in its bowels. To put it briefly, if you switched a little thing from one category to another, all the conclusions were reversed. There was no imperative argument to put that one thing in one category rather than in the other. The author had chosen the coding that put his labor of love in line with the love of his neo-Marxist cozy-buddies. If he had not done it, his pluses would have become minuses, his professional success anathema. In the event, the editor agreed with my critique and dinged the paper for good. Nothing worse happened to the author. No one could tell whether he was a cheat. Or, no one would. No one was eager to. The editor had no appetite for a fight. He let the whole matter go.
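To make the kind of check described above concrete, here is a minimal, purely hypothetical sketch in Python (the data, the 0/1 coding, and the variable names are invented for illustration and are not the actual 1970s study): re-run the analysis with the single contested case recoded and see whether the key coefficient keeps its sign.

```python
# Hypothetical robustness check: does one contested coding decision flip the result?
# The numbers below are synthetic, chosen only to illustrate the mechanism.
import numpy as np

def fit_slope(x, y):
    """Ordinary least squares slope of y on x, with an intercept term."""
    X = np.column_stack([np.ones_like(x), x])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef[1]

# Toy cross-national data: x is a 0/1 coding (say, "raw-material exporter"),
# y is some growth measure. The country at index 0 could defensibly be coded
# either way.
x = np.array([1.0, 1, 1, 1, 0, 0, 0, 0])
y = np.array([5.0, -0.2, -0.1, -0.3, 0.0, 0.1, -0.1, 0.2])

slope_original = fit_slope(x, y)

x_recoded = x.copy()
x_recoded[0] = 0.0            # flip the one contested coding decision
slope_recoded = fit_slope(x_recoded, y)

print(f"slope with original coding: {slope_original:+.2f}")
print(f"slope with case 0 recoded:  {slope_recoded:+.2f}")
# If the signs differ, the paper's headline conclusion hinges on a single
# judgment call rather than on the bulk of the evidence.
```

With these made-up numbers the sign does flip, which is the sort of fragility described above.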

Myself, I came out of this experience convinced that it was likely no one else in the whole wide world had both the skills and the motivation to dive into the depths of the paper to find that rat. It’s likely that no one else would have smelled a rat. It’s possible that if I had not still been smarting from three years of rejection of my own work, I would not have smelled the rat myself. The editor had the smarts – the intuition fed by experience, I would say – to put to work my unique positioning, my combination of competence and contrariness. He put it to work in defense of the truth. That fact is enough to exonerate him from complicity in the conspiracy I described. To answer my own question: do I think that powerful scientific journal editors are often part of a conspiracy of the right-thinking, of an orthodox cabal? I think not. Do they sometimes or often fall for one? Yes.

For those who like closure: My interests switched later to other topics. (See vita, linked to this blog’s “About me.”) I think the neo-Marxist school of thought to which I refer above gradually sank into irrelevance.

After that experience, and several others of the same kind, do I have something better to propose? I don’t, but I think the current system of scholarly publication does not deserve anything close to religious reverence. Even if there were anything close to a “consensus” of scientists on anything, that should not mean that the book is closed. Individual rationalism also matters. It matters more, in my book.

What does this story of reminiscences have to do with global warming, climate change, climate disruption, you might ask? Everything, I would say. More on the connection in Part Two. [Update: Here is part 2, as promised! – BC]