The State in education – Part III: Institutionalization of learning

In The State in education – Part II: social warfare, we looked at the promise of state-sponsored education and its failure, both socially and as a purveyor of knowledge. The next step is to examine the university, especially since higher education is deeply linked to modern society and because the public school system purports to prepare its students for college.

First, though, a little history of higher education in the West is in order for context, since Nietzsche assumed everyone knew it when he made his remarks in Anti-Education. The university as an abstract concept dates to Aristotle and his Peripatetic School. Following his stint as Alexander the Great’s tutor, Aristotle returned to Athens and opened a school at the Lyceum (Λύκειον) Temple. There, for a fee, he provided the young men of Athens with the same education he had given Alexander. On a side note, this is also a beautiful example of capitalist equality: a royal education was available to all in a mutually beneficial exchange; Aristotle made a living, and the Athenians received brains.

The Lyceum was not a degree-granting institution; only by a man’s knowledge of philosophy, history, literature, and language, and by his debating skills, could one tell that he had studied there. A cultural premium on bragging rights soon followed, though, and famous philosophers opening immensely popular schools became de rigueur. By the rise of Roman imperium in the Mediterranean around 250 BC, Hellenic writers included their intellectual pedigrees, i.e. all the famous teachers they had studied with, in their introductions as a credibility passport. The Romans were avid Hellenophiles and adopted everything Greek wholesale, including the concept of the lyceum-university.

Following the Dark Ages (and not getting into the debate over whether the time was truly dark or not), the modern university emerged in 1088, with the Università di Bologna. It was more of a club than an institution; as Robert S. Rait, the early-20th-century medieval historian, remarked in his book Life in the Medieval University, the original meaning of “university” was “association,” and it was not used exclusively for education. The main attractions of the university as a concept were that it was secular and that it provided access to books, which were prohibitively expensive at the individual level before the printing press. An examination of the profiles of Swedish foreign students enrolled at Leipzig University between 1409 and 1520 shows that the average male student was either destined for the clergy on a prelate track or of noble extraction. As the article points out, none of the students who later joined the knighthood formally graduated, but the student body is indicative of the associative nature of the university.

The example of Lady Elena Lucrezia Cornaro Piscopia, the first woman to receive a doctoral degree, awarded by the University of Padua in 1678, illuminates the difference between “university” in its original intent and the institutional concept. Cornaro wrote her thesis independently, taking the doctoral exams and defending her work when she and her advisor felt she was ready. No enrollment or attendance at classes was necessary; indeed, they were deemed so unnecessary that she skipped both the bachelor’s and master’s stages. What mattered was that a candidate knew the subject, not the method of acquisition. Even by the mid-19th century, this particular path remained open to remarkable scholars such as Nietzsche: Leipzig University awarded him his doctorate on the basis of his published articles rather than a dissertation and defense.

Education’s institutionalization, i.e. the focus shifting from knowledge to “the experience,” accompanied a broader societal shift. Nietzsche noted in Beyond Good and Evil that humans have an inherent need for boundaries, and systemic education played a very prominent role in contemporary man’s processing of that need:

There is an instinct for rank which, more than anything else, is a sign of a high rank; there is a delight in the nuances of reverence that allows us to infer noble origins and habits. The refinement, graciousness, and height of a soul is dangerously tested when something of the first rank passes by without being as yet protected by the shudders of authority against obtrusive efforts and ineptitudes – something that goes its way unmarked, undiscovered, tempting, perhaps capriciously concealed and disguised, like a living touchstone. […] Much is gained once the feeling has finally been cultivated in the masses (among the shallow and in the high-speed intestines of every kind) that they are not to touch everything; that there are holy experiences before which they have to take off their shoes and keep away their unclean hands – this is almost their greatest advance toward humanity. Conversely, perhaps there is nothing about so-called educated people and believers in “modern ideas” that is as nauseous as their lack of modesty and the comfortable insolence in their eyes and hands with which they touch, lick, and finger everything [….] (“What is Noble,” 263)

The idea the philosopher pursued was that university attendance conveyed the future right to “touch, lick, and finger everything” – a very graphic and curmudgeonly way of saying that a certain demographic assumed unjustified airs.

Given that in Anti-Education Nietzsche lamented the fragmentation of learning into individual disciplines, which caused students to lose a sense of the wholeness, the universality, of knowledge, what he hated in the nouveau educated, if we will, was the rise of the pseudo-expert – a person whose knowledge was confined to the bounds of a fixed field but who was revered as omniscient. The applicability of Socrates’ dialogue with Meno – the one where teacher and student discuss the human tendency to lose sight of the whole in pursuit of individual strands – was unmistakable, something which Nietzsche, a passionate classicist, noticed. The loss of the Renaissance learning model, the trivium and the quadrivium, both of which emphasized an integrated learning matrix, carried with it a belief that excessive specialization was positive; it was a very perverse version of “jack of all trades, master of none.” As Nietzsche bemoaned, the newly educated desired masters without realizing that all they obtained were jacks. In this, he foreshadowed the disaster of the Versailles Treaty in 1919 and the consequences of Woodrow Wilson’s unwholesome belief in “experts.”

The philosopher squarely blamed the model of the realschule, with its clear-cut subjects and predictable exams, for the breakdown between knowledge acquisition and learning. While he excoriated the Prussian government for basing all public education on the realschule, he admitted that the fragmentation of the university into departments and majors occurred at the will of the people. This was a “chicken or the egg” situation: Was the state or broader society responsible for university learning becoming more like high school? This was not a question Nietzsche was interested in answering since he cared more about consequences. However, he did believe that the root was admitting realschule people to university in the first place. Since such a hypothesis is very applicable today, we will examine it in the contemporary American context next.

Inventions that didn’t change the world

Have you ever learned about an amazing invention–whether it was the Baghdad battery or the ancient Roman steam engine or Chinese firecrackers–and wondered why it didn’t do more to change the world? In this podcast, we examine a selection of curiosities and explore hypotheses for why their inventors didn’t use them to full effect.

We move VERY quickly through a range of fascinating examples and hypotheses, and therefore leave a lot up to discussion. We hope to see your thoughts, feedback, and additions in the comments section!

For any invention that you want to learn more about, see the links below:

Knossos’ toilets

In the 2nd millennium BC, a “palace” at Knossos (now thought to be a building that served as an administrative, trade, and gathering hub) had flushing toilets with running water. Much like the Roman Cloaca Maxima, this was likely a HUGE public-health benefit, but it basically died out. Does this show that military protection/staving off the “Dark Ages” was the only way to maintain amazing inventions?

Link: http://www.nature.com/news/the-secret-history-of-ancient-toilets-1.19960;

The Nimrud lens

Whether it was a fire-starter, a magnifying glass, or (for some overeager astronomy enthusiasts) part of a telescope, the Neo-Assyrian ground-crystal Nimrud lens is an invention thousands of years out of place. While the Egyptians, Greeks, and Romans all used lenses of different sorts, and glass-blowing was certainly popular by the 1st century BC in Roman Egypt, no glass lenses were made until the Middle Ages, and the potential scientific and engineering uses of lenses–which can hardly be overstated even in their 16th-to-18th-century applications–had to wait another couple of millennia. Many devices like the Baghdad battery and the Antikythera mechanism are heralded for their possible engineering genius, but this seems like a simple one with readily available applications that disappeared from the historical record.

https://en.wikipedia.org/wiki/Nimrud_lens

Hero of Alexandria’s steam engine

In the 1st century AD, Hero was a master of simple machines (which were mostly used for plays) and also invented a force pump, a wind-powered machine, and even an early vending machine. However, he is likely most famous for his aeolipile, a rotating steam engine that used heated water to spin an axle. The best-attested uses of it were religious devotion and party tricks.

https://en.wikipedia.org/wiki/Aeolipile

The ancient mechanical reaper

Ancient Gallo-Romans (or just Gauls) invented a novel way of harvesting grain: rather than using sickles or scythes, they used a mechanical reaper, 1,700 years before Cyrus McCormick more than tripled the productivity of American farmers. This antiquated device literally put the cart before the oxen and required two men to operate: one to drive the beasts, and another to knock the ears off the stalks (this reaper was obviously far less sophisticated than McCormick’s). The invention did not survive the Völkerwanderung period.

http://www.gnrtr.com/Generator.html?pi=208&cp=3

http://reapertakethewheel.blogspot.com/2013/03/impacts-of-invention.html

Note: the horse collar (which allowed horses to be used to plow) was invented in 1600-1400 BC in China AND the Levant, but was not applied widely until 1000 AD in Europe. https://en.wikipedia.org/wiki/Horse_collar.

Inoculation

Madhav, an Indian doctor, compiled hundreds of cures in his Nidana, including an inoculation against smallpox that showed an understanding of disease transmission (he would take year-old smallpox-infected flesh and touch it to a recently made cutaneous wound). However, the next 13 centuries saw neither an Indian medical understanding of viruses or bacteria nor even widespread copying of the technique. https://books.google.com/books?id=Hkc3QnbagK4C&pg=PA105&lpg=PA105&dq=madhav+indian+smallpox+inoculation&source=bl&ots=4RFPuvbf5Y&sig=iyDaNUs4u5N7xHH6-pvlbAY9fcQ&hl=en&sa=X&ved=0ahUKEwic8e-1-JXVAhUp6IMKHfw3DLsQ6AEIOjAD#v=onepage&q=madhav%20indian%20smallpox%20inoculation&f=false

At least, thank god, their methods of giving nose jobs to those who had had their noses cut off as a punishment survived: https://en.wikipedia.org/wiki/History_of_rhinoplasty

The Chinese:

List of all Chinese inventions:

https://en.wikipedia.org/wiki/List_of_Chinese_inventions#Four_Great_Inventions

Gunpowder

Gunpowder was discovered by Chinese alchemists searching for the elixir of life (irony, no?).

https://www.thoughtco.com/invention-of-gunpowder-195160

https://en.wikipedia.org/wiki/Four_Great_Inventions

(maybe a good corollary would be Greek fire, which was used effectively in naval warfare by the Byzantines, but which was not improved upon and the recipe of which is still secret: https://en.wikipedia.org/wiki/Greek_fire)

Printing

The Chinese invented the printing press possibly as early as the 6th century. However, unlike the explosion of literacy seen in much of Europe (particularly Protestant Europe–see our last podcast), the Chinese masses never learned to read. In fact, in 1950 fewer than 20% of Chinese citizens were literate. Compare this to Europe, where some societies saw literacy rates as high as 90% (the male population of Sweden) within a few centuries of the introduction of the printing press. Why? There may be several reasons–cultural, religious, political–but in our opinion, it would have to be the characters: 100,000 blocks were needed to create a single set.

http://www.nytimes.com/2001/02/12/news/chinas-long-but-uneven-march-to-literacy.html

https://en.wikipedia.org/wiki/History_of_printing_in_East_Asia

They also invented pulped paper by the 2nd century BC: https://en.wikipedia.org/wiki/List_of_Chinese_inventions.

The compass

Invented by 200 BC for divination and used for navigation by the Song dynasty; despite this, and despite colonizable islands within easy sailing distance, the Chinese did not colonize Indonesia, Polynesia, or Oceania, while the Europeans did so within a century of developing the technology and first sailing there.

https://en.wikipedia.org/wiki/History_of_the_compass.

The rudder

While the Chinese did not invent the rudder itself, they invented the “medial, axial, and vertical” sternpost rudder in the 1st century AD, almost 1,000 years before it became standard in Europe (11th century).

Natural gas

The Chinese discovered “fire wells” (natural gas near the surface) and erected shrines to worship there.

https://link.springer.com/referenceworkentry/10.1007%2F978-1-4020-4425-0_9568

They even understood its potential as a fuel, but never developed it beyond primitive burning and bamboo piping, despite having advanced mining techniques for it by the 1st century BC.

Chinese miscellany:

Hydraulic powered fan: https://en.wikipedia.org/wiki/Fan_(machine)#History

Cupola furnace for smelting and molding iron: https://en.wikipedia.org/wiki/Cupola_furnace.

Coke as a fuel source: https://en.wikipedia.org/wiki/Coke_(fuel).

Belt-drive spinning wheel: https://en.wikipedia.org/wiki/Spinning_wheel.

The pre-Columbian wheel

The pre- and early Mayans had toys that utilized primitive wheels, but they did not use wheels for any labor-saving purpose (even their gods were depicted carrying loads on their backs). This may have been because scaling up met with mechanical difficulties, but the potential utility of the wheel, with a bit of investment, sat unrealized for centuries.

https://tcmam.wordpress.com/2010/11/11/did-pre-columbian-mesoamericans-use-wheels/

The Tucker:

http://www.smithsonianmag.com/history/the-tucker-was-the-1940s-car-of-the-future-135008742/

The following book contained some of our hypotheses:

https://books.google.com/books?id=ynejM1-TATMC&pg=PA399&lpg=PA399&dq=roman+and+greek+labor-saving+devices&source=bl&ots=BI6GVGTrxC&sig=8ZJqirOVUyjH7TNq0fcW6UUPn1k&hl=en&sa=X&ved=0ahUKEwj55O7395XVAhVqwYMKHSb2Dy4Q6AEIKTAB#v=onepage&q=roman%20and%20greek%20labor-saving%20devices&f=false

 

The rest of our hypotheses were amalgamated from our disparate classes in economics and history, but none of them are our own or uncommon in academic circles. Thanks for listening!

The death of reason

“In so far as their only recourse to that world is through what they see and do, we may want to say that after a revolution scientists are responding to a different world.”

Thomas Kuhn, The Structure of Scientific Revolutions p. 111

I can remember arguing with my cousin right after Michael Brown was shot. “It’s still unclear what happened,” I said, “based solely on testimony” — at that point, we were still waiting on the federal autopsy report by the Department of Justice. He said that in the video, you can clearly see Brown, back to the officer and with his hands up, as he is shot up to eight times.

My cousin doesn’t like police. I’m more ambivalent, but I’ve studied criminal justice for a few years now, and I thought that if both of us watched this video (no such video actually existed), it was probably I who would have the more nuanced grasp of what happened. So I said: “Well, I will look up this video, try and get a less biased take and get back to you.” He replied, sarcastically, “You can’t watch it without bias. We all have biases.”

And that seems to be the sentiment of the times: bias encompasses the human experience; it subsumes all judgments and perceptions. Biases are so rampant, in fact, that no objective analysis is possible. These biases may be cognitive, like confirmation bias, emotional fallacies, or the phenomenon of constructive memory; or inductive, like selectivity or ignoring base rates; or, as has been common to think, ingrained into experience itself.

The thing about biases is that they are open to psychological evaluation. There are precedents for eliminating them. For instance, one common explanation of racism is that familiarity breeds acceptance and unfamiliarity breeds intolerance (as Reason points out, people farther from fracking sites have more negative opinions of the practice than people closer to them). So to curb racism (a sort of bias), children should interact with people outside their singular ethnic group. More clinical methodology seeks to transform mental functions from automatic to controlled, thereby introducing reflective measures into perception and reducing bias. Apart from these, there is that ancient Greek practice of reasoning, wherein patterns and evidence are used to generate logical conclusions.

If it were true that human bias is all-encompassing, and essentially insurmountable, the whole concept of critical thinking would go out the window. Not only would we lose the critical-rationalist, Popperian mode of discovery, but also the Socratic dialectic, as essentially “higher truths” disappear from the human lexicon.

The belief that biases are intrinsic to human judgment ignores psychological or philosophical methods to counter prejudice because it posits that objectivity itself is impossible. This viewpoint has been associated with “postmodern” schools of philosophy, such as those Dr. Rosi commented on (e.g., those of Derrida, Lacan, Foucault, Butler), although it’s worth pointing out that the analytic tradition, with its origins in Frege, Russell, and Moore, represents a far greater break from the previous, modern tradition of Descartes and Kant, and often reached similar conclusions to the Continentals.

Although theorists of the “postmodern” clique produced diverse claims about knowledge, society, and politics, the most famous figures are almost always associated with or incorporated into the political left. To make a useful simplification of viewpoints: progressives have generally accepted Butlerian non-essentialism about gender and Foucauldian terminology (discourse and institutions). Derrida’s poststructuralist critique noted dichotomies and claimed that the philosophical search for Logos has been patriarchal, almost neoreactionary. (The month before Donald Trump’s victory, the word “patriarchy” hit an all-time high in Google searches.) It is not a far-right conspiracy theory that European philosophers with strange theories have influenced and sought to influence American society; it is patent in the new political language.

Some people think of the postmodernists as all social constructivists, holding the theory that many of the categories and identifications we use in the world are social constructs without a human-independent nature (e.g., not natural kinds). Disciplines like anthropology and sociology have long since dipped their toes in these waters, and the broader academic community, too, accepts that things like gender and race are social constructs. But the ideas can and do go further: “facts” themselves are open to interpretation on this view; to even assert a “fact” is just to affirm power of some sort. This worldview subsequently degrades the status of science into an extended apparatus for confirmation bias, filling out the details of a committed ideology rather than providing us with new facts about the world. There can be no objectivity outside of a worldview.

Even though philosophy took a naturalistic turn with the philosopher W. V. O. Quine, seeing itself as integrating with and working alongside science, the criticisms of science as an establishment that emerged in the 1950s and 60s (and earlier) often disturbed its unique epistemic privilege in society: ideas that theory is underdetermined by evidence, that scientific progress is nonrational, that unconfirmed auxiliary hypotheses are required to conduct experiments and form theories, and that social norms play a large role in the process of justification all damaged the mythos of science as an exemplar of human rationality.

But once we have dismantled Science, what do we do next? Some critics have held up Nazi German eugenics and phrenology as examples of the damage that science can do to society (never mind that we now consider them pseudoscience). Yet Lysenkoism and the history of astronomy and cosmology indicate that suppressing scientific discovery can be just as deleterious. The Austrian physicist and philosopher Paul Feyerabend instead wanted a free society — one where science had equal power with older, more spiritual forms of knowledge. He thought the model of rational science exemplified by Sir Karl Popper was inapplicable to the real machinery of scientific discovery, and that the only methodological rule we could impose on science was: “anything goes.”

Feyerabend’s views are almost a caricature of postmodernism, although he denied the label “relativist,” opting instead for “philosophical Dadaist.” In his pluralism, there is no hierarchy of knowledge, and state power can even be introduced when necessary to break up scientific monopoly. Feyerabend, contra scientists like Richard Dawkins, thought that science was like an organized religion and therefore supported a separation of church and state as well as a separation of state and science. Here is a way forward for a society that has started distrusting the scientific method… but if this is what we should do post-science, it’s still unclear how to proceed. There are still open questions for anyone who loathes the hegemony of science in the Western world.

For example, how does the investigation of crimes proceed without strict adherence to the latest scientific protocol? Presumably, Feyerabend didn’t want to privatize law enforcement, but science and the state are very intricately connected. In 2005, Congress authorized the National Academy of Sciences to form a committee and conduct a comprehensive study of contemporary legal science to identify community needs, consulting laboratory executives, medical examiners, coroners, anthropologists, entomologists, odontologists, and various legal experts. Forensic science — scientific procedure applied to the field of law — exists for two practical goals: exoneration and prosecution. However, the Forensic Science Committee revealed that severe issues riddle forensics (e.g., bite mark analysis), and at the top of its list of recommendations is establishing an independent federal entity to devise consistent standards and enforce regular practice.

For top scientists, this sort of centralized authority seems necessary to produce reliable work, and it is entirely at odds with Feyerabend’s emphasis on methodological pluralism. Barack Obama formed the National Commission on Forensic Science in 2013 to further investigate problems in the field, and only recently Attorney General Jeff Sessions said the Department of Justice would not renew the commission. It’s unclear now what forensic science will do to resolve its ongoing problems, but what is clear is that the American court system would fall apart without the possibility of appealing to scientific consensus (especially in forensics), and that the only foreseeable way to solve the existing issues is through stricter methodology. (Just as with McDonald’s, there are enforced standards so that the product is consistent wherever one orders.) More on this later.

So it doesn’t seem to be in the interest of things like due process to abandon science or completely separate it from state power. (It does, however, make sense to move forensic laboratories out from under direct administrative control, as the NAS report notes in Recommendation 4; this is specifically to reduce bias.) In a culture where science is viewed as irrational, Eurocentric, ad hoc, and polluted with ideological motivations — or where Reason itself is seen as a particular hegemonic, imperial device to suppress different cultures — not only do we not know what to do; when we try to do things, we lose elements of our civilization that everyone agrees are valuable.

Although Aristotle separated pathos, ethos, and logos (adding that all inform each other), later philosophers like Feyerabend thought of reason as a sort of “practice,” with a history and connotations like any other human activity, falling far short of the sublime. One could no more justify reason outside of its European cosmology than the sacrificial rituals of the Aztecs outside of theirs. To communicate across paradigms, participants have to understand each other on a deep level, even becoming entirely new persons. When debates happen, they must happen on a principle of mutual respect and curiosity.

From this one can detect a bold argument for tolerance. Indeed, Feyerabend was heavily influenced by John Stuart Mill’s On Liberty. Maybe, in a world disillusioned with scientism and objective standards, the next cultural move is multilateral acceptance and tolerance for each other’s ideas.

This has not been the result of postmodern revelations, though. The 2016 election featured the victory of one psychopath over another, from two camps utterly consumed with vitriol for each other. Between Bernie Sanders, Donald Trump, and Hillary Clinton, Americans drifted toward radicalization as the only establishment candidate seemed to offer the same noxious, warmongering mess of the previous few decades of administration. Politics has only polarized further since the inauguration. The alt-right, a nearly perfect symbol of cultural intolerance, is regular news for the mainstream media. Trump acolytes physically brawl with black bloc Antifa in the same city that hosted the 1960s Free Speech Movement. It seems to be worst at universities. Analytic feminist philosophers asked for the retraction of a controversial paper, seemingly without reading it. Professors even get involved in student disputes, at Berkeley and more recently at Evergreen. The names each side uses to attack the other (“fascist,” most prominently) — sometimes accurate, usually not — display a political divide between groups that increasingly refuse to argue their own side and prefer silencing their opposition.

There is no longer a tolerant left or a tolerant right in the mainstream. We are witnessing only shades of authoritarianism, eager to destroy each other. And what is obvious is that the theories and tools of the postmodernists (post-structuralism, social constructivism, deconstruction, critical theory, relativism) are as useful for reactionary praxis as they are in their usual role in left-wing circles. Says Casey Williams in the New York Times: “Trump’s playbook should be familiar to any student of critical theory and philosophy. It often feels like Trump has stolen our ideas and weaponized them.” The idea of the “post-truth” world originated in postmodern academia. It is the monster turning against Doctor Frankenstein.

Moral (cultural) relativism in particular promises only the rejection of our shared humanity. It paralyzes our judgment on female genital mutilation, flogging, stoning, human and animal sacrifice, honor killings, caste, and the underground sex trade. The afterbirth of Protagoras, cruelly resurrected once again, does not promise trials at Nuremberg, where the Allied powers appealed to something above and beyond written law to exact judgment on mass murderers. It does not promise justice for the ethnic cleansers of Srebrenica, as the United Nations is helpless to impose a tribunal from outside Bosnia-Herzegovina. Today, this moral pessimism laughs at the phrase “humanitarian crisis,” and at Western efforts to change the material conditions of fleeing Iraqis, Afghans, Libyans, Syrians, Venezuelans, North Koreans…

In the absence of universal morality, and with the introduction of subjective reality, the vacuum will be filled with something much more awful. And we should be afraid of this, because tolerance has not emerged as a replacement. When Harry Potter first encounters Voldemort face-to-scalp, the Dark Lord tells the boy, “There is no good and evil. There is only power… and those too weak to seek it.” With the breakdown of concrete moral categories, Feyerabend’s motto — anything goes — is perverted. Voldemort has been compared to Plato’s archetype of the tyrant from the Republic: “It will commit any foul murder, and there is no food it refuses to eat. In a word, it omits no act of folly or shamelessness” … “he is purged of self-discipline and is filled with self-imposed madness.”

Voldemort is the Platonic appetite in the same way he is the psychoanalytic id. Freud’s das Es is able to admit of contradictions, to violate Aristotle’s fundamental laws of logic. It is so base, so removed from the ordinary world of reason, that it follows its own rules, rules we would find utterly abhorrent or impossible. But it is not difficult to imagine that the murder of evidence-based reasoning will result in Death Eater politics. The ego is our rational faculty, adapted to deal with reality; with the death of reason, all that exists is vicious criticism and unfettered libertinism.

Plato predicts Voldemort with the image of the tyrant, and also with one of his primary interlocutors, Thrasymachus, when the sophist opens with “justice is nothing other than the advantage of the stronger.” The one thing Voldemort admires about The Boy Who Lived is his bravery, the one trait they share. This trait is missing in his Death Eaters. In the fourth novel the Dark Lord is cruel to his reunited followers for abandoning him and losing faith; their cowardice reveals the fundamental logic of his power: his disciples are not true devotees but opportunists, weak on their own merits and drawn like moths to every Avada Kedavra. Likewise, students flock to postmodern relativism to justify their own beliefs when the evidence is an obstacle.

Relativism gives us moral paralysis, letting in darkness. Another possible move after relativism is supremacy. One look at Richard Spencer’s Twitter demonstrates the incorrigible tenet of the alt-right: the alleged incompatibility of cultures, ethnicities, and races — that different groups of humans simply cannot get along. The Final Solution is not about extermination anymore but about segregated nationalism. Spencer’s audience is almost entirely men who loathe the current state of things, who share far-reaching conspiracy theories, and who despise globalism.

The left, too, creates conspiracies, imagining a bourgeois corporate conglomerate that enlists economists and brainwashes through history books to normalize capitalism; for this reason they despise globalism as well, saying it impoverishes other countries or destroys cultural autonomy. For the alt-right, it is the Jews and George Soros who control us; for the burgeoning socialist left, it is the elites, the one percent. Our minds are not free; fortunately, each side will happily supply Übermenschen, in the form of statesmen or critical theorists, to save us from our degeneracy or our false consciousness.

Without a commitment to reasoned debate, tribalism has deepened the polarization and the lack of humility. Each side also accepts science selectively, if it does not question its very justification. The privileged status that the “scientific method” maintains in polite society is denied when convenient; whether it is climate science, evolutionary psychology, sociology, genetics, biology, anatomy, or, especially, economics, one side or the other rejects it outright, without studying the material enough to immerse itself in what could be promising knowledge (as Feyerabend urged, and as the breakdown of rationality could have encouraged). And ultimately, equal protection, one tenet of individualist thought that allows for multiplicity, is entirely rejected by both: we should be treated differently as humans, often because of the color of our skin.

Relativism and carelessness about standards and communication have given us supremacy and tribalism. They have divided rather than united. Voldemort’s chaotic violence is one possible outcome of rejecting reason as an institution, and it beckons to either political alliance. Are there any examples in Harry Potter of the alternative, Feyerabendian tolerance? Not quite. However, Hermione Granger serves as the Dark Lord’s foil, and gives us a model of reason that is not as archaic as the enemies of rationality would like to suggest. In Against Method (1975), Feyerabend compares different ways rationality has been interpreted alongside practice: in an idealist way, in which reason “completely governs” research, or a naturalist way, in which reason is “completely determined by” research. Taking elements of each, he arrives at an intersection in which one can change the other, both “parts of a single dialectical process.”

“The suggestion can be illustrated by the relation between a map and the adventures of a person using it or by the relation between an artisan and his instruments. Originally maps were constructed as images of and guides to reality and so, presumably, was reason. But maps, like reason, contain idealizations (Hecataeus of Miletus, for example, imposed the general outlines of Anaximander’s cosmology on his account of the occupied world and represented continents by geometrical figures). The wanderer uses the map to find his way but he also corrects it as he proceeds, removing old idealizations and introducing new ones. Using the map no matter what will soon get him into trouble. But it is better to have maps than to proceed without them. In the same way, the example says, reason without the guidance of a practice will lead us astray while a practice is vastly improved by the addition of reason.” p. 233

Christopher Hitchens pointed out that Granger sounds like Bertrand Russell at times, as in this quote about the Resurrection Stone: “You can claim that anything is real if the only basis for believing in it is that nobody has proven it doesn’t exist.” Granger is often the embodiment of anemic analytic philosophy, the institution of order, a disciple for the Ministry of Magic. However, though initially law-abiding, she quickly learns with Potter and Weasley the pleasures of rule-breaking. From the first book onward, she is constantly at odds with the de facto norms of the school, becoming more rebellious as time goes on. It is her levelheaded foundation, combined with her ability to transgress rules, that gives her an astute semi-deontological, semi-utilitarian calculus capable of saving the lives of her friends from the dark arts, and of helping to defeat the tyranny of Voldemort foretold by Socrates.

Granger presents a model of reason akin to Feyerabend’s map analogy. Although pure reason gives us an outline of how to think about things, it is not a static or complete blueprint, and it must be fleshed out with experience, risk-taking, discovery, failure, loss, trauma, pleasure, offense, criticism, and occasional transgressions past the foreseeable limits. Adding these to our heuristics means that we explore a more diverse account of thinking about things and moving around in the world.

When reason is increasingly seen as patriarchal, Western, and imperialist, the only thing consistently offered as a replacement is something like lived experience. Some form of this idea is at least a century old, going back to Husserl, and still modest by reason’s Greco-Roman standards. Yet lived experience has always been pivotal to reason; we need only adjust our popular model. And we can see that we need not reject one or the other entirely. Another critique of reason says it is foolhardy, limiting, antiquated; this is a perversion of its abilities, and plays into the first criticism. We can see that there is room within reason for other pursuits and virtues, picked up along the way.

The emphasis on lived experience, which predominantly comes from the political left, is also antithetical to the cause of “social progress.” Those sympathetic to social theory, particularly the cultural leakage of the strong programme, are constantly torn between claiming (a) that science is irrational, and can thus be countered by lived experience (or whatnot), or (b) that science may be rational but reason itself is a tool of patriarchy and white supremacy and cannot be universal. (If you haven’t seen either of these claims very frequently, and think them a strawman, you have not been following university protests and editorials. Or radical Twitter: ex., ex., ex., ex.) Of course, as in Freud, this is an example of kettle logic: the signal of a very strong resistance. We see, though, that we need not accept or deny these claims, and we lose nothing in declining to do so. Reason need not be stagnant nor all-pervasive, and indeed we’ve been critiquing its limits since 1781.

Outright denying the process of science — whether the model is conjectures and refutations or something less stale — ignores that there is no single uniform body of science. Denial also dismisses the most powerful tool for making difficult empirical decisions. Michael Brown’s death was instantly a political affair, with implications for broader social life. The event has completely changed the face of American social issues. The first autopsy report, from St. Louis County, indicated that Brown was shot at close range in the hand, during an encounter with Officer Darren Wilson. The second, independent report, commissioned by the family, concluded the first shot had not in fact been at close range. After the disagreement with my cousin, the Department of Justice released its final investigation report, determining that material in the hand wound was consistent with gun residue from an up-close encounter.

Prior to the report, the best evidence available as to what happened in Missouri on August 9, 2014, was the ground footage after the shooting and testimony from the officer and from Ferguson residents at the scene. There are two ways to approach the incident: reason or lived experience. The latter route will lead to ambiguities. Brown’s friend Dorian Johnson and another witness reported that Officer Wilson fired his weapon first at range, under no threat, then pursued Brown out of his vehicle, until Brown turned with his hands in the air to surrender. However, before the St. Louis grand jury, half a dozen (African-American) eyewitnesses corroborated Wilson’s account: that Brown did not have his hands raised and was moving toward Wilson. In which direction does “lived experience” tell us to go, then? A new moral maxim — the duty to believe people — will lead to no non-arbitrary conclusion. (And a duty to “always believe x,” where x is a closed group, e.g. victims, will put the cart before the horse.) It appears that, in a case like this, treating evidence as objective is the only solution.

Introducing ad hoc hypotheses, e.g., the Justice Department and the county examiner are corrupt, shifts the approach into one that uses induction, and leaves behind lived experience (and also ignores how forensic anthropology is actually done). This is the introduction of, indeed, scientific standards. (By looking at incentives for lying it might also employ findings from public choice theory, psychology, behavioral economics, etc.) So the personal experience method creates unresolvable ambiguities, and presumably will eventually grant some allowance to scientific procedure.

If we don’t posit a baseline rationality — Hermione Granger pre-Hogwarts — our ability to critique things at all disappears. Utterly rejecting science and reason, denying objective analysis on the presumption of overriding biases, breaking down naïve universalism into naïve relativism — these are paths to paralysis on their own. More than that, they are hysterical symptoms, because they often create problems out of thin air. Recently, a philosopher and a mathematician submitted a hoax paper, Sokal-style, to a peer-reviewed gender studies journal in an attempt to demonstrate what they see as a problem “at the heart of academic fields like gender studies.” The idea was to write a nonsensical, postmodernish essay; if the journal accepted it, that would indicate the field is intellectually bankrupt. Andrew Smart at Psychology Today instead wrote of the prank: “In many ways this academic hoax validates many of postmodernism’s main arguments.” And although Smart makes some informed points about problems in scientific rigor as a whole, he doesn’t hint at what the validation of postmodernism entails: should we abandon standards in journalism and scholarly integrity? Is the whole process of peer review functionally untenable? Should we start embracing papers written without any intention of making sense, to look at knowledge concealed below the surface of jargon? The paper, “The conceptual penis,” doesn’t necessarily condemn the whole of gender studies; but, against Smart’s reasoning, we do in fact know that counterintuitive or highly heterodox theory is considered perfectly average.

There were other attacks on the hoax, from Slate, Salon, and elsewhere. Criticisms, often valid for the particular essay, typically didn’t move the conversation far enough. There is much more to this discussion. A 2006 paper from the International Journal of Evidence Based Healthcare, “Deconstructing the evidence-based discourse in health sciences,” called the use of scientific evidence “fascist.” In the abstract the authors state their allegiance to the work of Deleuze and Guattari. Real Peer Review, a Twitter account that collects abstracts from scholarly articles, regularly features essays from departments of women’s and gender studies, including a recent one from a Ph.D. student wherein the author identifies as a hippopotamus. Sure, the recent hoax paper doesn’t really say anything, but it intensifies this much-needed debate. It brings out these two currents — reason and the rejection of reason — and demands a solution. And we know that lived experience is often going to be inconclusive.

Opening up lines of communication is a solution. One valid complaint is that gender studies seems too insulated, in a way in which chemistry, for instance, is not. Critiquing a whole field does ask us to genuinely immerse ourselves first, and this is a step toward tolerance: it is a step past the death of reason and the denial of science. It is a step that requires opening the bubble.

The modern infatuation with human biases, along with Feyerabend’s epistemological anarchism, upsets our faith in prevailing theories and in the idea that our policies and opinions should be guided by the latest discoveries from an anonymous laboratory. Putting politics first and assuming subjectivity is all-encompassing, we move past objective measures for comparing belief systems and theories. However, isn’t the whole operation of modern science designed to work within our means? Kant’s system set limits on human rationality, and most science is aligned with an acceptance of fallibility. As the Harvard cognitive scientist Steven Pinker says, “to understand the world, we must cultivate work-arounds for our cognitive limitations, including skepticism, open debate, formal precision, and empirical tests, often requiring feats of ingenuity.”

Pinker goes so far as to advocate for scientism. Others need not; but we must understand an academic field before utterly rejecting it. We must think we can understand each other, and live with each other. We must think there is a baseline framework that allows permanent cross-cultural correspondence — a shared form of life which means a Ukrainian can interpret a Russian and a Cuban an American. The rejection of Homo sapiens commensurability, championed by people like Richard Spencer and those in identity politics, is a path to segregation and supremacy. We must reject Gorgian nihilism about communication, and the Presocratic relativism that traps our moral judgments in inert subjectivity. From one Weltanschauung to the next, our common humanity — which endures across class, ethnicity, sex, and gender — allows open debate across paradigms.

In the face of relativism, there is room for a nuanced middle ground between Pinker’s scientism and the rising anti-science, anti-reason philosophy; Paul Feyerabend has sketched out a basic blueprint. Rather than condemning reason as a Hellenic germ of Western cultural supremacy, we need only adjust the theoretical model to incorporate the “new America of knowledge” into our critical faculty. It is the raison d’être of philosophers to present complicated things in a more digestible form; to “put everything before us,” as Wittgenstein says. Hopefully, people can reach their own conclusions, and embrace the communal human spirit as they do.

However, this may not be so convincing. It might be true that we have a competition of cosmologies: one that believes in reason and objectivity, and one that thinks reason is callow and all things are subjective. These two perspectives may well be incommensurable. If I try to defend reason, I invariably must appeal to reasons, and thus argue circularly. If I try to claim “everything is subjective,” I make a universal statement, and simultaneously contradict myself. Between begging the question and contradicting oneself, there is not much indication of where to go. Perhaps we just have to look at history, note the results of either course when it has been applied, and take those results as a rhetorical indication of which path to follow.

The existentialist origins of postmodernism

In part, postmodernism has its origin in the existentialism of the 19th and 20th centuries. The Danish theologian and philosopher Søren Kierkegaard (1813-1855) is generally regarded as the first existentialist. Kierkegaard’s life was profoundly marked by a broken engagement and by his discomfort with the formalities of the (Lutheran) Church of Denmark. In his understanding (shared by others of the time within a movement known as Pietism, influential mainly in Germany but with a strong influence on the English Methodism of John Wesley), Lutheran theology had become overly intellectual, marked by a “Protestant scholasticism.”

Before this period, scholasticism was a branch of Catholic theology whose main representative was Thomas Aquinas (1225-1274). Aquinas argued against the theory of the double truth, defended by Muslim theologians of his time. According to this theory, something could be true in religion and not be true in the empirical sciences. Aquinas defended a classic concept of truth, used centuries earlier by Augustine of Hippo (354-430), to affirm that the truth could not be so divided. Martin Luther (1483-1546) made many criticisms of Aquinas, but ironically the methodological precision of the medieval theologian became quite influential in the Lutheran theology of the 17th and 18th centuries. In Germany and the Nordic countries (Denmark, Finland, Iceland, Norway, and Sweden), Lutheranism became the state religion after the Protestant Reformation of the 16th century, and being the pastor of a church in a major city became a respected and coveted public office.

It is against this intellectualism and this ease of being Christian that Kierkegaard revolted. In 19th-century Denmark, everyone was born within the Lutheran Church, and being a Christian was the socially accepted position. Kierkegaard complained that in centuries past being a Christian was not easy, and could even involve life-threatening events. In the face of this he argued for a Christianity that involved an individual decision against all evidence. In one of his most famous texts he expounds the story in which the patriarch Abraham is asked by God to kill Isaac, his only son. Kierkegaard imagines a scenario in which Abraham does not understand the reasons of God, but ends up obeying blindly. In Kierkegaard’s words, Abraham makes “a leap of faith.”

This concept of blind faith, going against all the evidence, is central to Kierkegaard’s thinking, and became very influential in twentieth-century Christianity and even in other Western-established religions. Beyond the strictly religious aspect, Kierkegaard marked Western thought with the notion that some things might be true in some areas of knowledge but not in others. Moreover, his influence can be seen in the notion that the individual must make decisions about how he intends to exist, regardless of the rules of society or of all empirical evidence.

Another important existentialist philosopher of the 19th century was the German Friedrich Nietzsche (1844-1900). Like Kierkegaard, Nietzsche was raised within Lutheranism but, unlike Kierkegaard, he became an atheist in his adult life. Like Kierkegaard, Nietzsche also became a critic of the social conventions of his time, especially the religious conventions. Nietzsche is particularly famous for the phrase “God is dead.” This phrase appears in one of his most famous texts, in which the Christian God attends a meeting with the other gods and affirms that he is the only god. In the face of this statement the other gods die of laughter. The Christian God effectively becomes the only god. But later, the Christian God dies of pity upon seeing his followers on earth become people without courage.

Nietzsche was particularly critical of how the Christianity of his day valued features he considered weak, calling them virtues, and condemned features he considered strong, calling them vices. And not just Christianity: Nietzsche also criticized the classical philosophy of Socrates, Plato, and Aristotle, placing himself alongside the sophists. The German philosopher claimed that Socrates valued behaviors like kindness, humility, and generosity simply because he was ugly. More specifically, Nietzsche questioned why classical philosophers defended Apollo, considered the god of wisdom, and criticized Dionysus, considered the god of debauchery. In Greco-Roman mythology Dionysus (or Bacchus, as he was known to the Romans) was the god of festivals, wine, and madness, symbolizing everything that is chaotic, dangerous, and unexpected. Thus, Nietzsche questioned the apparent arbitrariness of the defense of Apollo’s rationality and order against the irrationality and unpredictability of Dionysus.

Nietzsche’s philosophy values courage and voluntarism, the urge to go against “herd behavior” and become a “superman,” that is, a person who goes against the dictates of society to create his own rules. Although he went in a different religious direction from Kierkegaard, Nietzsche agreed with the Danish theologian on the necessity for the individual to go against convention and reason in order to dictate the rules of his own existence.

In the second half of the 20th century existentialism became an influential philosophical current, represented by people like Jean-Paul Sartre (1905-1980) and Albert Camus (1913-1960). Like their 19th-century predecessors, these existentialists pointed to the apparent absurdity of life and valued decision-making by the individual against rational and social dictates.

Dear Greeks:

I hear you can’t pay your debts again. I am a little sorry but you brought it on yourselves. A few reminders.

Your country is a democracy. The way you got into this pickle is through the stupid, self-indulgent policies of those you elected. You did it again in your last election by bringing to power a bragging leftist party in the old Stalinist mold. What did you think they would do: frighten the European Union, the International Monetary Fund (number-one stockholder: the US), Germany, the world, into submission, into erasing your debt? Think!

The reason Germany is your principal creditor is that one of your previous governments begged Germany for help and it agreed to help. The Germans did not cram loan after loan down your throat; you asked. The big sillies thought you would be honorable and pay up as agreed. Do you care about your future reputation, your honor, your children’s future ability to walk in the world with their heads up? Here is a basic rule of politeness which is also a moral rule: When somebody gives you a hand, you don’t bite it viciously.

There are several reasons your government can’t pay its debts. One reason is that your political class is corrupt through and through. Another is that you are reluctant to pay taxes the way normal people do in the European Union. Too many Greeks want to work, and pretend-work, for the government instead of doing real work. And your government still owns things no government anywhere should ever own (resorts, among others) because governments always make a mess of running them.

Another reason why your government can’t pay its bills is that your country is genuinely poor for a European country. There too, you have a lot of explaining to do. For one thing, you have been living above your means for a long time, pretending you were more or less like Danes or Germans. Well, the truth is that you are not, not even close; Danes and Germans are very productive; you are not. So, you should never have expected to work short weeks and take long summer vacations like Danes and Germans. Such privileges do not come automatically with membership in the Union, you know. You should look over the border at your despised neighbors, the Turks, instead. They don’t pretend to themselves that they are already rich; they go to work early and they close their shops late. Many of them work six days a week. Over the past ten years, the growth rate of their economy has left yours in the dust. Coincidence?

And you only make yourselves even more scorned with your treatment of others. The real horrors that Nazi Germany inflicted on Greece more than 70 years ago are not much of an excuse anymore. A previous government of yours, an elected government, accepted reparations a long time ago. And, by the way, in 1945 Germany was much more devastated than Greece, and still was in 1948. See where the Germans are now, and where you are? Any comment?

And do you ever wonder why the Estonians, locked in the stultifying Soviet prison for fifty years, never ask for new loans to pay back older loans? And how long, anyway, did you expect German workers to work until age 69 so your public servants could continue to retire at 63? Are you out of your minds?

One last thing: you are not exactly Classical Greece. Stop wrapping yourselves in Aristotle’s toga. Really study Socrates. He chose to die rather than cheat even a little. Neither he nor Aristotle was a whiner. That’s why they are still remembered and honored.

In the end, I wish you well. Everyone can unlearn bad habits and learn basic rationality, even late in life. I hope you soon leave that club where you don’t belong. I hope further that you can make your way back. Begin by getting up at 6 every morning. Also, learn the obvious: socialism does not work well for rich countries; it’s miserable for poor countries.

Why I Reject Marxism

I was recently given a copy of History of Political Philosophy edited by Leo Strauss and his successor, Joseph Cropsey. It’s a superb book, a well curated collection of essays by distinguished scholars in the field covering the time period from Thucydides to Martin Heidegger. Each essay succinctly covers, in about 20-30 pages, the political thought, life, and times of each of the figures studied. You really all should buy it.

That’s not the reason I’m writing, though. I’ve always felt a visceral disdain for Marxism, from the repugnant nature of its premises (I side with Aristotle: “But that the unequal should be given to equals, and the unlike to those who are like, is contrary to nature, and nothing which is contrary to nature is good,” Politics Book VII 3.5, 1325b) to the ugly cant of its diction. There is nothing in it that appeals to me, and its followers whom I have encountered, either snide petit bourgeois professors with manicured fingernails and soft hands, or the spoiled and deluded children of middle-class families, have not helped my perception. What has always interested me is that I have never met a single member of the proletariat who has actually referred to himself as such, or who has shown anything but a similarly gut-level hatred of Marxist rhetoric. Marx’s own anti-Semitism is not endearing to a Jew like me, either (I recommend Sander Gilman’s work, Jewish Self-Hatred, for those who would like more information).

Despite my misgivings, I had not found such a beautiful encapsulation of why I reject Marxism until I read Cropsey’s essay on the great thinker:

Unexpectedly, we now see coming into view a ground of agreement between ancients and pre-Marxian moderns on this most important point: political life rests upon the imperfection of man and continues to exist because human nature rules out the elevation of all men to the level of excellence. The connection between civil government and man’s imperfection is expressed by Rousseau, for example, in the form of the distinction between state and society: men can be social while uncorrupted, but in political community they prey and are preyed upon by one another. At the beginning of Common Sense, Thomas Paine wrote, “Society is produced by our wants, and government by our wickedness; the former promotes our happiness positively by uniting our affections, the latter negatively by restraining our vices… The first is a patron, the last a punisher.”

Rousseau may be said to have suggested, via the doctrine of the perfectibility of man, that government may be more and more replaced by society: in the perfect freedom of self-government, coercion loses most of its sting. But Rousseau did not at all suppose that all men would become philosophic, nor that there is any perfect substitute for the full rationality of men that would render coercion and rhetoric of all kind, i.e. political life, dispensable. He did not, in brief, expect ordinary selfishness simply to disappear from among the generality of men.

What in Rousseau was a limited suggestion, although an emphatic one, came to be the dogmatic core of a confident prognosis, a strident propaganda, and a revolutionary incitation in Marx: the state or political order will wholly wither away, and homogeneous mankind will live socially under the rule of absolute benevolence – from each according to his ability, to each according to his needs. No longer will duty be performed incidentally to the pursuit of selfish interest. The link between duty and interest, which is to say the subordination of duty to interest, will be broken once and for all by the abolition of the categories ‘duty’ and ‘interest.’ They will be abolished by the revision of the property relations, by the inauguration of a new economics which will bring on the full perfection of human nature via the transcendence of production for exchange.

Marxism is not simply another political system, or one more ideology. It proposes nothing less than the end of the West – of political life, philosophy, and religion – as the foregoing summary indicates. Perhaps we should look forward with eager anticipation to the end of the West – but we cannot know whether we should without rationally examining the project for strangling philosophy. That rational examination is part of the philosophical quest itself. We cannot free ourselves of philosophy, if only because we must philosophize to pass judgment on philosophy. We begin to suspect the soundness of the anti-philosophic historicism of Marx. Observing its weakness prepares us to concede that history can make room for spiritually impoverished societies: the viability of Marxist nations is a sign not of the soundness of Marx’s prophecy but of the unsoundness of the sanguine historicism on which he based it. We have every right to conclude that history is the opiate of the masses.

Marx’s utopia is impossible because he desires perfection in men, perfection that pre-Marxian philosophers rightly saw as within the province of no one but the philosophical sages. Successful Marxists who came after him maintained their faith in the prognostications of orthodox Marxism in principle but rejected it in practice. Mao, for example, rejected his more zealous comrades’ complaints that he had not abolished capitalism in the countryside, arguing that to do so would be inappropriate for a China that had never had a capitalist economy. The fruits of his relative moderation are seen in the totalitarian state he created, which pursued no economic or social development toward any goal but the consolidation of the Politburo’s power. When Marxists claim that there has never been a truly socialist state, they are correct, but not for the reasons they think. It is not because socialism has not been fully tried, but because fully trying it is impossible. This impossibility is rooted in man’s very nature: biologically, physically, and culturally determined but, crucially, determined without an end. There is no final cause in history. There is only flux, a notion Marx inherited from Heraclitus and stupidly wed to Hegelian progressivism, coming up with the paradoxical idea that the historical change by which past societies arose and fell apart would somehow come to an end in the decay of capitalism.

Libertarianism is an equally utopian vision, but the prospects of a libertarian project are considerably rosier, as its utopia does not call for the radical transformation of the human being. Indeed, doing so would conflict with its fundamental principles. Rather, it calls for the harnessing of man’s most self-interested tendencies for a good purpose: selfishness leading to production, production leading to trade, trade to peace and prosperity. The realization of libertarianism in practice is possible. It is a vision not of better selves, but of a better self-actualization of the selves that we possess. Realizing this, we must not make the same mistake that Marx made, for as the limited popularity of our philosophy shows, we are not the majority. Not even close. We probably never will be. The political life, the state and its coercions, will likely never cease to be a factor in our lives. But what we can do, and what we ought to do, is actualize in ourselves the faculty of self-rule, upon which may be built the more virtuous state.

Why Republican Libertarianism? III

(This text was written for the European Students for Liberty Regional Conference in Istanbul at Boğaziçi University. I did not deliver the paper, but used it to gather thoughts which I then presented in an improvised speech. As it was quite a long text, I am breaking it up for the purposes of blog presentation.)

There is a gap between ancient Athens and classical liberalism, and covering that gap will explain more about the development from antique republics to modern liberty. The trio of major antique republican thinkers mentioned above, Aristotle, Polybius, and Cicero, sets up the tradition. They establish the idea of the best state – polity/politeia in Greek, republic/res publica in Latin – as one of sharing political power between groups in the context of shared citizenship and decision making.

For Aristotle, that is the sharing of power between oligarchs (the rich, in practice those wealthy through commerce), aristocrats (the virtuous, in practice the educated land-owning classes), and the poor majority. Polybius was a later Greek thinker who admired the Roman republic, and Cicero was a Roman aristocrat-philosopher from the last years before the republic gave way to one-man imperial rule.

Both use arguments from Aristotle but tend to refer to Sparta rather than Athens as the ideal republic, which indicates the difficulty antique thought had in accepting a commercial and free-thinking republic as a model. Polybius and Cicero both admire the Roman system because they see it as based on law and on sharing power between the people (citizens’ assembly), the aristocracy (senate), and a monarchical function shared between two co-rulers serving one-year terms (the consuls).

Their arguments also rest on the idea of the state as military camp. It is interesting to note that Pettit, the egalitarian liberal, prefers this Roman model to Athens, while Arendt prefers the Athenian model. This suggests that Arendt has something to say to classical liberals and libertarians, though she is rarely taken up within that group, and that egalitarian liberalism is rather caught up in strong-state ideas: a state strong enough to force redistribution of economic goods rather than to impose an extreme military spirit on its citizens, but a strong, intervening state nonetheless.

All three of the ancient republican thinkers had difficulty with the idea of a commercially orientated republic and held some idea of virtue as restraining wealth, though Cicero in particular was staggeringly rich. This suggests that ancient republican thought had more difficulty accommodating the commercial spirit than some ancient republics did in practice.

There is one major step left in ancient republican thinking: the account that the senator-historian Tacitus, writing in the early imperial period, gives of liberty in the simple tribal republics of the ancient Germans and Britons. He sees them as based on independence of spirit and a willingness to die for that independence, in a way largely lacking amongst the Romans of his time.

The admiration for such ‘barbarian’ liberty also gives some insight into the difficulty of combining commercial spirit with republicanism in ancient thinking. Wealth is seen as something tied to benefits from the state, to state patronage, and so as reducing independence from the state, whether the local state or a foreign invading state.

Republicanism takes the next great step forward when some way of thinking of wealth as existing at least partly independently of state patronage appears. This is what happens in northern Italy from about the thirteenth century. To some degree this Italian republicanism has older roots in the maritime republic of Venice, but there the trading wealth was still very tied up with aristocratic status and a rigid aristocratic hold on politics.

It is Florence, which serves as a thirteenth-, fourteenth-, and fifteenth-century Athens, where Italian culture, commercial wealth, and republican thinking all thrive. The cultural greatness goes back to the poet Dante and the republicanism to his tutor Brunetto Latini. The really great moment in Florentine republicanism comes in the fifteenth and early sixteenth centuries, though, with Francesco Guicciardini, but mostly with Niccolò Machiavelli.

Commentary on Machiavelli is heavily burdened by the image of Evil Machiavel or at least of Machiavelli the cynical advocate of power politics in The Prince. This is just a completely false image of a man whose ideal was the revival of the Roman republic, not the rule of absolute and absolutely immoral princes.

The supposed wickedness and cynicism of The Prince relate to comments on how kings seize and maintain power, in which, insofar as Machiavelli advocates rather than analyses, he advocates minor acts of political violence. The age of Machiavelli was the age of the Catholic Inquisition torturing heretics and passing them to the state to be burned at the stake, of the mass persecution and expulsion of Iberian Jews and Muslims, and of wars of religion and conquest that involved systematic and mass destruction of property, torture, rape, and murder.

Those who chose to condemn the ‘wickedness’ of Machiavelli at the time were often those engaged in such activities. Machiavelli’s advice to princes does no more than advocate, at the most extreme, very limited amounts of violence to institute and maintain rule, certainly very limited by the standards of the time.