- Why we fight over fiction Robin Hanson, Overcoming Bias
- Why money matters Scott Sumner, MoneyIllusion
- Stand up with Aristotle Irfan Khawaja, Policy of Truth
- Never reason from a fatality change Nick Cowen, NOL
Aristotle
Nightcap
- Expanding the Liberty Canon: Aristotle Barry Stocker, NOL
- Expanding the Liberty Canon: Rome and Carthage in the Histories of Polybius Barry Stocker, NOL
- Expanding the Liberty Canon: Cicero’s On the Republic Barry Stocker, NOL
- Florentine Liberty II: Guicciardini, Dialogue on the Government of Florence Barry Stocker, NOL
Nightcap
- Migration in Europe? Where to start! Kapka Kassabova, Spectator
- Aristotle’s definition of citizenship John Hungerford, Law & Liberty
- Michelangelo’s definition of citizenship M Landgrave, NOL
- What can the Catholic Church do? John Cornwell, Financial Times
The State in education – Part III: Institutionalization of learning
In The State in education – Part II: social warfare, we looked at the promise of state-sponsored education and its failure, both socially and as a purveyor of knowledge. The next step is to examine the university, especially since higher education is deeply linked to modern society and because the public school system purports to prepare its students for college.
First, though, there should be a little history on higher education in the West for context since Nietzsche assumed that everyone knew it when he made his remarks in Anti-Education. The university as an abstract concept dates to Aristotle and his Peripatetic School. Following his stint as Alexander the Great’s tutor, Aristotle returned to Athens and opened a school at the Lyceum (Λύκειον) Temple. There, for a fee, he provided the young men of Athens with the same education he had given Alexander. On a side note, this is also a beautiful example of capitalist equality: a royal education was available to all in a mutually beneficial exchange; Aristotle made a living, and the Athenians received brains.
The Lyceum was not a degree-granting institution; only a man’s knowledge of philosophy, history, literature, and language, and his debating skills, revealed that he had studied there. A cultural premium on bragging rights soon followed, though, and famous philosophers opening immensely popular schools became de rigueur. By the rise of Roman imperium in the Mediterranean around 250 BC, Hellenic writers included their intellectual pedigrees, i.e. all the famous teachers they had studied with, in their introductions as a credibility passport. The Romans were avid Hellenophiles and adopted everything Greek wholesale, including the concept of the lyceum-university.
Following the Dark Ages (and not getting into the debate over whether the time was truly dark or not), the modern university emerged in 1088, with the Università di Bologna. It was more of a club than an institution; as Robert S. Rait, the early 20th-century medieval historian, remarked in his book Life in the Medieval University, the original meaning of “university” was “association,” and the word was not used exclusively for education. The main attractions of the university as a concept were that it was secular and that it provided access to books, which were prohibitively expensive at the individual level before the printing press. A survey of the profiles of Swedish foreign students enrolled at Leipzig University between 1409 and 1520 shows that the average male student was either destined for the clergy on a prelate track or of noble extraction. As the article points out, none of the students who later joined the knighthood formally graduated, but the student body is indicative of the associative nature of the university.
The example of Lady Elena Lucrezia Cornaro Piscopia, the first woman to receive a doctoral degree, awarded by the University of Padua in 1678, illuminates the difference between “university” in its original intent and the institutional concept. Cornaro wrote her thesis independently, taking the doctoral exams and defending her work when she and her advisor felt she was ready. Enrollment and attendance at classes were deemed so unnecessary that she skipped both the bachelor’s and master’s stages. What mattered was that a candidate knew the subject, not the method of acquisition. Even in the mid-19th century this path remained open to remarkable scholars such as Nietzsche, to whom Leipzig University awarded a doctorate on the basis of his published articles rather than a dissertation and defense.
Education’s institutionalization, i.e. the focus shifting more from knowledge to “the experience,” accompanied a broader societal shift. Nietzsche noted in Beyond Good and Evil that humans have an inherent need for boundaries and systemic education played a very prominent role in contemporary man’s processing of that need:
There is an instinct for rank which, more than anything else, is a sign of a high rank; there is a delight in the nuances of reverence that allows us to infer noble origins and habits. The refinement, graciousness, and height of a soul is dangerously tested when something of the first rank passes by without being as yet protected by the shudders of authority against obtrusive efforts and ineptitudes – something that goes its way unmarked, undiscovered, tempting, perhaps capriciously concealed and disguised, like a living touchstone. […] Much is gained once the feeling has finally been cultivated in the masses (among the shallow and in the high-speed intestines of every kind) that they are not to touch everything; that there are holy experiences before which they have to take off their shoes and keep away their unclean hands – this is almost their greatest advance toward humanity. Conversely, perhaps there is nothing about so-called educated people and believers in “modern ideas” that is as nauseous as their lack of modesty and the comfortable insolence in their eyes and hands with which they touch, lick, and finger everything [….] (“What is Noble,” 263)
The idea the philosopher pursued was the notion that university attendance conveyed the future right to “touch, lick, and finger everything,” a very graphic and curmudgeonly way of saying that a certain demographic assumed unjustified airs.
In Anti-Education, Nietzsche lamented the fragmentation of learning into individual disciplines, which caused students to lose a sense of the wholeness, the universality, of knowledge. What he hated in the nouveau educated, if we will, was the rise of the pseudo-expert – a person whose knowledge was confined to the bounds of a fixed field but who was revered as omniscient. The applicability of Socrates’ dialogue with Meno – the one where teacher and student discuss the human tendency to lose sight of the whole in pursuit of individual strands – to the situation was unmistakable, something which Nietzsche, a passionate classicist, noticed. The loss of the Renaissance learning model, the trivium and the quadrivium, both of which emphasize an integrated learning matrix, carried with it a belief that excessive specialization was positive; it was a very perverse version of “jack of all trades, master of none.” As Nietzsche bemoaned, the newly educated desired masters without realizing that all they obtained were jacks. In this, he foreshadowed the disaster of the Versailles Treaty in 1919 and the consequences of Woodrow Wilson’s unwholesome belief in “experts.”
The philosopher squarely blamed the model of the realschule, with its clear-cut subjects and predictable exams, for the breakdown between knowledge acquisition and learning. While he excoriated the Prussian government for basing all public education on the realschule, he admitted that the fragmentation of the university into departments and majors occurred at the will of the people. This was a “chicken or the egg” situation: Was the state or broader society responsible for university learning becoming more like high school? This was not a question Nietzsche was interested in answering since he cared more about consequences. However, he did believe that the root was admitting realschule people to university in the first place. Since such a hypothesis is very applicable today, we will examine it in the contemporary American context next.
Inventions that didn’t change the world
Have you ever learned about an amazing invention–whether it was the Baghdad battery or the ancient Roman steam engine or Chinese firecrackers–and wondered why it didn’t do more to change the world? In this podcast, we examine a selection of curiosities and explore hypotheses for why their inventors didn’t use them to full effect.
We move VERY quickly through a range of fascinating examples and hypotheses, and therefore leave a lot up to discussion. We hope to see your thoughts, feedback, and additions in the comments section!
For any invention that you want to learn more about, see the links below:
Knossos’ toilets
In the 2nd millennium BC, a “palace” (now thought to be a building that served as an administrative, trade, and gathering hub) had flushing toilets with running water. Much like the Roman Cloaca Maxima, this was likely a HUGE public-health benefit, but it basically died out. Does this show that military protection/staving off the “Dark Ages” was the only way to maintain amazing inventions?
Link: http://www.nature.com/news/the-secret-history-of-ancient-toilets-1.19960
The Nimrud lens
Whether it was a fire-starter, a magnifying glass, or (for some overeager astronomy enthusiasts) something more, the Neo-Assyrian ground-crystal Nimrud lens is an invention thousands of years out of place. While the Egyptians, Greeks, and Romans all used lenses of different sorts, and glass-blowing was certainly popular by the 1st century BC in Roman Egypt, no glass lenses were made until the Middle Ages, and the potential scientific and engineering uses of lenses–which can hardly be overstated even in their 16th-to-18th-century applications–had to wait another couple of millennia. Many devices like the Baghdad battery and the Antikythera device are heralded for their possible engineering genius, but this seems like a simple one with readily available applications that disappeared from the historical record.
https://en.wikipedia.org/wiki/Nimrud_lens
Hero of Alexandria’s steam engine
In the 1st century AD, Hero was a master of simple machines (that were mostly used for plays) and also invented a force pump, a wind-powered machine, even an early vending machine. However, he is likely most famous for his Aeolipile, a rotating steam engine that used heated water to spin an axle. The best attested use of this is for devotion to the divine and party tricks.
https://en.wikipedia.org/wiki/Aeolipile
The ancient mechanical reaper
Ancient Gallo-Romans (or just Gauls) invented a novel way of harvesting grain: rather than using sickles or scythes, they used a mechanical reaper, 1,700 years before Cyrus McCormick more than tripled the productivity of American farmers. This antiquated device literally put the cart before the oxen and required two men to operate: one to drive the beasts, and another to knock the ears off the stalks (this reaper was obviously far less sophisticated than McCormick’s). The invention did not survive the Völkerwanderung period.
http://www.gnrtr.com/Generator.html?pi=208&cp=3
http://reapertakethewheel.blogspot.com/2013/03/impacts-of-invention.html
Note: the horse collar (which allowed horses to be used to plow) was invented in 1600-1400 BC in China AND the Levant, but was not applied widely until 1000 AD in Europe. https://en.wikipedia.org/wiki/Horse_collar.
Inoculation
Madhav, an Indian doctor, compiled hundreds of cures in his Nidana, including an inoculation against smallpox that showed an understanding of disease transmission (he would take year-old smallpox-infected flesh and touch it to a recently made cutaneous wound). However, the next 13 centuries saw neither the development of an Indian medical understanding of viruses or bacteria nor even the spread of copied versions of this technique. https://books.google.com/books?id=Hkc3QnbagK4C&pg=PA105&lpg=PA105&dq=madhav+indian+smallpox+inoculation&source=bl&ots=4RFPuvbf5Y&sig=iyDaNUs4u5N7xHH6-pvlbAY9fcQ&hl=en&sa=X&ved=0ahUKEwic8e-1-JXVAhUp6IMKHfw3DLsQ6AEIOjAD#v=onepage&q=madhav%20indian%20smallpox%20inoculation&f=false
At least, thank god, their methods of giving nose jobs to those who had had their noses cut off as a punishment survived: https://en.wikipedia.org/wiki/History_of_rhinoplasty
The Chinese:
List of all Chinese inventions:
https://en.wikipedia.org/wiki/List_of_Chinese_inventions#Four_Great_Inventions
Gunpowder
Gunpowder was discovered by Chinese alchemists attempting to discover the elixir of life (irony, no?).
https://www.thoughtco.com/invention-of-gunpowder-195160
https://en.wikipedia.org/wiki/Four_Great_Inventions
(maybe a good parallel would be Greek fire, which was used effectively in naval warfare by the Byzantines, but which was not improved upon and the recipe of which is still secret: https://en.wikipedia.org/wiki/Greek_fire)
Printing
The Chinese invented the printing press possibly as early as the 6th century. However, unlike the explosion of literacy seen in much of Europe (particularly Protestant Europe–see our last podcast), the Chinese masses never learned to read. In fact, in 1950 fewer than 20% of Chinese citizens were literate. Compare this to Europe, where some societies (Sweden’s male population, for instance) saw literacy rates as high as 90% within a few centuries of the introduction of the printing press. Why? There may be several reasons–cultural, religious, political–but in our opinion, it would have to be the characters: 100,000 blocks were needed to create a single set.
http://www.nytimes.com/2001/02/12/news/chinas-long-but-uneven-march-to-literacy.html
https://en.wikipedia.org/wiki/History_of_printing_in_East_Asia
They also invented pulped paper by the 2nd century BC: https://en.wikipedia.org/wiki/List_of_Chinese_inventions.
The compass
Invented by 200 BC for divination and used for navigation by the Song dynasty; despite this and the availability of easily colonizable islands within easy sailing distance, the Chinese did not colonize Indonesia, Polynesia, or Oceania, while the Europeans did within the century after they developed the technology and first sailed there.
https://en.wikipedia.org/wiki/History_of_the_compass.
The rudder
While they did not invent the rudder, the Chinese invented the “medial, axial, and vertical” sternpost rudder almost 1,000 years before it became standard in Europe (1st century AD vs the 11th century).
Natural gas
The Chinese discovered “fire wells” (natural gas near the surface) and erected shrines to worship there.
https://link.springer.com/referenceworkentry/10.1007%2F978-1-4020-4425-0_9568
They even understood its potential as a fuel, but never developed it beyond primitive burning and bamboo piping, despite having advanced mining techniques for it by the 1st century BC.
Chinese miscellany:
Hydraulic powered fan: https://en.wikipedia.org/wiki/Fan_(machine)#History
Cupola furnace for smelting and molding iron: https://en.wikipedia.org/wiki/Cupola_furnace.
Coke as a fuel source: https://en.wikipedia.org/wiki/Coke_(fuel).
Belt-drive spinning wheel: https://en.wikipedia.org/wiki/Spinning_wheel.
The Precolumbian wheel
The pre- and early Mayans had toys that utilized primitive wheels, but they did not use wheels for any labor-saving purpose (even their gods were depicted carrying loads on their backs). This may have been because scaling up met with mechanical difficulties, but the potential utility of the wheel, realizable with a bit of investment, sat untapped for centuries.
https://tcmam.wordpress.com/2010/11/11/did-pre-columbian-mesoamericans-use-wheels/
The Tucker:
http://www.smithsonianmag.com/history/the-tucker-was-the-1940s-car-of-the-future-135008742/
The following book contained some of our hypotheses:
The rest of our hypotheses were amalgamated from our disparate classes in economics and history, but none of them are our own or uncommon in academic circles. Thanks for listening!
The death of reason
“In so far as their only recourse to that world is through what they see and do, we may want to say that after a revolution scientists are responding to a different world.”
Thomas Kuhn, The Structure of Scientific Revolutions p. 111
I can remember arguing with my cousin right after Michael Brown was shot. “It’s still unclear what happened,” I said, “based solely on testimony” — at that point, we were still waiting on the federal autopsy report by the Department of Justice. He said that in the video, you can clearly see Brown, back to the officer and with his hands up, as he is shot up to eight times.
My cousin doesn’t like police. I’m more ambivalent, but I’ve studied criminal justice for a few years now, and I thought that if both of us watched this video (no such video actually existed), it was probably I who would have the more nuanced grasp of what happened. So I said: “Well, I will look up this video, try and get a less biased take and get back to you.” He replied, sarcastically, “You can’t watch it without bias. We all have biases.”
And that seems to be the sentiment of the times: bias encompasses the human experience; it subsumes all judgments and perceptions. Biases are so rampant, in fact, that no objective analysis is possible. These biases may be cognitive, like confirmation bias, emotional fallacies, or the phenomenon of constructive memory; or inductive, like selectivity or ignoring base probability; or, as has become common to think, ingrained into experience itself.
The thing about biases is that they are open to psychological evaluation. There are precedents for eliminating them. For instance, one common explanation of racism is that familiarity breeds acceptance and unfamiliarity breeds intolerance (as Reason points out, people farther from fracking sites have more negative opinions of the practice than people closer). So to curb racism (a sort of bias), children should interact with people outside of their singular ethnic group. More clinical methodology seeks to transform mental functions that are automatic into controlled ones, thereby introducing reflective measures into perception and reducing bias. Apart from these, there is that ancient Greek practice of reasoning, wherein patterns and evidence are used to generate logical conclusions.
If it were true that human bias is all-encompassing, and essentially insurmountable, the whole concept of critical thinking goes out the window. Not only do we lose the critical-rationalist, Popperian mode of discovery, but also Socratic dialectic, as essentially “higher truths” disappear from the human lexicon.
The belief that biases are intrinsic to human judgment ignores psychological and philosophical methods to counter prejudice because it posits that objectivity itself is impossible. This viewpoint has been associated with “postmodern” schools of philosophy, such as those Dr. Rosi commented on (e.g., those of Derrida, Lacan, Foucault, Butler), although it’s worth pointing out that the analytic tradition, with its origins in Frege, Russell, and Moore, represents a far greater break from the previous, modern tradition of Descartes and Kant, and often reached conclusions similar to those of the Continentals.
Although theorists of the “postmodern” clique produced diverse claims about knowledge, society, and politics, the most famous figures are almost always associated with or incorporated into the political left. To make a useful simplification of viewpoints: it would seem that progressives have generally accepted Butlerian non-essentialism about gender and Foucauldian terminology (discourse and institutions). Derrida’s poststructuralist critique noted dichotomies and also claimed that the philosophical search for Logos has been patriarchal, almost neoreactionary. (The month before Donald Trump’s victory, searches for the word patriarchy reached an all-time high on Google.) It is not a far-right conspiracy that European philosophers with strange theories have influenced and sought to influence American society; it is patent in the new political language.
Some people think of the postmodernists as all social constructivists, holding the theory that many of the categories and identifications we use in the world are social constructs without a human-independent nature (e.g., not natural kinds). Disciplines like anthropology and sociology have long since dipped their toes in, and the broader academic community, too, accepts that things like gender and race are social constructs. But the ideas can and do go further: “facts” themselves are open to interpretation on this view; to even assert a “fact” is just to affirm power of some sort. This worldview subsequently degrades the status of science into an extended apparatus for confirmation bias, filling out the details of a committed ideology rather than providing us with new facts about the world. There can be no objectivity outside of a worldview.
Even though philosophy took a naturalistic turn with the philosopher W. V. O. Quine, seeing itself as integrating with and working alongside science, the criticisms of science as an establishment that emerged in the 1950s and 60s (and earlier) often disturbed its unique epistemic privilege in society: ideas that theory is underdetermined by evidence, that scientific progress is nonrational, that unconfirmed auxiliary hypotheses are required to conduct experiments and form theories, and that social norms play a large role in the process of justification all damaged the mythos of science as an exemplar of human rationality.
But once we have dismantled Science, what do we do next? Some critics have held up Nazi German eugenics and phrenology as examples of the damage that science can do to society (never mind that we now consider them pseudoscience). Yet Lysenkoism and the history of astronomy and cosmology indicate that suppressing scientific discovery can be just as deleterious. The Austrian physicist and philosopher Paul Feyerabend instead wanted a free society — one where science had equal power with older, more spiritual forms of knowledge. He thought the model of rational science exemplified by Sir Karl Popper was inapplicable to the real machinery of scientific discovery, and that the only methodological rule we could impose on science was: “anything goes.”
Feyerabend’s views are almost a caricature of postmodernism, although he denied the label “relativist,” opting instead for philosophical Dadaist. In his pluralism, there is no hierarchy of knowledge, and state power can even be introduced when necessary to break up scientific monopoly. Feyerabend, contra scientists like Richard Dawkins, thought that science was like an organized religion and therefore supported a separation of church and state as well as a separation of state and science. Here is a move forward for a society that has started distrusting the scientific method… but if this is what we should do post-science, it’s still unclear how to proceed. There are still queries for anyone who loathes the hegemony of science in the Western world.
For example, how does the investigation of crimes proceed without strict adherence to the latest scientific protocol? Presumably, Feyerabend didn’t want to privatize law enforcement, but science and the state are very intricately connected. In 2005, Congress authorized the National Academy of Sciences to form a committee and conduct a comprehensive study on contemporary legal science to identify community needs, evaluating laboratory executives, medical examiners, coroners, anthropologists, entomologists, odontologists, and various legal experts. Forensic science — scientific procedure applied to the field of law — exists for two practical goals: exoneration and prosecution. However, the Forensic Science Committee revealed that severe issues riddle forensics (e.g., bite mark analysis), and in their list of recommendations the top priority is establishing an independent federal entity to devise consistent standards and enforce regular practice.
For top scientists, this sort of centralized authority seems necessary to produce reliable work, and it entirely disagrees with Feyerabend’s emphasis on methodological pluralism. Barack Obama formed the National Commission on Forensic Science in 2013 to further investigate problems in the field, and only recently Attorney General Jeff Sessions said the Department of Justice will not renew the committee. It’s unclear now what forensic science will do to resolve its ongoing problems, but what is clear is that the American court system would fall apart without the possibility of appealing to scientific consensus (especially forensics), and that the only foreseeable way to solve the existing issues is through stricter methodology. (Just as with McDonald’s, there are enforced standards so that the product is consistent wherever one orders.) More on this later.
So it doesn’t seem to be in the interest of things like due process to abandon science or completely separate it from state power. (It does, however, make sense to move forensic laboratories out from under direct administrative control, as the NAS report notes in Recommendation 4. This is, however, specifically to reduce bias.) In a culture where science is viewed as irrational, Eurocentric, ad hoc, and polluted with ideological motivations — or where Reason itself is seen as a particular hegemonic, imperial device to suppress different cultures — not only do we not know what to do, when we try to do things we lose elements of our civilization that everyone agrees are valuable.
Although Aristotle separated pathos, ethos and logos (adding that all informed each other), later philosophers like Feyerabend thought of reason as a sort of “practice,” with history and connotations like any other human activity, falling far short of sublime. One could no more justify reason outside of its European cosmology than the sacrificial rituals of the Aztecs outside of theirs. To communicate across paradigms, participants have to understand each other on a deep level, even becoming entirely new persons. When debates happen, they must happen on a principle of mutual respect and curiosity.
From this one can detect a bold argument for tolerance. Indeed, Feyerabend was heavily influenced by John Stuart Mill’s On Liberty. Maybe, in a world disillusioned with scientism and objective standards, the next cultural move is multilateral acceptance and tolerance for each other’s ideas.
This has not been the result of postmodern revelations, though. The 2016 election featured the victory of one psychopath over another, from two camps utterly consumed with vitriol for each other. Between Bernie Sanders, Donald Trump and Hillary Clinton, Americans drifted toward radicalization as the only establishment candidate seemed to offer the same noxious, warmongering mess of the previous few decades of administration. Politics has only polarized further since the inauguration. The alt-right, a nearly perfect symbol of cultural intolerance, is regular news for mainstream media. Trump acolytes physically brawl with black bloc Antifa in the same city of the 1960s Free Speech Movement. It seems to be the worst at universities. Analytic feminist philosophers asked for the retraction of a controversial paper, seemingly without reading it. Professors even get involved in student disputes, at Berkeley and more recently Evergreen. The names each side uses to attack each other (“fascist,” most prominently) — sometimes accurate, usually not — display a political divide with groups that increasingly refuse to argue their own side and prefer silencing their opposition.
There is not a tolerant left or tolerant right any longer, in the mainstream. We are witnessing only shades of authoritarianism, eager to destroy each other. And what is obvious is that the theories and tools of the postmodernists (post-structuralism, social constructivism, deconstruction, critical theory, relativism) are as useful for reactionary praxis as their usual role in left-wing circles. Says Casey Williams in the New York Times: “Trump’s playbook should be familiar to any student of critical theory and philosophy. It often feels like Trump has stolen our ideas and weaponized them.” The idea of the “post-truth” world originated in postmodern academia. It is the monster turning against Doctor Frankenstein.
Moral (cultural) relativism in particular promises only the rejection of our shared humanity. It paralyzes our judgment on female genital mutilation, flogging, stoning, human and animal sacrifice, honor killing, caste, the underground sex trade. The afterbirth of Protagoras, cruelly resurrected once again, does not promise trials at Nuremberg, where the Allied powers appealed to something above and beyond written law to exact judgment on mass murderers. It does not promise justice for the ethnic cleansers in Srebrenica, as the United Nations is helpless to impose a tribunal from outside Bosnia-Herzegovina. Today, this moral pessimism laughs at the phrase “humanitarian crisis,” and at Western efforts to change the material conditions of fleeing Iraqis, Afghans, Libyans, Syrians, Venezuelans, North Koreans…
In the absence of universal morality, and the introduction of subjective reality, the vacuum will be filled with something much more awful. And we should be afraid of this because tolerance has not emerged as a replacement. When Harry Potter first encounters Voldemort face-to-scalp, the Dark Lord tells the boy “There is no good and evil. There is only power… and those too weak to seek it.” With the breakdown of concrete moral categories, Feyerabend’s motto — anything goes — is perverted. Voldemort has been compared to Plato’s archetype of the tyrant from the Republic: “It will commit any foul murder, and there is no food it refuses to eat. In a word, it omits no act of folly or shamelessness” … “he is purged of self-discipline and is filled with self-imposed madness.”
Voldemort is the Platonic appetite in the same way he is the psychoanalytic id. Freud’s das Es is able to admit of contradictions, to violate Aristotle’s fundamental laws of logic. It is so base, and removed from the ordinary world of reason, that it follows its own rules we would find utterly abhorrent or impossible. But it is not difficult to imagine that the murder of evidence-based reasoning will result in Death Eater politics. The ego is our rational faculty, adapted to deal with reality; with the death of reason, all that exists is vicious criticism and unfettered libertinism.
Plato predicts Voldemort with the image of the tyrant, and also with one of his primary interlocutors, Thrasymachus, when the sophist opens with “justice is nothing other than the advantage of the stronger.” The one thing Voldemort admires about The Boy Who Lived is his bravery, the trait they share in common. This trait is missing in his Death Eaters. In the fourth novel the Dark Lord is cruel to his reunited followers for abandoning him and losing faith; their cowardice reveals the fundamental logic of his power: his disciples are not true devotees, but opportunists, weak on their own merit and drawn like moths to every Avada Kedavra. Likewise students flock to postmodern relativism to justify their own beliefs when the evidence is an obstacle.
Relativism gives us moral paralysis, allowing in darkness. Another possible move after relativism is supremacy. One look at Richard Spencer’s Twitter demonstrates the incorrigible tenet of the alt-right: the alleged incompatibility of cultures, ethnicities, races: that different groups of humans simply can not get along together. The Final Solution is not about extermination anymore but segregated nationalism. Spencer’s audience is almost entirely men who loathe the current state of things, who share far-reaching conspiracy theories, and despise globalism.
The left, too, creates conspiracies, imagining a bourgeois corporate conglomerate that enlists economists and brainwashes through history books to normalize capitalism; for this reason they despise globalism as well, saying it impoverishes other countries or destroys cultural autonomy. For the alt-right, it is the Jews, and George Soros, who control us; for the burgeoning socialist left, it is the elites, the one-percent. Our minds are not free; fortunately, they will happily supply Übermenschen, in the form of statesmen or critical theorists, to save us from our degeneracy or our false consciousness.
Without a commitment to reasoned debate, tribalism has only deepened the polarization and the lack of humility. Each side also accepts science selectively, if it does not question science’s very justification. The privileged status that the “scientific method” maintains in polite society is denied when convenient; whether it is climate science, evolutionary psychology, sociology, genetics, biology, anatomy or, especially, economics: one side or the other outright rejects it, without studying the material enough to immerse itself in what could be promising knowledge (as Feyerabend urged, and as the breakdown of rationality could have encouraged). And ultimately, equal protection, one tenet of individualist thought that allows for multiplicity, is entirely rejected by both: we should be treated differently as humans, often because of the color of our skin.
Relativism and carelessness about standards and communication have given us supremacy and tribalism. They have divided rather than united. Voldemort’s chaotic violence is one possible outcome of rejecting reason as an institution, and it beckons to either political alliance. Are there any examples in Harry Potter of the alternative, Feyerabendian tolerance? Not quite. However, Hermione Granger serves as the Dark Lord’s foil, and gives us a model of reason that is not as archaic as the enemies of rationality would like to suggest. In Against Method (1975), Feyerabend compares different ways rationality has been interpreted alongside practice: in an idealist way, in which reason “completely governs” research, or a naturalist way, in which reason is “completely determined by” research. Taking elements of each, he arrives at an intersection in which one can change the other, both “parts of a single dialectical process.”
“The suggestion can be illustrated by the relation between a map and the adventures of a person using it or by the relation between an artisan and his instruments. Originally maps were constructed as images of and guides to reality and so, presumably, was reason. But maps, like reason, contain idealizations (Hecataeus of Miletus, for examples, imposed the general outlines of Anaximander’s cosmology on his account of the occupied world and represented continents by geometrical figures). The wanderer uses the map to find his way but he also corrects it as he proceeds, removing old idealizations and introducing new ones. Using the map no matter what will soon get him into trouble. But it is better to have maps than to proceed without them. In the same way, the example says, reason without the guidance of a practice will lead us astray while a practice is vastly improved by the addition of reason.” p. 233
Christopher Hitchens pointed out that Granger sounds like Bertrand Russell at times, as in this quote about the Resurrection Stone: “You can claim that anything is real if the only basis for believing in it is that nobody has proven it doesn’t exist.” Granger is often the embodiment of anemic analytic philosophy, the institution of order, a disciple for the Ministry of Magic. However, though initially law-abiding, she quickly learns with Potter and Weasley the pleasures of rule-breaking. From the first book onward, she is constantly at odds with the de facto norms of the school, becoming more rebellious as time goes on. It is her levelheaded foundation, combined with her ability to transgress rules, that gives her an astute semi-deontological, semi-utilitarian calculus capable of saving the lives of her friends from the dark arts, and helping to defeat the tyranny of Voldemort foretold by Socrates.
Granger presents a model of reason like Feyerabend’s map analogy. Although pure reason gives us an outline of how to think about things, it is not a static or complete blueprint, and it must be fleshed out with experience, risk-taking, discovery, failure, loss, trauma, pleasure, offense, criticism, and occasional transgressions past the foreseeable limits. Adding these addenda to our heuristics means that we explore a more diverse account of thinking about things and moving around in the world.
When reason is increasingly seen as patriarchal, Western, and imperialist, the only thing consistently offered as a replacement is something like lived experience. Some form of this idea is at least a century old, going back to Husserl, and still modest by reason’s Greco-Roman standards. Yet lived experience has always been pivotal to reason; we only need to adjust our popular model. And we can see that we need not reject one or the other entirely. Another critique of reason says it is foolhardy, limiting, antiquated; this is a perversion of its abilities, and plays to justify the first criticism. We can see that there is room within reason for other pursuits and virtues, picked up along the way.
The emphasis on lived experience, which predominantly comes from the political left, is also antithetical to the cause of “social progress.” Those sympathetic to social theory, particularly the cultural leakage of the strong programme, are constantly torn between claiming (a) that science is irrational, and can thus be countered by lived experience (or whatnot), or (b) that science may be rational but reason itself is a tool of patriarchy and white supremacy and cannot be universal. (If you haven’t seen either of these claims very frequently, and think them a strawman, you have not been following university protests and editorials. Or radical Twitter: ex., ex., ex., ex.) Of course, as in Freud, this is an example of kettle-logic: the signal of a very strong resistance. We see, though, that we need not accept or deny these claims and lose anything. Reason need not be stagnant nor all-pervasive, and indeed we’ve been critiquing its limits since 1781.
Outright denying the process of science — whether the model is conjectures and refutations or something less stale — ignores that there is no single uniform body of science. Denial also dismisses the most powerful tool for making difficult empirical decisions. Michael Brown’s death was instantly a political affair, with implications for broader social life. The event has completely changed the face of American social issues. The first autopsy report, from St. Louis County, indicated that Brown was shot at close range in the hand, during an encounter with Officer Darren Wilson. The second independent report commissioned by the family concluded the first shot had not in fact been at close range. After the disagreement with my cousin, the Department of Justice released the final investigation report, and determined that material in the hand wound was consistent with gun residue from an up-close encounter.
Prior to the report, the best evidence available as to what happened in Missouri on August 9, 2014, was the ground footage after the shooting and testimonies from the officer and Ferguson residents at the scene. There are two ways to approach the incident: reason or lived experience. The latter route will lead to ambiguities. Brown’s friend Dorian Johnson and another witness reported that Officer Wilson fired his weapon first at range, under no threat, then pursued Brown out of his vehicle, until Brown turned with his hands in the air to surrender. However, in the St. Louis grand jury half a dozen (African-American) eyewitnesses corroborated Wilson’s account: that Brown did not have his hands raised and was moving toward Wilson. In which direction does “lived experience” tell us to go, then? A new moral maxim — the duty to believe people — will lead to no non-arbitrary conclusion. (And a duty to “always believe x,” where x is a closed group, e.g. victims, will put the cart before the horse.) It appears that, in a case like this, treating evidence as objective is the only solution.
Introducing ad hoc hypotheses, e.g., the Justice Department and the county examiner are corrupt, shifts the approach into one that uses induction, and leaves behind lived experience (and also ignores how forensic anthropology is actually done). This is the introduction of, indeed, scientific standards. (By looking at incentives for lying it might also employ findings from public choice theory, psychology, behavioral economics, etc.) So the personal experience method creates unresolvable ambiguities, and presumably will eventually grant some allowance to scientific procedure.
If we don’t posit a baseline-rationality — Hermione Granger pre-Hogwarts — our ability to critique things at all disappears. Utterly rejecting science and reason, denying objective analysis in the presumption of overriding biases, breaking down naïve universalism into naïve relativism — these are paths to paralysis on their own. More than that, they are hysterical symptoms, because they often create problems out of thin air. Recently, a philosopher and mathematician submitted a hoax paper, Sokal-style, to a peer-reviewed gender studies journal in an attempt to demonstrate what they see as a problem “at the heart of academic fields like gender studies.” The idea was to write a nonsensical, postmodernish essay, and if the journal accepted it, that would indicate the field is intellectually bankrupt. Andrew Smart at Psychology Today instead wrote of the prank: “In many ways this academic hoax validates many of postmodernism’s main arguments.” And although Smart makes some informed points about problems in scientific rigor as a whole, he doesn’t hint at what the validation of postmodernism entails: should we abandon standards in journalism and scholarly integrity? Is the whole process of peer-review functionally untenable? Should we start embracing papers written without any intention of making sense, to look at knowledge concealed below the surface of jargon? The paper, “The conceptual penis,” doesn’t necessarily condemn the whole of gender studies; but, against Smart’s reasoning, we do in fact know that counterintuitive or highly heterodox theory is considered perfectly average.
There were other attacks on the hoax, from Slate, Salon and elsewhere. The criticisms, often valid for the particular essay, typically didn’t move the conversation far enough. There is much more to this discussion. A 2006 paper from the International Journal of Evidence Based Healthcare, “Deconstructing the evidence-based discourse in health sciences,” called the use of scientific evidence “fascist.” In the abstract the authors state their allegiance to the work of Deleuze and Guattari. Real Peer Review, a Twitter account that collects abstracts from scholarly articles, regularly features essays from departments of women’s and gender studies, including a recent one from a Ph.D. student wherein the author identifies as a hippopotamus. Sure, the recent hoax paper doesn’t really say anything, but it intensifies this much-needed debate. It brings out these two currents — reason and the rejection of reason — and demands a solution. And we know that lived experience is often going to be inconclusive.
Opening up lines of communication is a solution. One valid complaint is that gender studies seems too insulated, in a way in which chemistry, for instance, is not. Critiquing a whole field does ask us to genuinely immerse ourselves first, and this is a step toward tolerance: it is a step past the death of reason and the denial of science. It is a step that requires opening the bubble.
The modern infatuation with human biases, as well as Feyerabend’s epistemological anarchism, upsets our faith in prevailing theories and in the idea that our policies and opinions should be guided by the latest discoveries from an anonymous laboratory. Putting politics first and assuming subjectivity is all-encompassing, we move past objective measures to compare belief systems and theories. However, isn’t the whole operation of modern science designed to work within our means? Kant’s critical system set limits on human rationality, and most science is aligned with an acceptance of fallibility. As Harvard cognitive scientist Steven Pinker says, “to understand the world, we must cultivate work-arounds for our cognitive limitations, including skepticism, open debate, formal precision, and empirical tests, often requiring feats of ingenuity.”
Pinker goes so far as to advocate for scientism. Others need not; but we must understand an academic field before utterly rejecting it. We must think we can understand each other, and live with each other. We must think there is a baseline framework that allows permanent cross-cultural correspondence — a shared form of life which means a Ukrainian can interpret a Russian and a Cuban an American. The rejection of Homo sapiens commensurability, championed by people like Richard Spencer and those in identity politics, is a path to segregation and supremacy. We must reject Gorgian nihilism about communication, and the Presocratic relativism that camps our moral judgments in inert subjectivity. From one Weltanschauung to the next, our common humanity — which endures class, ethnicity, sex, gender — allows open debate across paradigms.
In the face of relativism, there is room for a nuanced middleground between Pinker’s scientism and the rising anti-science, anti-reason philosophy; Paul Feyerabend has sketched out a basic blueprint. Rather than condemning reason as a Hellenic germ of Western cultural supremacy, we need only adjust the theoretical model to incorporate the “new America of knowledge” into our critical faculty. It is the raison d’être of philosophers to present complicated things in a more digestible form; to “put everything before us,” so says Wittgenstein. Hopefully, people can reach their own conclusions, and embrace the communal human spirit as they do.
However, this may not be so convincing. It might be true that we have a competition of cosmologies: one that believes in reason and objectivity, and one that thinks reason is callow and all things are subjective. These two perspectives may well be incommensurable. If I try to defend reason, I invariably must appeal to reasons, and thus argue circularly. If I try to claim “everything is subjective,” I make a universal statement, and simultaneously contradict myself. Between begging the question and contradicting oneself, there is not much indication of where to go. Perhaps we just have to look at history, note the results of either course when it has been applied, and take that as a rhetorical indication of which path to choose.
The existentialist origins of postmodernism
In part, postmodernism has its origin in the existentialism of the 19th and 20th centuries. The Danish theologian and philosopher Søren Kierkegaard (1813-1855) is generally regarded as the first existentialist. Kierkegaard’s life was profoundly marked by the breaking of an engagement and by his discomfort with the formalities of the (Lutheran) Church of Denmark. In his understanding (shared by others of the time within a movement known as Pietism, influential mainly in Germany but also a strong influence on the English Methodism of John Wesley), Lutheran theology had become overly intellectual, marked by a “Protestant scholasticism.”
Before this period, Scholasticism was a branch of Catholic theology whose main representative was Thomas Aquinas (1225-1274). Aquinas argued against the theory of double truth, defended by Muslim theologians of his time. According to this theory, something could be true in religion and yet not be true in the empirical sciences. Aquinas defended a classic concept of truth, used centuries earlier by Augustine of Hippo (354-430), to affirm that truth could not be so divided. Martin Luther (1483-1546) made many criticisms of Thomas Aquinas, but ironically the methodological precision of the medieval theologian became quite influential in the Lutheran theology of the 17th and 18th centuries. In Germany and the Nordic countries (Denmark, Finland, Iceland, Norway and Sweden) Lutheranism became the state religion after the Protestant Reformation of the 16th century, and being the pastor of a church in a major city became a respected and coveted public office.
It is against this intellectualism and this ease of being a Christian that Kierkegaard revolts. In 19th-century Denmark, everyone was born within the Lutheran Church, and being a Christian was the socially accepted position. Kierkegaard complained that in centuries past being a Christian was not easy, and could even involve life-threatening events. In the face of this he argued for a Christianity that involved an individual decision against all evidence. In one of his most famous texts he expounds on the story in which the patriarch Abraham is asked by God to kill Isaac, his only son. Kierkegaard imagines a scenario in which Abraham does not understand the reasons of God, but ends up obeying blindly. In Kierkegaard’s words, Abraham takes “a leap of faith.”
This concept of blind faith, going against all the evidence, is central to Kierkegaard’s thinking, and became very influential in twentieth-century Christianity and even in other Western-established religions. Beyond the strictly religious aspect, Kierkegaard marked Western thought with the notion that some things might be true in some areas of knowledge but not in others. Moreover, his influence can be seen in the notion that the individual must make decisions about how he intends to exist, regardless of the rules of society or of all empirical evidence.
Another important existentialist philosopher of the 19th century was the German Friedrich Nietzsche (1844-1900). Like Kierkegaard, Nietzsche was also raised within Lutheranism but, unlike Kierkegaard, he became an atheist in his adult life. Like Kierkegaard, Nietzsche also became a critic of the social conventions of his time, especially the religious conventions. Nietzsche is particularly famous for the phrase “God is dead.” This phrase appears in one of his most famous texts, in which the Christian God attends a meeting with the other gods and affirms that he is the only god. In the face of this statement the other gods die laughing. The Christian God effectively becomes the only god. But later, the Christian God dies of pity upon seeing his followers on earth become people without courage.
Nietzsche was particularly critical of how the Christianity of his day valued features which he considered weak, calling them virtues, and condemned features he considered strong, calling them vices. And not just Christianity: Nietzsche also criticized the classical philosophy of Socrates, Plato, and Aristotle, placing himself alongside the sophists. The German philosopher affirmed that Socrates valued behaviors like kindness, humility, and generosity simply because he was ugly. More specifically, Nietzsche questioned why classical philosophers defended Apollo, considered the god of wisdom, and criticized Dionysus, considered the god of debauchery. In Greco-Roman mythology Dionysus (or Bacchus, as he was known by the Romans) was the god of festivals, wine, and ritual madness, symbolizing everything that is chaotic, dangerous, and unexpected. Thus, Nietzsche questioned the apparent arbitrariness of the defense of Apollo’s rationality and order against the irrationality and unpredictability of Dionysus.
Nietzsche’s philosophy values courage and voluntarism, the urge to go against “herd behavior” and become a “superman,” that is, a person who goes against the dictates of society to create his own rules. Although he went in a different religious direction from Kierkegaard, Nietzsche agreed with the Danish theologian on the need for the individual to go against convention, and against reason, in dictating the rules of his own existence.
In the mid-20th century existentialism became an influential philosophical current, represented by people like Jean-Paul Sartre (1905-1980) and Albert Camus (1913-1960). Like their 19th-century predecessors, these existentialists emphasized the apparent absurdity of life and valued decision-making by the individual against rational and social dictates.
Dear Greeks:
I hear you can’t pay your debts again. I am a little sorry but you brought it on yourselves. A few reminders.
Your country is a democracy. The way you got into this pickle is through the stupid, self-indulgent policies of those you elected. You did it again in your last election by bringing to power a bragging leftist party in the old Stalinist mold. What did you think they would do: frighten the European Union, the International Monetary Fund (whose number-one stockholder is the US), Germany, the world, into submission, into erasing your debt? Think!
The reason Germany is your principal creditor is that one of your previous governments begged Germany for help and it agreed to help. The Germans did not cram loan after loan down your throat; you asked. The big sillies thought you would be honorable and pay up as agreed. Do you care about your future reputation, your honor, your children’s future ability to walk in the world with their heads up? Here is a basic rule of politeness which is also a moral rule: When somebody gives you a hand, you don’t bite it viciously.
There are several reasons your government can’t pay its debts. One reason is that your political class is corrupt through and through. Another is that you are reluctant to pay taxes the way normal people do in the European Union. Too many Greeks want to work, or pretend-work, for the government instead of doing real work. And your government still owns things no government anywhere should ever own (resorts, among others) because governments always make a mess of running them.
Another reason why your government can’t pay its bills is that your country is genuinely poor for a European country. There too, you have a lot of explaining to do. For one thing, you have been living above your means for a long time, pretending you were more or less like Danes, or Germans. Well, the truth is that you are not, not even close; Danes and Germans are very productive; you are not. So, you should not ever have expected to work short weeks and take long summer vacations, like Danes and Germans. Such privileges do not come automatically with membership in the Union, you know. You should look across the border at your despised neighbors, the Turks, instead. They don’t pretend to themselves that they are already rich; they go to work early and they close their shops late. Many of them work six days a week. Over the past ten years, the growth rate of their economy has left yours in the dust. Coincidence?
And you only make yourself even more scorned with your treatment of others. The real horrors that Nazi Germany inflicted on Greece more than 70 years ago are not much of an excuse anymore. A previous government of yours, an elected government, accepted reparations a long time ago. And, by the way, in 1945, Germany was much more devastated than Greece, and still in 1948. See where the Germans are now, and where you are? Any comment?
And do you ever wonder why the Estonians, stuck in the stultifying Soviet prison for fifty years, never ask for new loans to pay back older loans? And how long, anyway, did you expect German workers to work until age 69 so your public servants could continue to retire at 63? Are you out of your minds?
One last thing: You are not exactly Classical Greece. Stop wrapping yourselves in Aristotle’s toga. Really study Socrates. He chose to die rather than cheat even a little. Neither he nor Aristotle was a whiner. That’s why they are still remembered and honored.
In the end, I wish you well. Everyone can unlearn bad habits and learn basic rationality, even late in life. I hope you soon leave that club where you don’t belong. I hope further that you can make your way back. Begin by getting up at 6 every morning. Also, learn the obvious: socialism does not work well for rich countries; it’s miserable for poor countries.
Why Republican Libertarianism? III
(This text was written for the European Students for Liberty Regional Conference in Istanbul at Boğaziçi University. I did not deliver the paper, but used it to gather thoughts which I then presented in an improvised speech. As it was quite a long text, I am breaking it up for the purposes of blog presentation)
There is a gap between ancient Athens and classical liberalism, and covering that gap will explain more about the development from antique republics to modern liberty. The trio of major antique republican thinkers mentioned above, Aristotle, Polybius, and Cicero, sets up the tradition. They establish the idea of the best state – polity/politeia in Greek, republic/res publica in Latin – as one of sharing political power between groups in the context of shared citizenship and decision making.
For Aristotle, that is the sharing of power between oligarchs (the rich, in practice those wealthy through commerce), aristocrats (the virtuous, in practice the educated landowning classes), and the poor majority. Polybius was a later Greek thinker who admired the Roman republic, and Cicero was a Roman aristocrat-philosopher from the last years before the republic gave way to the system of one-man emperor rule.
Both use arguments from Aristotle but tend to refer to Sparta rather than Athens as the ideal republic, which indicates the difficulty antique thought had in accepting a commercial and free-thinking republic as a model. Polybius and Cicero both admire the Roman system because they see it as based on law and on sharing power between the people (citizens’ assembly), the aristocracy (senate), and a monarchical function shared between two year-long co-rulers (the consuls).
Their arguments also rest on the idea of the state as a military camp. It is interesting to note that Pettit, the egalitarian liberal, prefers this Roman model to Athens, while Arendt prefers the Athenian model. This suggests that Arendt has something to say to classical liberals and libertarians, though she is rarely taken up within that group, and that egalitarian liberalism is rather caught up in strong-state ideas: a state strong enough to force the redistribution of economic goods rather than to impose an extreme military spirit on its citizens, but a strong, intervening state nonetheless.
All three of the ancient republican thinkers had difficulty with the idea of a commercially orientated republic and held some idea of virtue as restraining wealth, though Cicero in particular was staggeringly rich. This suggests that ancient republican thought had more difficulty accommodating the commercial spirit than some ancient republics did in practice.
There is one major step left in ancient republican thinking: the account that the senator-historian Tacitus, writing in the early period of the Roman Emperors, gives of liberty in the simple tribal republics of the ancient Germans and Britons. He sees them as based on an independence of spirit, and a willingness to die for that independence, largely lacking amongst the Romans of his time.
The admiration for such ‘barbarian’ liberty also gives some insight into the difficulty of combining the commercial spirit with republicanism in ancient thinking. Wealth is seen as something tied to benefits from the state, to state patronage, and so as reducing independence from the state, whether the local state or a foreign invading one.
Republicanism takes the next great step forward when some way of thinking of wealth as existing at least partly independently of state patronage appears. This is what happens in northern Italy from about the thirteenth century. To some degree this Italian republicanism has older roots in the maritime republic of Venice, but there the trading wealth was still very much tied up with aristocratic status and a rigid aristocratic hold on politics.
It is Florence, which served as a thirteenth-, fourteenth-, and fifteenth-century Athens, where Italian culture, commercial wealth, and republican thinking all thrived. The cultural greatness goes back to the poet Dante and the republicanism to his tutor Brunetto Latini. The really great moment in Florentine republicanism comes in the fifteenth and early sixteenth centuries, though, with Francesco Guicciardini, but mostly with Niccolò Machiavelli.
Commentary on Machiavelli is heavily burdened by the image of Evil Machiavel or at least of Machiavelli the cynical advocate of power politics in The Prince. This is just a completely false image of a man whose ideal was the revival of the Roman republic, not the rule of absolute and absolutely immoral princes.
The supposed wickedness and cynicism of The Prince relate to its comments on how kings seize and maintain power, in which, insofar as Machiavelli advocates rather than analyses, he advocates only minor acts of political violence. The age of Machiavelli was the age of the Catholic Inquisition torturing heretics and passing them to the state to be burned at the stake, of the mass persecution and expulsion of Iberian Jews and Muslims, and of wars of religion and conquest that involved systematic and mass destruction of property, torture, rape, and murder.
Those who chose to condemn the ‘wickedness’ of Machiavelli at the time were often those engaged in such activities. Machiavelli’s advice to princes does no more than advocate, at the most extreme, very limited amounts of violence to institute and maintain rule, certainly very limited by the standards of the time.
Expanding the Liberty Canon: Aristotle
Apparently some people have enjoyed the posts on ‘Another Liberty Canon’, so I will keep going on that tack, but with a revision to the heading, as I’ll be covering some thinkers already accepted into the liberty canon, or at least into some of the various canons. I’ll continue to discuss what I think should be brought into the canon, and push the boundaries a bit on those already generally accepted. I’ll be giving coverage to major figures, with regard to their work as a whole, but at some point I’ll start doing some relatively detailed readings of individual classic works.
I’ll start at the beginning, more or less, with Aristotle. I’m sure there are texts and thinkers within the Greek tradition, and certainly in the Near East, southern and eastern Asia, and so on, worthy of attention, but for substantial books clearly devoted to the nature of politics, and which have a focus on some idea of liberty, Aristotle seems as good a place as any to start.
There is maybe a case for starting with Aristotle’s teacher Plato, or even with Plato’s teacher Socrates. I think Plato should be rescued from the persistent image, never popular with Plato scholars, of a forerunner of twentieth-century totalitarianism. Just to start the counter-arguments: Plato’s arguments call for a reinforcement, albeit radical and selective, of existing customs rather than the imposition of a new state-imposed ideology, and they certainly do not suggest that arbitrary state power should rise above law.
However, on the liberty side, Plato’s teacher Socrates was, in his own lifestyle, a promoter of a kind of individualist strength and critical spirit, and he fell foul of public hysteria. We know very little about Socrates apart from the ways Plato represents him, but the evidence suggests that Socrates’ particular critical, individualistic attitude was concerned more with a kind of absolutism about correct customs, laws, and philosophical claims than with what we would now recognise as a critical individualistic attitude.
It looks like Socrates was an advocate of the laws and constitutions of Greek states, like Sparta, that were less respectful of individuality, liberty, and innovation than Athens was. Though Aristotle does not look like the ideal advocate of liberty by our standards, he was critical of Plato (often referring to him through Socrates, though it looks like he was reacting to Plato’s texts rather than to any acquaintance with Socratic views different from those mentioned by Plato) for subordinating the individual to the state and abandoning private property. He was presumably referring to the Republic, which does seem to suggest that for Plato, ideally, the ruling class of philosopher-guardians should not own property, and that the lower classes, composed of all those who accumulate money through physical effort, a special craft, or trade, should be completely guided by those guardians.
It is not clear that Plato ever meant the imaginary ideal state of the Republic to be implemented, but it is clear that it reflects the preference Plato had for what he saw as the changeless, pious hierarchies and laws of the Greek states of Crete and Sparta, and of the already ancient kingdom of Egypt, in which power goes to those who have, at least superficially, detached themselves from the world of material gain in some military, political, or religious devotion to an apparently higher common good.
Plato and (maybe) Socrates had some difficulties in accepting the benefits of the liberties and democracy associated with fifth and fourth century BCE Athens that fostered commercial life, great art, great literature, and great philosophy. I will discuss the explanation and promotion of the values of Athens in a future post on the most distinguished leader of democratic Athens, Pericles, so I will not say more about it here.
Aristotle (384-322 BCE) came from outside Athens; he was born in monarchist Macedon, which lacked the republican institutions of participatory government found in the city states of Greece. Aristotle’s family was linked with the monarchy that turned Macedon into the hegemon of Greece, destroying the autonomy of Athens and the other republics. Aristotle even spent time as the tutor of Alexander the Great, who turned the Macedonian-Greek monarchical state into an empire stretching to India and Libya.
Aristotle was, however, not an advocate of such empires; he had already studied with Plato in Athens, where he acquired a preference for the self-governing, participatory city-state model of politics. His links with the Macedonian state sometimes made it difficult to spend time in Athens, where most resented the domination from the north, so he spent time in Anatolia (apparently marrying the daughter of a west Anatolian king), on the Aegean islands, and on the island of Euboea off Athens, where he died.
Despite these difficulties, Aristotle was so much in favour of the values of republican Athens that he even endorsed the idea that foreigners, or those born of one foreign parent, could not be citizens, for fear of diluting the solidarity and friendship between citizens. This issue brings us to the ways in which Aristotle does not appeal to the best modern ideas of liberty. He was attached to the idea of a self-enclosed citizen body, along with slavery, the secondary status of women, the inferior nature of non-Greeks, restrictions on commerce, and the inferiority of those who labour for a living or create new wealth.
Nevertheless, given the times he lived in, his attitudes were no worse than you would expect and often better. Despite his disdain for non-Greeks, he recognised that the north African city of Carthage had institutions of political freedom worth examining. His teacher Plato was perhaps better on one issue, the education of women, which appears to have held no interest for Aristotle.
Still, unlike Plato, he did not imagine a ‘perfect’ city state where everything he found distasteful had been abolished, and he did not dream of excluding at least free-born males from the government of their own community. Aristotle disdained labourers as people close to slavery in their dependence on unskilled work to survive, but assumed that such people would be part of a citizens’ assembly in any state where there was freedom.
His ideal was the law-following, virtuous king, and then a law-following, virtuous aristocracy (that is, those who inherited wealth), but even where the government was dominated by a king or an aristocracy, he thought the people as a whole would play some part in the system, and that state power would still rest on the wishes of the majority.
All Greeks deserved to live with freedom, which for Aristotle meant a state where laws (which he thought of as mainly customary, reflecting the realities of ancient Greece) restrained rulers, and where rulers had the welfare of all free members of the community as the object of government. In this way rulers developed friendship with the ruled, an aspect of virtue, which for Aristotle is the same as the happy life, and of justice.
Friendship is justice according to Aristotle in its more concrete aspects, and ideally would replace the more formal parts of justice. Nevertheless Aristotle did discuss justice in its more formal aspects with regard to recompense for harms and distribution of both political power and wealth.
Like just about every writer in the ancient world, Aristotle found the pursuit of unlimited wealth, or of any wealth beyond the minimum needed to sustain aristocratic status, discomforting, and that applies even to writers who were very rich. Given that widespread assumption, Aristotle makes as much allowance for exchange and trade as is possible, and recognises the benefits of moving from a life of mere survival in pre-city societies to the material development possible in a larger community where trade was possible under common rules of justice.
As mentioned, Aristotle preferred aristocratic or monarchical government, but, as also mentioned, he assumed that any government of free individuals would include some form of broad citizen participation. We should therefore be careful about interpreting his criticisms of democracy, which have little to do with modern representative democracy, but are directed at states where he thought citizens’ assemblies had become so strong, and the very temporary opinions of the majority so powerful, that the rule of law had broken down. He still found this preferable to rule by one person or by a group lacking in virtue, which he called tyranny and oligarchy.
He suggested that the most durable form of government for free people was something he just called a ‘state’ (politeia), so indicating its dominant normality, where the people between the rich and the poor dominated political office, and the democratic element was very strong, though with some place for aristocratic influence. It is a way of thinking about as close as possible to modern ideas of the division or separation of powers in a representative political system, given the historical differences, most obviously the assumption of citizens’ assemblies in very small cities, rather than elections for national assemblies, as the central form of political participation.
Relevant texts by Aristotle
There is no clear distinction between politics and ethics in Aristotle, so his major text in each area should be studied, that is, the Politics and the Nicomachean Ethics. Other relevant texts include the Poetics (which discusses the role of kings in tragedy), the Constitution of Athens, the Eudemian Ethics, and the Rhetoric (the art of speaking was central to political life in the ancient world). Aristotle of course wrote numerous other books on various aspects of philosophy and science.
Investment & Prudence
To be prudent amounts to making sure that one takes good care of oneself in all important areas of one’s life. Health, wealth, family, friendship, understanding, etc. are all in need of good care so that one will achieve and sustain one’s development as a human individual. It all begins with following the edict: “Know thyself!”
All those folks who make an effort to keep fit and to eat properly are embarking on elements of a prudent life. Unfortunately, the virtue of prudence has been undermined by the idea that everyone automatically or instinctively pursues his or her self-interest.
We all know the rhetorical question, “Isn’t everyone selfish?” Because of certain philosophical and related doctrines, the answer has mainly been that everyone is. In the discipline of economics especially, scholars nearly uniformly hold the view that we all do whatever we do so as to please ourselves, to feel good. No room exists there for pure generosity or charity, for altruism, nor for carelessness or recklessness, because in the final analysis everyone is driven to act to further his or her own wellbeing. If people do not achieve the goal of self-enhancement, it is primarily out of ignorance – they just don’t know what is in their best interest, but they all intend to achieve it. Even when they appear to be acting generously, charitably, helpfully, and so on, in the end they do so because it gives them satisfaction, fulfills their own desires, and serves their idea of what is best for them.
This is not prudence but what some have dubbed animal spirit. People are simply driven or motivated to be this way, instinctively, if you will. The virtue of prudence would operate quite differently.
One who practices it would be expected to make a choice to pursue what is in one’s best interest, and one could also fail to do so. Practicing prudence is optional, not innately produced. Like other moral virtues, prudence requires choice. It is not automatic by any means. The reason it is thought to be so, however, has to do with the intellectual-philosophical belief that human conduct is exactly like the behavior of non-human beings, driven by the laws of motion!
Once this idea assumes prominence, there is no concern about people having to be prudent. They will always be, as a matter of their innate nature. What may indeed be needed is the opposite, social and peer pressure to be benevolent or kind, to adhere to the dictates of altruism, something that requires discipline and education and does not come naturally to people.
It would seem, however, that this idea that we are automatically selfish or self-interested or prudent doesn’t square with experience. Consider just how much self-destructiveness there is in the human world, how many projects end up hurting the very people who embark upon them. Can all that be explained by ignorance and error?
Or could it be, rather, that many, many human beings do not set out to benefit themselves, to pursue their self-interest? Could it be that human beings need to learn that they ought to serve their own wellbeing and that their conduct is often haphazard, unfocused, even outright self-destructive (as, for example, in the case of hard drug consumers, gamblers, romantic dreamers, fantasizers and the lazy)?
It seems that this latter is a distinct possibility if not outright probability. It is a matter of choice whether one is or is not going to be prudent, in other words. And once again, ordinary observation confirms this.
One can witness numerous human beings across the ages and the globe choosing to work to benefit themselves, as when they watch their diets or work out or obtain an education, and many others who do not and, instead, neglect their own best interest. Or, alternatively, they often act mindlessly, thoughtlessly, recklessly, etc.
The contention that they are really trying to advance their self-interest, to benefit themselves, seems to be one that stems from generalizing a prior conviction that everything in nature moves so as to advance forward. This is the idea that came from the philosopher Thomas Hobbes, who learned it from Galileo who took it from classical physics.
Accordingly, acting prudently, in order to advance one’s wellbeing, could be a virtue just as the ancient philosopher Aristotle believed it to be. And when one deals with financial matters, careful investing would qualify as prudence, just as is working out at a gym, watching one’s diet, driving carefully, etc.