Separation of Children: an American Tradition

Many Americans deplore the forced separation of children from their parents when families attempt an unauthorized entry into the USA. The recorded crying of children traumatized by having their parents taken away is terrible to hear for anyone with empathy. Administrations excuse this by claiming that they are only enforcing a legally mandated zero-tolerance policy, that the separation acts as a useful deterrent to immigration, and that the law is ordained by God.

Those opposing this policy claim that this cruel separation is un-American. But in fact, the forced separation of children is an American tradition. Under slavery, prior to the end of the Civil War, children were sold separately from their parents. That practice, too, was presumably defended as law ordained by God.

The separation of children from their parents was also imposed on native American Indians. Children were forcibly removed from their homes and put into boarding schools, the aim being the assimilation of Indians into Euro-American culture. Indian children were not allowed to speak in their native languages. Rather than being un-American, this physical and cultural separation was seen as an Americanization. Canada had a similar program for its Indians.

This separation continued the genocide of Indians: the misery that children felt in their familial and cultural separation was compounded by abusive treatment and a high mortality rate.

Since the current child separation is a continuation of past policies, we can expect similar outcomes: abuse, death, and suicides. Children who feel no hope of ever seeing their parents again, who are confined to small cages, who suffer from boredom, and who constantly hear other children crying could experience substantial illness and even suicide in these detention camps. Such outcomes would at first be covered up, then exposed, and finally denied as “fake news.”

This anti-family policy is supported by many Republicans and conservatives. The conservative claim of supporting “family values” has now been shown to be fake. The real conservative stance is the imposition of traditional European culture and supremacy. Most of the migrants from Central America and Mexico are of native Indian ancestry. Rejecting them and sending them back to their home countries, to be killed by the violence from which they fled, is in accord with the American tradition of European racial supremacy over native American Indians. If those seeking to immigrate were Norwegians, those families would not be split up.

Indeed, those subjected to forced family separation were races that were conquered and regarded as inferior. A large immigration from Mexico and Central America would repopulate the USA with native Indian “blood,” unacceptable to Euro-American supremacists.

Therefore the forced separation of native Indian children from their parents and the rejection of further immigration are as American as one could get.

RCH: the Cherokee Nation and the US Civil War

That’s the topic of my Tuesday column over at RealClearHistory. An excerpt:

Ross was critical of the success of the death warrants against the Treaty Party Men, but the most interesting aspect of the two men’s rivalry was the fact that they used the rule of law to fight their battles. Now, the rule of law in the 19th century meant the use of violence between factions (think here about Tombstone, Ariz., where Wyatt Earp and his friends were U.S. Marshals and the friends of the Clantons were sheriffs), but there was a belief held at the time that violence could only be used by civilized men if the law was on their side. Ross and Watie were both firm believers in this form of rule of law.

Please, read the rest and share it with your friends.

The State in education – Part III: Institutionalization of learning

In The State in education – Part II: social warfare, we looked at the promise of state-sponsored education and its failure, both socially and as a purveyor of knowledge. The next step is to examine the university, especially since higher education is deeply linked to modern society and because the public school system purports to prepare its students for college.

First, though, there should be a little history on higher education in the West for context since Nietzsche assumed that everyone knew it when he made his remarks in Anti-Education. The university as an abstract concept dates to Aristotle and his Peripatetic School. Following his stint as Alexander the Great’s tutor, Aristotle returned to Athens and opened a school at the Lyceum (Λύκειον) Temple. There, for a fee, he provided the young men of Athens with the same education he had given Alexander. On a side note, this is also a beautiful example of capitalist equality: a royal education was available to all in a mutually beneficial exchange; Aristotle made a living, and the Athenians received brains.

The Lyceum was not a degree-granting institution, and only by a man’s knowledge of philosophy, history, literature, language, and debating skills could one tell that he had studied there. A cultural premium on bragging rights soon followed, though, and famous philosophers opening immensely popular schools became de rigueur. By the rise of Roman imperium in the Mediterranean around 250 BC, Hellenic writers included their intellectual pedigrees, i.e. all the famous teachers they had studied with, in their introductions as a credibility passport. The Romans were avid Hellenophiles and adopted everything Greek wholesale, including the concept of the lyceum-university.

Following the Dark Ages (and not getting into the debate over whether the time was truly dark or not), the modern university emerged in 1088, with the Università di Bologna. It was more of a club than an institution; as Robert S. Rait, the early 20th-century medieval historian, remarked in his book Life in the Medieval University, the original meaning of “university” was “association,” and the word was not used exclusively for education. The main attractions of the university as a concept were that it was secular and that it provided access to books, which were prohibitively expensive at the individual level before the printing press. An examination of the profiles of Swedish foreign students enrolled at Leipzig University between 1409 and 1520 shows that the average male student was either destined for the clergy on a prelate track or was of noble extraction. As the article points out, none of the students who later joined the knighthood formally graduated, but the student body is indicative of the associative nature of the university.

The example of Lady Elena Lucrezia Cornaro Piscopia, the first woman to receive a doctoral degree, awarded by the University of Padua in 1678, illuminates the difference between “university” in its original sense and the institutional concept. Cornaro wrote her thesis independently, taking the doctoral exams and defending her work when she and her advisor felt she was ready. Enrollment and class attendance were deemed so unnecessary that she skipped both the bachelor’s and master’s stages. What mattered was that a candidate knew the subject, not the method of acquisition. Even by the mid-19th century, this particular path remained open to remarkable scholars, such as Nietzsche: Leipzig University awarded him his doctorate on the basis of his published articles, rather than a dissertation and defense.

Education’s institutionalization, i.e. the focus shifting from knowledge to “the experience,” accompanied a broader societal shift. Nietzsche noted in Beyond Good and Evil that humans have an inherent need for boundaries, and systemic education played a very prominent role in contemporary man’s processing of that need:

There is an instinct for rank which, more than anything else, is a sign of a high rank; there is a delight in the nuances of reverence that allows us to infer noble origins and habits. The refinement, graciousness, and height of a soul is dangerously tested when something of the first rank passes by without being as yet protected by the shudders of authority against obtrusive efforts and ineptitudes – something that goes its way unmarked, undiscovered, tempting, perhaps capriciously concealed and disguised, like a living touchstone. […] Much is gained once the feeling has finally been cultivated in the masses (among the shallow and in the high-speed intestines of every kind) that they are not to touch everything; that there are holy experiences before which they have to take off their shoes and keep away their unclean hands – this is almost their greatest advance toward humanity. Conversely, perhaps there is nothing about so-called educated people and believers in “modern ideas” that is as nauseous as their lack of modesty and the comfortable insolence in their eyes and hands with which they touch, lick, and finger everything [….] (“What is Noble,” 263)

The idea the philosopher pursued was the notion that university attendance conveyed the future right to “touch, lick, and finger everything,” a very graphic and curmudgeonly way of saying that a certain demographic assumed unjustified airs.

In Anti-Education, Nietzsche lamented the fragmentation of learning into individual disciplines, which caused students to lose a sense of the wholeness, the universality, of knowledge. What he hated in the nouveau educated, if we will, was the rise of the pseudo-expert: a person whose knowledge was confined to the bounds of a fixed field but who was revered as omniscient. The applicability of Socrates’ dialogue with Meno – the one where teacher and student discuss the human tendency to lose sight of the whole in pursuit of individual strands – to the situation was unmistakable, something which Nietzsche, a passionate classicist, noticed. The loss of the Renaissance learning model, the trivium and the quadrivium, both of which emphasized an integrated learning matrix, carried with it a belief that excessive specialization was positive; it was a very perverse version of “jack of all trades, master of none.” As Nietzsche bemoaned, the newly educated desired masters without realizing that all they obtained were jacks. In this, he foreshadowed the disaster of the Versailles Treaty of 1919 and the consequences of Woodrow Wilson’s unwholesome belief in “experts.”

The philosopher squarely blamed the model of the realschule, with its clear-cut subjects and predictable exams, for the breakdown between knowledge acquisition and learning. While he excoriated the Prussian government for basing all public education on the realschule, he admitted that the fragmentation of the university into departments and majors occurred at the will of the people. This was a “chicken or the egg” situation: Was the state or broader society responsible for university learning becoming more like high school? This was not a question Nietzsche was interested in answering since he cared more about consequences. However, he did believe that the root was admitting realschule people to university in the first place. Since such a hypothesis is very applicable today, we will examine it in the contemporary American context next.

Eye Candy: the Arab world’s administrative divisions

NOL map Arab world admin divisions

Imagine if these divisions were all states in a federal republic. Myself, I think some of them, maybe even half of them, could be combined, but if that ever happened, and the resulting combined administrative divisions of the Arab world federated, the region would be in much better shape. (The federation of Arabia would need a Senate, of course.)

What if the OECD did the same? Or simply the US and its closest allies?

RCH: the Ottoman Empire

My subject for this weekend’s RealClearHistory column is battles that shaped the Ottoman Empire. Here is an excerpt:

On June 4, 1915, the Third Battle of Krithia was fought between the Ottoman Empire and its Allied enemies, composed of mostly French and British troops. The Ottomans won, handily and somewhat surprisingly. The Allies had to retreat and regroup as a result, and the Balkans campaign had to go through a more careful re-think by Allied strategists.

World War I marked the end of the Ottoman Empire, of course, but the “sick man of Europe” had more fight in it than many Western historians give it credit for. Scholarship on the Ottoman Empire has improved over the years, but there is still plenty of opportunity to do more. The Ottoman Empire spanned three continents, after all, and lasted for 623 years.

The Ottoman Empire was actually one of three multi-ethnic, multi-religious empires in Europe that perished as a result of World War I, along with Austria-Hungary and tsarist Russia. To the east of the Ottomans were two other, long-lasting empires, the Persian empire ruled by the Qajar dynasty (which perished in 1925) and the Mughal empire of India (which perished in 1857). These eastern empires are referred to by many historians as “gunpowder empires” and they controlled the Eurasian trade routes that Chinese and especially European merchants used for exchanging goods and ideas. Here are 10 battles that shaped the Ottoman Empire:

Please, read the rest. And have a good weekend.

On “strawmanning” some people and inequality

For some years now, I have been interested in the topic of inequality. One of the angles that I have pursued is a purely empirical one in which I attempt to improve measurements. This angle has yielded two papers (one still in progress, the other still in want of a home) that reconsider the shape of the U-curve of income inequality in the United States since circa 1900.

The other angle that I have pursued is more theoretical and is an outgrowth of the work of Gordon Tullock on income redistribution. That line of research makes a simple point: some inequalities are, in normative terms, worrisome while others are not. The income inequality stemming from the career choices of a Benedictine monk and a hedge fund banker is not worrisome. The income inequality stemming from being a prisoner of one’s birth, or from rent-seekers shaping rules in their favor, is worrisome. Moreover, some interventions meant to remedy inequalities might actually make things worse in the long run (some articles even find that taxing income for the sake of redistribution may increase inequality if certain conditions are present – see here). I have two articles on this (one forthcoming, the other already published) and a paper still in progress (with Rosolino Candela), but they are merely an extension of the aforementioned work of Gordon Tullock and of other economists like Randall Holcombe, William Watson, and Vito Tanzi. After all, the point that a “first, do no harm” policy toward inequality might be more productive is not novel (all that it needs is a deep exploration and a robust exposition).

Notice that there is an implicit assumption in this line of research: inequality is a topic worth studying. This is why I am annoyed by statements like those that Gabriel Zucman made to ProMarket. When asked if he was getting pushback for his research on inequality (which is novel and very important), Zucman answers the following:

Of course, yes. I get pushback, let’s say not as much on the substance oftentimes as on the approach. Some people in economics feel that economics should be only about efficiency, and that talking about distributional issues and inequality is not what economists should be doing, that it’s something that politicians should be doing.

This is “strawmanning.” There is no economist who thinks inequality is not a worthwhile topic. Literally none. True, economists’ interest in the topic may have waned for some years, but it never became a secondary one. Major articles were published in major journals throughout the 1990s (often identified as a low point in the literature) – most of them groundbreaking enough to propel the topic forward a mere decade later. This should not be surprising given the heavy ideological and normative ramifications of studying inequality. The topic is so important to all the social sciences that no one disregards it. As such, who are these “some people” that Zucman alludes to?

I assume that “some people” are strawman substitutes for those who, while agreeing that inequality is an important topic, disagree with the policy prescriptions and the normative implications that Zucman draws from his work. The group most “hostile” to the arguments of Zucman (and of others such as Piketty, Saez, Atkinson, and Stiglitz) is the one that stems from the public choice tradition. Yet economists in the public choice tradition probably give distributional issues a more central role in their research than Zucman does. They care about institutional arrangements and the rules of the game in determining outcomes. The very concept of rent-seeking, so essential to public choice theory, relates to how distributional coalitions can emerge to shape the rules of the game in ways that redistribute wealth from X to Y and are socially counterproductive. As such, rent-seeking is essentially a concept that relates to distributional issues in a way that is intimately tied to efficiency.

Zucman’s argument to bolster his own claim is one of the reasons why I am cynical about the times we live in. It denotes a certain tribalism that demonizes the “other side” in order to avoid engaging with it. That tribalism, I believe (though I may be wrong), is more prevalent now than in the not-so-distant past. Strawmanning only makes the problem worse.