The short-sightedness of big C Conservatism

As we celebrate the approval of the Oxford-AstraZeneca Covid-19 vaccine, it is hard to imagine that anyone might take offense at the existence of an inexpensive, transportable solution to the pandemic. Yet this is exactly what I have encountered. A friend who is an arch-Conservative (note the capital C) responded with hostility during a discussion of the differences between the Oxford and Pfizer vaccines. The issue was that my friend couldn’t accept the scientific evidence that the Oxford vaccine is superior to the Pfizer one. He fixated on the fifteen-billion-dollar subsidy Pfizer received from the US government to create its vaccine. For the Conservative, admitting any difference between the vaccines was unpatriotic, since one of them had been bought by the US taxpayer. His objections were based not on scientific evidence or ideology but on identity and background.

During the discussion, my Conservative friend brought up the Oxford team’s continuous publication of their data as if that action somehow lessened their research’s impact or validity. The final paragraph on the Oxford research team’s webpage says:

This is just one of hundreds of vaccine development projects around the world; several successful vaccines offer the best possible results for humanity. Lessons learned from our work on this project are being shared with teams around the world to ensure the best chances of success.

The implication was “well, they’re just wacko do-gooders! They’re not going to make a profit acting like that!” – the idea being that legitimate scientific research bodies behave like Scrooge McDuck with their knowledge. On a side note, this type of “Conservative” mentality has greatly damaged public perception of capitalism, a topic I’ll return to at a later point.

Members of the Oxford vaccine team are presumed to be in the running for the Nobel Prize, and their odds of winning are proportionate to the speed with which the broader scientific community can check their findings. The Conservative could not overcome a mental block over the fifteen billion dollars. The difference is one of vision. To put it bluntly, Oxford is aware as an institution that it existed for almost nine hundred years before the creation of Pfizer and that it will probably exist nine hundred years after Pfizer is no more. Oxford wants the Nobel Prize; the long-term benefits – investment, grants, funding awards, etc. – far outweigh any one-time payout. The long-term outlook required to pursue a Nobel Prize – the willingness to pass up one benefit in favor of a multitude of others – is alien to those whose focus is short-term, who are enticed by one-time subsidies or quick profits.

The conversation illustrated the problem that prompted F.A. Hayek to write in “Why I am not a Conservative,”

In general, it can probably be said that the conservative does not object to coercion or arbitrary power so long as it is used for what he regards as the right purposes. He believes that if government is in the hands of decent men, it ought not to be too much restricted by rigid rules. Since he is essentially opportunist and lacks principles, his main hope must be that the wise and the good will rule—not merely by example, as we all must wish, but by authority given to them and enforced by them. Like the socialist, he is less concerned with the problem of how the powers of government should be limited than with that of who wields them; and, like the socialist, he regards himself as entitled to force the value he holds on other people.

In the case of the vaccine, the Conservative I spoke with had the idea that since the government sponsored Pfizer’s version, Americans ought to accept placidly the Pfizer vaccine as their lot in life. Consequently, coercive policies, for instance refusing the AstraZeneca vaccine FDA approval (something which hasn’t occurred – yet), are acceptable. Behind this facile, even lazy, view lies an incomprehension of behaviors and mindsets calibrated for large-scale enterprises. Actions taken with long-term aims – in this instance the possibility of winning a Nobel Prize – are branded as suspicious, even underhanded. At an even deeper level lies a resentment of AstraZeneca’s partner: Oxford, with all of its associations.

Far from being an aberration within big C “Conservatism,” the response to the vaccine comparison detailed in this anecdote conforms to Conservative ideas. Narrowness of mind and small scope of vision are prized. As Hayek pointed out in 1960, these traits lead to a socio-cultural and intellectual poverty which is as poisonous as the material and moral poverty of outright socialism. My own recent conclusion is that the poverty of big C “Conservatism” might be even worse than that of socialism, because mental and socio-cultural poverty can create circumstances leading to a longer, more subtle slide into material poverty, accompanied by a growing resentment as conformity still leads to failure. When class and ideological dynamics invade matters such that scientific evidence is interpreted through political identities, we face a grave threat to liberty.

The poverty of the modern middle class: prologue

About six months after I graduated from Columbia, a couple who knew members of my extended family unexpectedly asked me to lunch. Not wishing to be rude, I went. As it turned out, the couple had an agenda; they wanted to talk about having their daughter apply to Ivy League graduate schools.

Their daughter had recently graduated from a private liberal arts college and was having trouble finding permanent employment in a field and at a level her parents considered acceptable given the cost of her education. In fairness to her, she was interning at a non-profit in NYC. Her parents, though, had unrealistic expectations and seemed to feel that having paid for her to go to a “prestigious” private school, she should have entered the workforce at a much higher level.

The parents had some highly specific questions, ones so precise that I suggested they contact someone in admissions at the respective universities or speak with an application consultant. In retrospect, I suspect they may have already done so and the feedback hadn’t been favorable. Their questions focused on whether there might be workarounds or special exemptions for the graduate program prerequisites. While there are, their daughter wasn’t eligible for any of them.

The parents were visibly angry, unable to accept that their daughter’s endless sports and community involvement, which they had so carefully funded, were meaningless in the face of program prerequisites. The graduate programs had study-abroad components, so the language prerequisites, which the daughter couldn’t meet, were immutable. Additionally, as the programs were designed for those interested in careers such as publishing, journalism, or policy writing, all applications demanded a long, exceptionally high-quality writing sample. To get an idea of what was expected, think of Princeton University’s standard 50,000-word (i.e. a small book) undergraduate thesis.[1] The daughter had neither the language skills nor the writing sample. In the case of the former, the private college her parents had chosen didn’t offer modern languages at anything resembling the expected level; as for the writing sample, the young woman simply didn’t have one. Her parents were vague as to the reason, but I think she may have chosen an academic track which didn’t require an undergraduate thesis.

The parents weren’t sure which upset them more: that their capability as parents was under review, or that everything they thought was “valuable” or “worthy” had been found wanting. Sports? Irrelevant. Door-to-door political canvassing? Commonplace. The parents were proud of having provided certain experiences, such as trips to Disneyland, ski trips, and cruises. These activities have importance as symbols of a financial middle class with enough liquidity to spend on recreation, but the daughter couldn’t include them as significant in personal statements. In this, the daughter was disadvantaged even compared to Abigail Fisher, who discovered that 1,999,999 other people in any given year are Habitat for Humanity volunteers.

The episode revealed a bankruptcy of mind, culture, and outlook which is the poverty of those whose incomes are firmly middle class but whose intellectual knowledge and cultural capital are lacking. Like the person in my previous post, there was a trust in the opinions of the majority and an uninquiring faith that doing x, y, and z is guaranteed to lead to immediate status, security, and success. The financially but not socially or culturally middle class has realized that parts of life and social experiences are out of reach; not because they were originally off limits, but because too much time has passed, and individuals, such as those in this story, are behind the curve when it comes to specific skills and types of knowledge. People, entire sections of the population, have gone so far down a particular path that it’s too late to turn back.

[1] In case readers are wondering if it is possible to access this type of writing preparation at the undergraduate level outside of the Ivy League, it is. Speaking from my own experience, most liberal arts colleges and large universities offer an Honors track or program through which participating students receive the support and guidance to write longer, more advanced papers and theses.

Choosing inadequacy

About a year ago, I had dinner with a friend whom I have known more or less my entire life. We hadn’t seen each other in over ten years, though, not since she started college. During the interval, she had become an inveterate social climber – at one point avowing, completely seriously, that she was open to marrying a rich man if it meant she could have a flat in one of the world’s most expensive cities. She was also an expert at being woke. The contradiction in her thinking – the craving for a life of riches and luxury alongside a woke “eat the rich” attitude – caused me to recognize the fuel behind the attraction redistributionist ideologies hold for young Americans.

At some point in her trajectory, my friend had hit on using the education system to climb the social ladder. In fairness to her, there is a pervasive idea that this is a valid approach; J.D. Vance mentions it in the conclusion to Hillbilly Elegy. Choosing between the flagship state university and a small private liberal arts college, she picked the latter, a “social” school held in high esteem regionally and thought to be intellectually rigorous.

Upon graduating and moving two time zones away for graduate school, she made two unwelcome discoveries: 1) she was behind academically and intellectually, and 2) her college had scant brand-name value in the broader world. According to her, her graduate university’s student body was composed of the children of America’s elite who “didn’t get into Harvard.” She held a teaching assistantship for 101-level English literature classes and was discomfited to find that her freshman students were better writers, with a broader sense of literature and the humanities, than she was. She mentioned that she learned about entire chunks of the English literary canon from them, which is appalling given that she had majored in English at her liberal arts college.

When Austrian novelist Stefan Zweig died, his executors found the manuscript for his novel Rausch der Verwandlung among his papers. The book’s title in English is The Post-Office Girl,[1] and it tells the story of a 1920s provincial girl who assumes a false identity to join the privileged world of her relatives. Everything works out – until it doesn’t:

Unwittingly Christine revealed the gaps in her worldliness. She didn’t know that polo was played on horseback, wasn’t familiar with common perfumes like Coty and Houbigant, didn’t have a grasp of the price range of cars; she’d never been to the races. Ten or twenty gaucheries like that and it was clear she was poorly versed in the lore of the chic. And compared to a chemistry student’s her schooling was nothing. No secondary school, no languages (she freely admitted she’d long since forgotten the scraps of English she’d learned in school). No, something was just not right about elegant Fräulein von Boolen, it was only a question of digging a little deeper […].

After Christine is unmasked, she returns to her previous life, but this time she’s angry and bitter, aware now of the existence of another world, one lost through her own irresponsibility. Most of the book is about the girl’s mental unravelling. When I first read the book, I thought the ending – suicidal thoughts and participation in serious criminality – melodrama for its own sake. Now, I think Zweig was on to something.

In Zweig’s book, the root of the problem is the anti-heroine’s discovery that what is top-notch in her village isn’t held in the same esteem elsewhere: “[W]hat was the showpiece of her wardrobe [a green rayon blouse] yesterday in Klein-Reifling seems miserably flashy and common to her now.” My friend recounted a similar experience cast in academic terms. She slid through high school and college without any struggle. Upon starting her MA, she had difficulty keeping up with her cohort. Three years after starting a doctoral program, her dissertation proposal was rejected, with the evaluators citing lack of languages as one of the reasons. This last is interesting because it connects to Zweig’s list of faults that expose Christine’s real social standing. In the case of my friend, her background became equivalent to Christine’s blouse: haute couture in one locale and unsophisticated in another.

For both the Bright Young Things of Zweig’s world and my own generation more generally, there is a question over culpability. In the book, Christine’s aunt agonizes over the girl’s uncouth manners and dress, repeatedly reminding herself “how was she to know?” My friend and her parents assumed that “the system” would take care of her. Sure, the public school wasn’t great, but it also wasn’t too terrible and everyone else was going there. The college was the best and most expensive private college in the region, so surely the faculty and advisors there knew what they were doing.

This is not to say that there weren’t red flags for those who knew where to look. For example, the college offered only two years of accredited foreign language training. My friend acknowledged this contributed to the problems with her first proposal. However, she also admitted that she hadn’t considered the curriculum when she picked the college. Her focus had been purely social. Consequently, the truth is that she chose her path at the moment she picked her values. The fact that her measurement system didn’t hold up to broader scrutiny is her own fault.

Zweig’s anti-heroine contemplates suicide in response to her inadequacy; kangaroo courts, or cancel culture, are more my friend’s style. Not much has changed over the course of a century. In Zweig’s time, self-destruction was the default choice; in ours, destruction of others is the preferred MO. The source of the anger, though, is the same: envy stemming from inadequacy. Unlike the Bright Young Things, though, the modern generations chose their inadequacy.

[1] Much of the crucial action is set in a Swiss hotel, and Wes Anderson has said that the book was one of his inspirations for The Grand Budapest Hotel.

The Dunning-Kruger Effect on stature

With the collapse of a false sense of stature comes a disintegration of perceptions of personal dignity. The Dunning-Kruger Effect says that a person’s incompetence prevents him or her from recognizing his own (and others’) incompetence. Building off this concept, I hypothesize that many of the social tensions we are experiencing today result from a type of Dunning-Kruger Effect wherein those who are incapable eventually become aware of their inability or unsuitability.

In 2010, Dr. David Dunning, now retired from Cornell University and one half of the Dunning-Kruger name, gave an interview to the NYT on the eponymous Effect. The genesis of Dunning’s research came when he read about a bank robber who made no attempt to conceal his face, resulting in his being apprehended in less than a day. During his interrogation, the man revealed that he had covered his face with lemon juice, having developed the notion that lemon juice would make “him invisible to the cameras.” He’d even tested the concept beforehand by taking a picture of himself after putting lemon juice on his face. He wasn’t in the picture (Dunning suggested that perhaps the camera was pointed in the wrong direction), so he concluded that the idea was sound. As Dunning put it, “he was too stupid to be a bank robber but he was too stupid to know he was too stupid to be a bank robber.” As the story of the robber illustrates, there always comes a moment of reckoning, when the reward of stupidity, unintentional or willful, is paid.

When I was a recent music grad, I obtained a position as a fellow with a small liberal arts college orchestra. Ostensibly, the school was one of the best of its type, but its weaknesses became apparent fairly quickly. During one rehearsal, what began as a discussion of the minuet form evolved into the conductor having to introduce the students to Jane Austen. The students all assured the conductor that they “had taken” Literature 101 and 102. As it turned out, only one of the students had read Jane Austen, and the girl had done so on her own time as Austen was no longer taught in the curriculum.

The conductor queried the students as to why they thought they could play music while remaining ignorant of its broader cultural setting. The students became surly in response, retorting that “she isn’t on the reading list,” as if such a statement were all the justification needed. The dynamic in the ensemble was already rocky: two weeks before, all but two students had skipped rehearsal to attend a varsity soccer match. The conductor had chastised them for their unprofessionalism, pointing out that missing rehearsals is cause for termination in professional ensembles. In their defense, the students cited college regulations stating that all sports events took precedence over other commitments. In other words, the conductor could not discipline students who skipped rehearsal to attend a sports event.

At the end of that year, the conductor resigned, and I followed one term later. I made the decision to leave after two students for whom I was responsible said in front of me, “at least this one [the new conductor] respects us. [The old one] always treated us like we were stupid, didn’t know things.” As someone who was present, I can testify that the conductor was a model of patience and positive leadership. The students truly didn’t possess the basic knowledge one could reasonably expect of third-years at a supposedly good private liberal arts college. And they had no interest in remedying their deficiencies. They saw any situation in which their ignorance displayed itself as a “gotcha” setup.

Technically these students didn’t fit the Dunning-Kruger pattern as they possessed enough knowledge to know they didn’t know. There is, perhaps, a similarity to the bank robber’s lemon juice in that the students made a blanket assumption that completing college-assigned reading was sufficient to turn them into literate people. Where this notion originated is beyond me, as it is common knowledge that extracurricular reading is a vital component for success at elite institutions. Just like the robber’s, the students’ ignorance was appalling – and much less amusing.

For our non-American readers, it is a conceit of small private liberal arts colleges that they are educating/raising (this is key) the leaders of the future. There is some justification for this view, since these colleges can provide a door into better institutions. It is, however, rare to find a national-level politician, industry leader, or public figure who hasn’t at least finished his or her education at one of the über-competitive, big-name schools or universities.

The stars of this particular anecdote were convinced that they were destined for great things. A challenge to their knowledge base was an assault on their identities, and therefore their sense of stature. Granted, their unprofessional behavior had already cost them their dignity, but they didn’t know that. The sense that dignity might be a distinction to be earned and not granted through entitlement escaped them. In a modification of the Dunning-Kruger Effect, the students had no dignity because of their ignorance and unprofessionalism but they were too stupid to know that they had forfeited their chance to be respected. 

When hard work doesn’t equal productive work

In March 2020, David Rubenstein gave an interview in which he lamented the vanishing of a system in which “hard work” guarantees success. While the nostalgia is understandable, there is an epistemological problem with the conjoined assumptions underlying the concept of hard work and what a “system” promises, i.e. that if one works hard, then one becomes successful. The issue is one of qualifying and quantifying “hard work.”

My previous article “An aspirational paradox” mentions Abigail Fisher and her failed lawsuit against University of Texas – Austin over her non-acceptance to the institution. The case was a painful example of the disillusionment which must follow when believers in the exceptionality of the commonplace are finally made aware of its mediocrity. The Fisher saga represents the modern tragedy of familial ambition: a child’s parents place her on a systemic path, promised by wise public-school teachers and caring guidance counselors to lead to success, only to discover that the end is the furnace of Moloch. Caveat emptor.

The strange, disembodied entity called “the system” doesn’t fail; what fails is individual and collective concepts of what the system is and what it requires. Mankind has a capacity for filling a void of ignorance with figments of its imagination. In general, such practice is harmless. But when a person believes his own creation and builds his future upon it, that is when the ‘systemic failure’ narrative begins.

Drawing again from my own encounters, for many years I knew a music teacher who believed that one must never listen to repertoire. Yes, you read that correctly: a teacher of an aural art form believed that listening to music is detrimental. He had many long, pseudo-pedagogical explanations for this peculiar belief. His idea was atypical: professors at the world’s top conservatories and musicians from major ensembles all emphasize listening as a crucial part of study. Listening as a formal component of music study dates to the invention and mass distribution of the phonograph in the early 20th century; even further back, students attended live concerts.

This teacher had a pedagogical system built around his beliefs, which included that students should neither learn basic keyboard skills nor how to play with accompaniment. Unsurprisingly, students who adhered to his system didn’t progress very well. Problems ranged from poor intonation and lack of ensemble skills to arriving for college auditions with no grasp of appropriate repertoire. Feedback from competitions was kind but completely honest. The more students failed, the more obstinately he insisted that political maneuverings or class biases were to blame. “The system,” by which he meant auditions, was “broken,” designed to not give people a “fair chance.”

Sadly, this man affected a large number of students, many of whom worked hard – practicing long hours, racking up credits, participating in multiple ensembles – only to discover that their “system” was a fraud. All of their hard work was for naught.

There was one particularly heartbreaking case of a young woman who applied to a fairly prominent private university. By her own account, her audition was catastrophic. In the lead up to the audition, she did her best to ensure success; she had two lessons a week, increased her daily practice time by an hour, and played along to background recordings. The amount of work she did, measured in terms of effort and time spent, was brutal. But she didn’t pass the audition and was understandably devastated.

A system she had followed religiously since fourth grade had failed her; moreover, her hard work was guaranteed to fail. There was no way for her to succeed based upon her training. In some ways, this girl’s story parallels Abigail Fisher’s history. For years both put in hours of effort only to discover that they had misjudged and misplaced their energies. Bluntly, these young women worked hard but not strategically.

The failure of these girls was unrelated to the broader “system,” whether that system was auditions or college applications. To argue that “the system” is broken on the basis that hard work is not rewarded is irrational, albeit understandable on an emotional basis. Before rushing off to denounce “the system” for not rewarding hard work, one should critically examine the foundational premise and ask: Was this hard work or was it productive work?

An aspirational paradox

In general, contemporary society disparages people extremely focused on their careers, labelling them as “careerists.” No real objections are presented; instead there is simply a vague simmering contempt for “careerists.” When an anti-careerist manages to articulate an objection, it is usually couched as a social justice problem: careerists cause unfair societies by being ahead of everyone else. When practiced appropriately, i.e. looking after one’s own best interests, “careerist” mentalities and behaviors, such as discipline, planning, and diligence, are necessary for prosperity, both personal and societal.

During the early stages, the opposite of careerism isn’t drifting: it’s aspirational behavior. An illustrative anecdote: a pre-med major in my year failed part of her science requirements repeatedly, mostly due to partying. Despite her inadequate academic record in her major field, she applied to Harvard Medical in her final year. Unsurprisingly, she wasn’t accepted. As a mutual acquaintance said: “one’s going up against people who’ve been working to get into Harvard Medical since middle school; they [Harvard] don’t need someone like her.” Although the result was only to be expected, it came as a complete shock to the girl and her parents, since they had all believed that she was destined to attend Harvard Med. To listen to the former pre-med student talk today, she didn’t get to go to Harvard Med. It is as if some external force denied her a chance.

As a quick explanation for non-American readers: because the US system requires that students take courses outside of their major field, there is a high tolerance for poor marks in general education requirements; the trade-off is that one is expected to earn reasonably high marks in one’s own field in order to advance to the next level. For an institution such as Harvard Med, a pre-med student’s earning anything below average marks in a science course would be unacceptable, unless there were a very good reason explained in the application statement. A good reason would be a family tragedy or some life event beyond one’s control, but not partying.

The sheer reality is that my friend was right: Harvard Med receives applications from candidates who have shown single-minded focus in pursuit of a goal since age twelve. In comparison, a twenty-three-year-old woman whose transcript screams “unfocused” is not a prize. Even the act of applying to Harvard would count against her, since assessors would conclude that she hadn’t bothered to read the guidance and fit sections, i.e. the pages where expected grades and MCAT scores are listed, on Harvard Med’s website.

The case of Fisher v. University of Texas, 2013 and 2016 (the Supreme Court heard the case twice), is an example of the dichotomy between aspiration and careerism. Abigail Fisher applied to the University of Texas – Austin but was turned away, as her grades, though above average for her public school, were below the university’s automatic admission standards. The crux of her suit was that UT – Austin had both failed to take her extracurriculars into account and replaced her with an affirmative action candidate. Eventually, the Supreme Court ruled – both times – that although UT – Austin ought to have been more transparent in its admissions process, there was no evidence that the university had discriminated against Fisher.

The aspirational part of this story is that Fisher’s extracurriculars – Habitat for Humanity, local youth orchestra (without holding a principal position), and miscellaneous charitable activities – were not genuine distinctions in a candidate pool such as that commanded by UT – Austin. Speaking from my own experience in youth orchestras from middle school until college – from the age of six, if one counts my first training ensembles – should an applicant wish to use music involvement as proof of merit, youth ensemble participation is a simple expectation. Unless one was a section leader, youth orchestra membership is not a sign of exceptionality. According to Habitat for Humanity’s North America page, the annual number of volunteers is around two million. Volunteering with the charity is generous and worthy, but doing so does not make the volunteer stand out in a candidate pool.

While society can discuss endlessly the merits and demerits of affirmative action, the Fisher case indicates that the policy has taken on a role no one could have predicted: scapegoat. The policy has become an escape hatch for aspirationalists seeking to avoid facing their own inadequacies and lack of proper preparation or career focus. Instead of blunt conversations about the real reasons a person didn’t qualify for a desirable professional school or first-choice university, aspirational individuals can offload blame onto the policy. While one can hardly blame the policy for being a scapegoat, one must acknowledge that such use has potential to be very damaging to the social fabric.

The importance of biography

There is a now-out-of-print children’s book series entitled “Childhood of Famous Americans,” published as a subdivision of the Landmark Books series between 1950 and 1970. When I was between the ages of six and ten, I was fortunate to be able to read almost all of the books, which were, unsurprisingly, the biographies of prominent Americans written for children. Even when I was little, the books were fairly ancient: the most recent subjects they covered were Eleanor and Franklin D. Roosevelt and Albert Einstein. Despite, or even because of, their relative antiquity, these books had a major impact on my own trajectory.

This is not to say that the books weren’t flawed; they were. They were often riddled with historical inaccuracies, the quality of writing varied wildly from author to author, and the content could be outright offensive with regard to religion and race. The overall series did a very good job of including biographies of Americans from minority groups, but, depending on the subject, author, and time period, the portrayals of other races could be quite insensitive.

The books all followed the Joseph Campbell theory of story to a T, with the result that they were very good stories. A critic might argue that these biographies lionized or apotheosized individuals in an unrealistic way. While such an accusation would be true, the series was titled “Famous Americans,” not “Average Joe Americans.” The important trait of these books, though, was that they all shared a common theme: stature was a choice, and one that was made in childhood or adolescence.

Using Campbellian terms, the moment of awakening was almost invariably an episode in which the subject realized that the people surrounding him or her were stupid, fearful, and conventional – Mark Twain being expelled from multiple schools, Abraham Lincoln being denied an education by his illiterate father (as I said, not all of the stories were tremendously accurate), Henry Clay fighting for his inheritance rights against his extended family, Jim Thorpe struggling against racial and social prejudice throughout his sporting career.

On a side note, there was a remarkable absence of American fine arts figures in the series. Mark Twain was one of a handful of writers that included Edgar Allan Poe and Louisa May Alcott; I don’t recall that some of the more sophisticated writers, such as Washington Irving, Henry James, or Edith Wharton, received the honor. The absence of fine artists was not, however, matched by an absence of career military men: Dwight D. Eisenhower had a biography, as did George Armstrong Custer (his was most uncomplimentary), and Robert E. Lee and Ulysses S. Grant each received a book. I’m sure that there’s room for analysis of the vision of civil society expressed in whom the series’ editors decided to cover.

The “Childhood of Famous Americans” series only rarely had a specific antagonist. Some combination of self-satisfied parents, authority figures attached to the status quo, and parochial small-mindedness served as the villains. The subject’s daily obstacles were educational and cultural mediocrity, societal complacency, intellectually inferior peers, and timorous, incapable mentors who, by extension, weren’t very good at their jobs.

Fundamentally, the goal of the series was to create role models for young readers. The model proposed was complete rejection of (and a little healthy contempt for) existing systems. The unifying theme among all the people selected was the tradition of “rugged individualism” and the idea that progress was due to the action of individuals, not that of their communities (recall, the village inhabitants were invariably shown as small-minded, poor-spirited morons).

Carl J. Schramm argued in his 2006 book The Entrepreneurial Imperative that the “rugged individual” ethos was an American casualty of post-World War II society. Americans turned more toward the concept of the “workforce,” with its communal overtones, and away from individual achievement and success. The peak of statist, stagnant communitarianism came in the 1970s, the decade in which the “Childhood of Famous Americans” also ceased publication.

Both biography and entrepreneurial spirit speak of a path to personal greatness, a way for individuals to emancipate themselves from their origins if they have sufficient will. The loss of biography and of an entrepreneurial ethos indicates an impoverishment of role models. Without role models of individualistic thought or practice, most people lack the originality to conceive of ways of life beyond their current existence. Discontent and feelings of betrayal by “the system,” society, or the status quo are the ultimate result.

Today, we are confronted by the implosion of the post-WWII status quo. To further complicate matters, the majority of the adult population lacks a blueprint for either challenging what remains of the status quo, or for forging a new path. Without the proper role models of individuality, shown in biography, such people are in thrall to the false promise of communitarianism.

Foundering in academia

For the last couple of weeks, I have been reading and re-reading Gerard Klickstein’s book The Musician’s Way: A Guide to Practice, Performance, and Wellness. Klickstein is a musician and professor who has spent much of his teaching career helping other musicians recover from physical injury or overcome psychological issues, such as performance anxiety. Klickstein argues that the vast majority of musicians’ problems, physical and psychological, are the result of poor formation at critical stages of development. Reversing problems engendered by “unqualified,” i.e. incompetent, teachers is an overarching theme in the book. Reading Klickstein’s anecdotes, many of which concern students who are recent college graduates, one becomes alarmed at the sheer number of incompetent teachers present in “higher education.”

Several summers ago, at a music festival, I sat with an opera singer friend and we assembled her audition book. An audition book is a selection of opera arias which a singer provides to producers during the audition process. My friend and I were deep into research and consideration when another musician, also a singer, joined us. His contribution was to question us as to why we bothered with the book at all.

He went on to reveal that he wasn’t planning to attempt the opera house and festival audition cycle, nor was he considering trying for a choral ensemble. Instead, he was applying for faculty positions at small colleges. He was a recent doctoral graduate of a university which is overall relatively famous but not particularly well regarded for its music school. At that time, the three of us were at roughly the same level. His experience and education were slightly above average for the types of small, regional institutions he was targeting.

Behind his dismissive behavior lay a mentality of minimal effort. Why should he go to the trouble of researching roles, evaluating musical suitability, and learning parts when his résumé would satisfy the expectations of small, provincial colleges? He lacked the vocabulary to explain his vision, but what he described was a sinecure. Before the festival ended, he had secured a full-time position at an institution in a backwater of the American southwest.

One side of the proverbial coin says that the institution was lucky to have him – his background certainly was above anything the college could expect on the basis of its own reputation and musical standing; the other side of the coin says that it is concerning that someone like him could see academia as a safety net. 

Now American colleges have begun to furlough staff. As you can imagine, many of my Facebook friends are people who attended, and are now staff at, small liberal arts colleges and small state universities throughout the country. In the atmosphere of uncertainty, my own FB feed has filled up with people lashing out against a society that, they insist, doesn’t value them. There is an underlying financial element; few can afford to be furloughed. But there is a deeper issue present: a professional inactivity that has pervaded American small liberal arts academia for the last few decades.

In truth, financial concerns are more a symptom of professional inactivity than they are representative of some overarching truth about poor pay for teachers. I recall how one of my Columbia professors told my class never to rely on a single income stream. He would talk about how all breaks are opportunities to be productive. He told us how, when he was starting his career in the 1960s, he deliberately accepted a part-time position, rather than a full-time one, so that he could finish writing his first book. In terms of his career, the book was more important than his job at a small city college because the book paved the way for the big opportunities. Truth be told, it didn’t matter that he taught at a small city college, beyond gaining some official teaching experience, which he could have obtained by teaching just one class. There’s a difference between being professionally active and simply being busy or being employed.

There is a species of person who follows the same MO as the singer from the music festival. Academia is a safety net, and the goal is to rush into a full-time position and sit there for a lifetime. Their attitude is that of a career teacher, not a professor. They lecture and grade, but there is no professional contribution or creativity on their part. Such people tend to be barren of original thought and to react with hostility to new ideas or concepts. A quick search of academic databases shows that they don’t write articles, they haven’t written books (their theses don’t count), and they don’t write for think tanks or journals. An egregious example is a college professor who writes movie reviews for popular art enthusiast magazines; he’s been passing this activity off as “publishing” and “being published” for years.

There is, I know, a perception of a double standard on some level. For example, Kingsley Amis taught English Literature for years, first at Swansea and later at Cambridge. He published comparatively little on the academic side in contrast to some of his peers, and much of his lighter work took the form of reviews, essays, and opinion pieces for newspapers and magazines. But he averaged a novel a year. He was recognized in his own lifetime as a giant of twentieth-century English literature, and no one questioned his publication record or his ability to teach the field.

The subtle stagnation at the liberal arts college level has contributed to a culture of belief in talent and luck, rather than in good decision-making and hard – by which I mean calculated and carefully weighed – work. Many people today would classify my Columbia professor’s story as one of privilege and assume a background of wealth that allowed part-time work. In actual fact, he did not come from a particularly “privileged” background: he simply settled on his priorities, thought ahead, and made his decisions accordingly.

One thing one learns very quickly in the arts is that one must create without expectation of immediate payment. Singers learn arias, instrumentalists study concerti, and filmmakers shoot reels, all so that when the moment is right, they can produce a piece that demonstrates ability and wins a commission. One tidbit my professor shared was that he had to write several critically acclaimed books before he began to receive advances for his work. The principle is the same: create first, then receive a reward. A person who works only within the parameters of payment is a drone, and it is unsurprising that such people do not create new works, make discoveries, or have groundbreaking insights. If one considers that American small colleges have populated themselves largely with professional drones, one must reevaluate their worth to education.

A Queens’ Marxist in the Lions’ Court

When I first walked into the conference room, two other girls were already there. One of them caught my eye and with a friendly nod indicated I should take the seat next to her. I did and then observed the girl on the other side of the table.

She was quite striking, well-dressed in the trendiest fashion, and clearly intelligent, but she exuded an agitation and antagonism that clashed with the sleepy serenity of the room and our own quiet desire for friendship. As our other six classmates trickled in, the Girl across the Table never relaxed, and though she responded correctly to every friendly overture, she did so with an attitude of suspicion. Puzzled but too preoccupied to give it much thought, I turned my attention to the department chair, who was opening orientation.

For the first couple of weeks I was much too in awe of my new surroundings at this Ivy League university to concern myself with anything more than adjusting as quickly as possible. Only one of us had attended an Ivy as an undergraduate, and she was one of the nicest people in the class. Recognizing how intimidating the new environment could be, she went out of her way to demystify the place for us, and with her help we soon realized that the tranquil, yet demanding, atmosphere of the first day was genuine. We were meant to become our best selves, not to compete insanely with each other. About three weeks in, our entering cohort of nine had settled into a social and academic routine with everyone participating in a cordial, collegial manner – everyone except one: the Girl across the Table, hereafter called GatT.

Her hostility from the first day was unabated, and now we were its direct target. During lunch, if someone suggested a book, she had a snarky putdown, even if seconds later she would be raving about another book by the same author. One evening a group of the classical music lovers took advantage of free tickets from the school to go to the opera. GatT came with us. Stretching our legs at intermission devolved into standing in a circle and listening uncomfortably as GatT made snide comments about how everyone in the lobby was dressed. As we turned to go back in, I heard her mutter something about “bourgeois” under her breath. A light went on in my head: GatT was a Marxist – puzzle solved! The next morning, GatT publicly avowed her Marxist leanings during a seminar discussion.

The mystery of her hostility solved, we moved on with our social lives and pretty much managed to maintain a state of cautious détente with GatT. She made her desire to lead a jacquerie against us fairly clear a couple of times a week. This became funny once a casual lunch conversation revealed that eight of the nine of us had some familiarity with firearms; I commented to the friendly girl from the first day that this particular jacquerie wouldn’t end the way GatT thought. Eventually we became accustomed to her outbursts, and it took one of extraordinary absurdity to elicit any reaction from us. The closest anyone came to snapping at her was the time she claimed that our completing assignments on time was an act of class oppression against her. 

One of the other students was the daughter of two economists who had become ardent free-marketeers after spending their youths as equally ardent Marxists; consequently, her grasp of both arguments was comprehensive. After losing a verbal bout with her, GatT refrained from practical arguments and retreated to social commentary. One day during our daily class coffee gathering, she proclaimed that if she had known our school was an Ivy, she would not have applied, in order to show support for the proletariat. As the “discussion” continued, she branded us as privileged elitists. Meanwhile, we quietly drank our cheap coffee and pondered the fellowships that made this our most affordable option.

The remainder of our graduate studies passed in the pattern of endless writing and studying, intense debates on all sorts of topics, excursions to museums and evenings at the theatre or concerts, and of course simply socializing with each other. We tuned out GatT’s insulting nattering and someone always ensured she received an invitation to whatever activity was scheduled. Despite her clear resentment, she usually came. 

In the final term, when the course load was intentionally light to leave room for writing the Masters thesis, GatT disappeared for a few weeks. We learned through her social media that she was participating in anti-austerity protests in Europe and was immediately sprayed with tear gas during a raucous demonstration. Soon after she returned to school, I ran into her. She told me that she hadn’t started writing her thesis yet: the submission deadline was three weeks away.

I haven’t seen GatT since that last meeting, but the rest of us stay in touch. During a dinner with some of the gang a few months ago we tallied where everyone is now. GatT was the only one we couldn’t account for; because of her propensity for agitating, we suspect she might be locked away in a third-world prison somewhere. We also wonder if she ever managed to complete her thesis.     

Eco’s ‘How to eat ice cream’

A friend recently gifted me a vintage copy of some of Umberto Eco’s essays translated into English. One of the essays, titled “How to eat ice cream,” opens with an anecdote Eco said was based on his childhood. In the story, an ice cream vendor sold regular cones for two cents and ice cream pie cones for four cents. Eco said his parents and grandmother would buy him whichever type he requested, but there was a limit. Young Eco envied the neighbor children who would parade down the street carrying a regular cone in each hand. But whenever he asked for four cents to buy two cones, the adults would flatly refuse and tell him that he could have a pie cone instead. As an adult, he mused:

[…] I realize that those dear and now departed elders were right. Two two-cent cones instead of one at four cents did not signify squandering, economically speaking, but symbolically they surely did. It was for this precise reason that I yearned for them: because two ice creams suggested excess. And this was precisely why they were denied me: because they looked indecent, an insult to poverty, a display of fictitious privilege, a boast of wealth. […] And parents who encouraged this weakness, appropriate to little parvenus, were bringing up their children in the foolish theatre of “I’d like to but I can’t.” They were preparing them to turn up at tourist-class check-in with a fake Gucci bag bought from a street peddler on the beach at Rimini.

The parenting method must have worked because he became Umberto Eco. What Eco recognized was that his parents had inoculated him against false consumerist behavior. The inoculation was not against the so-called consumerist society but against ostentatious display, the process of “keeping up with the Joneses.”

Around January 2018, there was a meme floating around social media. It said something along the lines of “Entrepreneur: someone who lives a few years the way most people won’t so that they can spend the rest of their lives living the way most people can’t.” I very belatedly discovered Carl Schramm’s 2006 book The Entrepreneurial Imperative. Schramm identified the 1950s as the time when American society ceased valorizing business ownership and virtuous risk in favor of material security. As part of the “security first” mentality, children and young people were openly discouraged from seeking independence or from being different in a positive way. The world was one of ossification and stagnation, even as the federal government and media pushed a strong Keynesian message of “consume to grow.” On a side note, now that I think about it, Keynesian economics resembles the children’s video game Snake: one guides the snake to food so that it will grow, but eventually it becomes so big that it bites itself – game over.

Even given the massive propaganda effort put into promoting Keynesian theories, scapegoating “consumerism” or “consumerist society” is a form of escapist thought, a dodging of responsibility. Eco spotted the cause-and-effect nature of being a parvenu. The desire for “fictitious privilege” creates a set of priorities that causes one to spend his wherewithal thoughtlessly. In turn, a “boast of wealth” strategy leads to “I’d like to but I can’t” by ensuring that there is no money when real opportunity arrives. The world becomes one of abundant mediocrity as the effort to possess everything spirals.

The Good Life vs. reality

Recently, a former classmate badgered me into accompanying her on a run to the supermarket. As we were checking out, I, as a person who is very dedicated to the principle of self-interest, used a handful of coupons and a discount card to lower my final tally. My companion had a judgmental reaction to the proceedings: she gave me to understand that she never sought discounts or used coupons because to do so was beneath her station. Oddly, she could see no connection between her attitude and her continuous complaints about being short on funds. It was only much later that I connected her attitude at the cash register with her frequent monologues about a “broken society,” a slight fixation on “inequality,” and an overweening sense of entitlement.  

In 1971, NBC produced a sitcom called The Good Life, not to be confused with the British series of the same name. The American series was unsuccessful in comparison to its competition, and it was canceled after fifteen episodes. I have never seen the show, as NBC has never rerun it or provided a home release of it. I first heard of The Good Life in a book whose title I have regrettably forgotten (for a long time I thought the book was Greg Easterbrook’s The Progress Paradox, but now I can’t find any allusion to the tv show in Easterbrook’s book). The author of the forgotten book alluded to The Good Life as a watershed moment in tv history with its portrayal of the so-called super-rich – the one bit I remember is that the book described the show as “the most luxurious show [in terms of portrayal of lifestyle]” and connected the show to a sudden increase in a broad sense of entitled victimhood throughout society. The Good Life was also, apparently, part of creating the environment conducive to the success of the soap opera Dallas (1978–1991).

The plot behind The Good Life is that a middle-class couple become exhausted with the pressures of suburban life and maintaining a lifestyle that’s beyond their means. Consequently, the pair decide to scam their way into the household of an industrialist multimillionaire by disguising themselves as a butler and housekeeper. The theme which (apparently) underlay the show was the idea that there is a class of people who live extravagant, exotic lives (the proverbial good life) and therefore can afford to support some sponging malcontents.

When I researched the show, one thing that struck me was how prescient it was in foretelling some of the themes present in our current socio-political discourse. The two con artists are reasonably successful college graduates who believe that society promised them the good life as a reward for going to college and having careers; when the pair see the lifestyle shown in glossy magazines – mansions, tennis courts, Rolls-Royce cars – they feel that society has reneged on its promise. The logic of the show’s premise is that the couple has been pushed by society – that wicked, amorphous “they” – toward a life of deception because there is no other path to riches open to them.

LitHub ran an article titled “How the well-educated and downwardly mobile found socialism.” The article isn’t worth reading, but the title touches on what began as the fictional premise of The Good Life and has become a full-blown, ideologically fraught issue today. What happens when perception of status is overblown and there is no sense of timeframe to temper expectations?

Thinking of the popularity of AOC or Andrew Yang and the manner in which they have successfully tapped into the tropes of “unjust society” and “inequality,” the modern millennial (my own generation) seems to have embraced the premise of The Good Life. The tv show contained a very subtle, and completely subversive, inversion of the moral order: because “society’s promises” were broken, the dishonesty of the protagonists was not immoral. The extension of such reasoning is that the industrialist was obligated to support the swindlers anyway due to his greater wealth.

CapX just ran a terrific article by Jethro Elsden, “Jane Austen, the accidental economist,” in response to the new film version of Emma. One of the interesting tidbits the author found is that, in modern terms, Mr Darcy’s £10,000 per annum income is probably equivalent to £60 million today, which would make his wealth around £3 billion. Even then, Elizabeth Darcy had to “make small economies” once she decided to support her sponging sister and feckless brother-in-law. Granted, the economies might have been the result of not telling her husband, but the point remains that no one can long support spongers.

Elsden alluded to the logic of social pressure and the malignant effect it has on Austen’s characters, who feel compelled to engage in an “arms race.” A major reason the swindlers of The Good Life turn to dishonesty is that they feel pressured to look like successful suburban college graduates. The problem is that, in the case of both Austen’s characters and the tv show a century and a half later, the definition of “success” in relation to appearances was fungible. Rationally, it is ridiculous for the youngish couple of The Good Life to expect to be in the same place financially and socially as their mark, the middle-aged widower industrialist whose lifestyle (but not work ethic) they covet.

To return, finally, to the anecdote regarding my shopping expedition: the episode is an example of a path that begins with frivolous preconceptions and ends with The Good Life at the comic end and the rise of Andrew Yang, Bernie Sanders, or Elizabeth Warren at the other. These politicians have located a demographic which has no sense of the progression of time, stages of development, or realistic expectations. A perfect example is my ex-classmate, who has subjected herself to a fantasy regarding her own prospects and now believes that the social contract has been broken. For such a demographic, the emotional trumps the rational. It is easier for them to believe themselves wronged than to see themselves as victims of their own imaginations.

The veil of nostalgia

In an article for Worth titled “A new wealth gap is growing – attention inequality,” authors Joon Yun and Eric Yun of the Yun Family Foundation, an institute dedicated to “transforming the way people think,” argued that “attention inequality” is exerting a destructive force on society and expressed nostalgia for the days of “monoculture.” They defined this idyllic time as one where all attention was focused on one or two people or groups, e.g. the Beatles, and on no one else. The idea expressed by the Yuns is that the new internet world, where everyone may take his best shot at fame, is unfair, and that a veil that should not have been lifted has been removed. In the meantime, everyone, described as “the heart-broken masses,” wanders through the selection at will, as customers as well as fame-seekers. The Yuns’ complaint is very similar to a running theme in the works of Michel Houellebecq: the free market of choice has created winners and losers and, in doing so, has destroyed the dreams and self-respect of the latter group.

Perhaps the question is whether existing in a world of dreams, one in which a person could feel good about himself using the “might have been” fantasy, is an acceptable burden to thrust upon society. After all, in his short story “The Secret Life of Walter Mitty” (which the Ben Stiller film butchered), satirist James Thurber’s point was that living in dreams replaces action, allowing people to imagine themselves as being filled with unrecognized abilities. Even Thurber’s picture of the type for whom such an existence is necessary was probably accurate: a passive middle-aged man who had missed opportunities in his youth (an implied WWII vet, so both chances to be a military hero and to cash in benefits to start a business, further his education, etc.) and resents his wife as the cause and personification of the mediocrity of his existence.

But are we better off with the veil of mediocre monoculture lifted? Is the fact that revelation may not be pleasant for those who discover that they are unappealing to the modern market really a justifiable cause for concern? Is the old world of “monoculture” really something to look back upon with nostalgia?

My former composition and counterpoint teacher was also a concert pianist who trained at The Juilliard School. While still a student, my teacher was signed by a major record label. One of the tidbits I learned from him was that back in “those days” (the mid-20th century), practically the only way a young (classical) artist had of obtaining notice was to be at an elite conservatory, since that is where the scouts went almost exclusively.

The MO for finding the “latest new thing” made perfect sense for the time period. A tremendous amount of investment on the part of the label went (and still goes) into publicity for and grooming of a young artist. Further, in my teacher’s case, the label handled studio and recording expenses, created and booked concert tours, and covered venue costs. The artist did not have to repay the funding; however, total expenses would be deducted from any royalties should he or she become successful. The investment risk meant that going to places where the already-succeeding were clustered was the safest bet for the big labels. There was very little room in the equation for a person who was not already positioned to join the upper professional echelons, or for someone who had no insider access.

Was a situation where the major labels acted as gatekeepers and only considered people who fit a certain profile really better than the current one where the internet and digital tools allow artists to perform directly to the audience? The nostalgia for a time of “monoculture” speaks to a yearning for a closed, stratified world. The world where my teacher grew up and worked was a world in which someone with big dreams could imagine himself as simply undiscovered, an unrecognized talent whose gifts would never benefit society. There is some security, a perverse comfort, in such a dynamic. A person never has to confront the idea that maybe he has no talent, maybe his music is not good enough, maybe what he does is something no one finds interesting, perhaps there is no market for him to fill.

The breakup of the “monoculture” has forced average Joe dreamer to confront these possibilities. Instead of only playing and dreaming in his garage, he can now release his own albums on iTunes and Prime Music, upload videos to YouTube and Daily Motion; he can have his own website and create his own publicity. He can wait to see if his work is accepted and if there is an audience for it. The Yun family has argued that the process of exposure and competition is cruel, that it breaks up human contact, that it consigns the vast majority who desire to be part of the “culture” to being part of the “heartbroken masses.” But the real question is: How is average Joe dreamer any better off under the old system? Isn’t a situation in which he at least has a chance to be seen, to make it big, better than one in which he is simply locked out? 

Contempt for capitalism

Scarcity is merely contempt for that which is easily obtained.

                                                                                               ~ Seneca (attributed)[1]

While I am aware that many of the NOL writers do not agree with Jordan Peterson, I think he is correct when he says that many of the contemporary problems faced by society are the product of prosperity. The essence of his argument is that, in an evolutionary sense, mankind has two sets of priorities: 1) the necessities of life and 2) social standing attached to life meaning. For the first time in history, we are living at a point where the necessities of physically staying alive – housing, food, medical care, and the like – are obtainable, and now people are free to fixate on the second set of priorities. The point where I personally diverge from Peterson and those of a traditional “conservative” inclination is that I do not think it is incumbent upon broader society to provide a sense of meaning or status for others.

In an interview with Christina Hoff Sommers and Danielle Crittenden of AEI for their Femsplainers podcast, Peterson illustrated his argument with an anecdote from his childhood on the Canadian plains. He described how, despite college being free in Canada at the time, he was the only one from his high school class to go on to tertiary education. Some, but not all, of the cause, he said, was ignorance. He observed that much of the reasoning related to an inability to delay gratification. At the time, his hometown was an oil center flush with money from the boom of the 1980s. Consequently, going to work immediately and making a lot of money was more attractive to high school seniors than attending university. For the next decade this strategy worked. But, as Peterson recounted, when he was in the final year of his PhD, two things happened: 1) the wells supporting the town ran dry and 2) the oil bubble burst.

Unsurprisingly, what was a blip on the radar and a minor hiccup in the global and historical scope was disastrous for Peterson’s hometown. What became apparent in the fallout was a softening of character and mind. Counterintuitively, men who had no problem rising at 4:00 AM and working sixteen hours in subzero temperatures proved fragile in the face of greater adversity. Having manufactured an identity that emphasized the pointlessness of skill acquisition while placing great importance upon high income, these beneficiaries of the oil bubble were devastated when their incomes disappeared and their lack of relevant skills left them unemployable and consequently without status. In the following decades, the town became one of many of the rustbelt variety, plagued by unemployment, substance abuse, and apathy. It was this sequence, Peterson said, that caused him to start thinking in terms of the evolutionary stages of development. For us, the most relevant takeaway is the limitation of money in conferring status.

As Alvin Saunders Johnson (1874 – 1971), lamenting the loss of the capitalist mentality among the American people, wrote in his article “Influences Affecting the Development of Thrift” (1907):

It is obvious that the more closely society is knit together, the greater the influence of social standards upon economic conduct. The solitary frontiersman shapes his economic conduct with very little regard for the opinion of other frontiersmen with whom he may occasionally associate. The one rich man in a community of poor men is scarcely at all under the dominance of social economic standards.

Johnson ultimately concluded that increased status anxiety or social dissatisfaction was, however counterintuitively, a sign of progress. After all, the real life of a frontiersman was quite different from the one romanticized in Zane Grey novels. Therefore, assuming that no one really wants to live a subsistence existence, a world in which people are materially comfortable, even if subjected to social pressures, is inherently better.

The concerns induced by social pressure are external to material wellbeing, since a positive environment for the latter ensures that people can live above subsistence without difficulty. Social pressure is hardly objective, and because of its variable nature, symbolized in Johnson’s essay by the “suburbanite touring car,” which was both unproductive and quickly superseded by newer vehicles, sweeping claims about feelings of social pressure must be denied validity. Johnson presented the thirst for status symbols, which he observed in 1907, as both positive and negative: positive because it indicated greater prosperity; negative because it created a different set of societal problems. As society became increasingly removed from subsistence existence, the human cycle of craving and acquisition, which in a hunter-gatherer environment meant surviving the winter, became an obstacle rather than a necessary tool. The contemporary extension of Johnson’s line of reasoning corresponds to Peterson’s assertion that broader society’s fixation on status is a natural consequence of achieving a level of prosperity sufficient to address everyone’s basic needs.

While the topic of interest to Johnson was thrift and the capitalist mentality, or rather the lack of it, on the part of the American people, his essay contained an interesting and pertinent observation regarding land practices. Remembering that he wrote this particular essay in 1907, he remarked on an agricultural real estate bubble then existing in the Midwest, his native region. He thought it important because it demonstrated two points in the spendthrift, status-seeking versus (genuine) capitalist paradigm. Property prices and valuations had risen, he argued, past the productive potential, and therefore the market value, of the real estate in question. This rise was driven, he said, by the farmers themselves, who spent lavishly on land because of an association of ownership with status, even if they were then unable to use the land effectively and therefore unable to recoup the investment. He predicted that land purchased under these circumstances was not a good investment. Further, he added that when the bubble collapsed, not only would there be monetary losses, there would also be ego and identity crises, as farmers who saw the value of their holdings diminish would feel as though they had lost social standing, even if they didn’t lose a single acre. And since, as Johnson observed, many of the farmers he knew, and upon whom he based his hypotheses, had repeatedly mortgaged their current holdings in order to buy additional pieces, the economist predicted property forfeiture as well.

A prototype Austrian economist, though born and raised on the Nebraska plains, Johnson made no moral judgements about the acquisitive instinct or the needless purchase of property. The pettiness exemplified by Thorstein Veblen (1857 – 1929) and his claim that curved driveways were a pointless display of wealth and status because they used space that might be put to better use (The Theory of the Leisure Class, 1899) was simply not a factor in Johnson’s works. Johnson’s conclusion on improper land acquisition and use was based purely on dollars and cents: for a farmer, unprofitable land was not an investment, and no amount of wishful thinking would make it into one. Johnson’s predictions of a property-price collapse and hurt egos came to pass in the 1920s with the agricultural depression, which was in full swing by 1924; the bursting of the bubble was possibly delayed by World War I. The Dust Bowl and Great Depression of John Steinbeck’s The Grapes of Wrath were an epilogue to a situation that had begun decades before.

In 1922, shortly before the agricultural bubble burst spectacularly, Johnson wrote another article, “The Promotion of Thrift in America,” in which he warned, again, that the American land-owning farmer was overleveraged. He also identified a tenant farmer class created by lending policies that encouraged current owners to become overleveraged through easy mortgages, thereby inflating property prices, while preventing first-time buyers from entering the market. The dispute between NIMBY and YIMBY is hardly new. To further exacerbate the situation, the agricultural sector had begun to receive federal subsidies as part of the war effort. While Johnson didn’t mention this in his article, the subsidies further distorted the connected real estate market, as they made farming appear more profitable than it was. Calvin Coolidge, who was desperately trying to salvage the American economy and took a firm anti-subsidy, anti-price-control approach, responded to criticism of his refusal to subsidize agriculture by saying, “well, farmers never have made much money.” Johnson’s predictions from 1907 came true, but Coolidge’s 1926 decision on subsidies received the blame.

What relevance, one might reasonably ask, does a set of observations from 1907 on the investment attitudes of primarily Midwestern American farmers have today? Why should these observations be combined with comments on status, work, and society made by a controversial academic? The reason is best exemplified by Sen. Marco Rubio’s article in the National Review titled “The case for common good capitalism,” an apologia for soft socialism swaddled in pseudo-religious sanctimony culled piecemeal from various Catholic encyclicals. Since Kevin Williamson has already dissected the article and its flaws, there is no need to rebut Rubio at this juncture. The overarching thesis of Rubio’s piece is that society has robbed the working man of dignity. The logic is tangled, to say the least, and it crescendos to the conclusion that a shadowy, disembodied society has mugged the working man and taken his dignity (or sense of status). The basic equations appear to be as follows:

  • burst property bubble = conspiracy to deny ownership, at least dignified ownership, to average Joe;
  • loss of jobs for unskilled labor = denial of the right to work;
  • lack of workforce-owned corporations = some vague travesty tied to the denial of ownership.

Regarding this last, Johnson in “The Promotion of Thrift in America” specifically mentioned such corporate arrangements as complete failures: they created hostility between employers and employees and in fact reduced employee loyalty, since the latter believed such arrangements were intended to defraud them of their wages.

In his two articles, Johnson frankly argued that only the middle classes and up practiced thrift and engaged in capital acquisition and investment. He dedicated most of “The Promotion of Thrift in America” to expounding upon the ways that the “wage earners” (polite speak for the working class back in the 1920s), farmers, and the lower-middle class not only refused to pursue capitalist behavior but would respond with active hostility when thrift campaigners (yes, there was a save-and-be-a-capitalist campaign in the aftermath of World War I, right up there with the temperance movement![2]) suggested that they should. The crux of the issue, one which Rubio’s article refuses to recognize, is that, as in the 1920s, the roots of broader societal complaints lie decades in the past. Efforts to create a quick fix are therefore both futile and infantile. Every couple of decades a specific subgroup, be it the overleveraged farmers of the early twentieth century or the unskilled oil workers of Peterson’s youth, discovers that its values are defective and that the signs its members believed to be markers of status are liabilities. Eventually the price of the decisions built on these misplaced values and symbols must be paid. For the payment to occur in a way that does not unjustly burden the rest of society, we must recognize that the scarcity experienced by the indebted subgroup lies more in their contempt for the genuine capitalist way of life than in any wrong society has inflicted upon them.

[1] Despite attribution to Seneca, I have been unable to find this aphorism among his works currently in my library, and I would be very appreciative if someone could tell me in which work he wrote this phrase (if indeed he did).

[2] It is interesting which of the two movements gained enough political clout to have its agenda inscribed in the Constitution.

Perspective and riches

Sometimes working in the arts can be quite disorienting, especially in terms of what comes out of the mouths of colleagues. For example, a close friend was in rehearsal and an ensemble member, having spent the first hour staring at her, suddenly demanded: 

“Are those real diamonds [pointing at a simple crystal strand bought at H&M]?” 

“What?! These?! No.”

“Oh, okay. I was trying to figure out how rich you are.” 

There were so many things wrong with this actual exchange that it is hard to know where to start. The main, collective reaction was: “Who openly admits to sitting there thinking things like that?” The episode embarrassed everyone except the person who asked the offensive question. Aside from its immediate disruptive effect, the incident was indicative of a greater socio-cultural problem: a shameless voyeurism that, while not new, has reached fever pitch today.

While one could easily say that reality TV and Instagram are primary causes, there are plenty of examples which predate these media, most memorably Gustave Flaubert’s Madame Bovary and its prescient view of tabloid and celebrity culture. What is new, though, is the idea that the envious and their curiosity have any legitimacy. We have come from Flaubert’s view that Emma Bovary was a colossal idiot to articles published by the BBC lamenting “invisible poverty.” The BBC writer’s examples of “invisible poverty” were an inability to afford “posh coffee,” a qualifier he declined to define, and neighbors wondering whether a “nice car” was bought on auto loan or owned outright. As with the question about the diamonds, such matters should be outside the concern of others; to think them appropriate, or even a valid source of social strife, is disgusting and disturbing.

In his book Down and Out in Paris and London, George Orwell complained about being sent to Eton, where he spent his school years feeling as though everyone around him had more material wealth. The essence of his lament was that he wished his parents had sent him to a small grammar school where he could have been the richest student. He also claimed, in a wild generalization, that his feelings on the matter were universal throughout the British upper-middle class. Further, he said that it was his time in secondary school, not, as commonly claimed, his time as a civil servant, which fueled his turn toward Marxism, following the traditional logic of grabbers: “they have so much and therefore can spare some for me.”

The most baffling part for Orwell was the way the upper-middle class, which included his family, was willing to move to far-flung corners of the globe and live in conditions the lowest British laborer would not accept, in exchange for educational opportunity for their children and a high-status, reasonably wealthy retirement for themselves. For a comprehensive analysis of this phenomenon of self-sacrifice, its role in the development of capitalism, and why the upper- and upper-middle classes were the only ones willing to make such exchanges, see Niall Ferguson’s Colossus.

It is important today to become more critical of complaints about society and of anecdotes presented as proof of unfair societal mechanisms that prevent social mobility. An example of why we must be careful is a recent article written for The Guardian by a Cambridge undergraduate who, identifying as working class and having many problems along those lines, cited as her biggest complaint the Cambridge Winter Ball. Her problem was not that she had been unable to attend, but that she had had to work for an hour in order to get into the Ball for free. This is a questionable example of social immobility. Her complaint about the Ball was that there were others who could pay the £100 entrance fee upfront. From this, she assumed a level of privilege that might not necessarily exist, i.e. “the other students could part with 100 pounds.”

Another example of failing to understand the availability of resources and extrapolating a false conclusion of social immobility is the Columbia University FLiP (First-generation, Low-income People) Facebook page, which was, through 2018, the group’s primary platform. In response to Columbia University’s study of its first-generation, low-income population, many of the complaints related to books and the libraries: FLiP students didn’t know that books were available in the library, and so they had purchased study materials while their “wealthier” peers simply borrowed the library copy or spent the necessary hours working in the library. The complaint does not hold up, given that Columbia runs an immersive orientation in which new students are taken into the libraries and shown the basics of the book search system, card operations, checkout procedure, etc. In response to the publicity surrounding the FLiP drive,[1] the university opened a special library for these students where there is no official checkout; all loans are on the honor system. On a hilarious side note, in the Middle Ages libraries chained books to lecterns to keep students from walking away with them.

While we may have moved away from a society that encouraged living modestly so as not to arouse the envy of one’s neighbors, we now live in a culture in which our neighbors’ jealousy is all too easily aroused. Chaos is the natural resting state of existence, but people have lost the ability to construct order for themselves out of it. It is possible to argue that modern people have not been taught to do so; after all, no one comes into the world knowing the underlying skills that are the foundation of the “invisible poor” complaints, e.g. social interaction, sartorial taste, self-sacrifice, etc. To tell the truth, mankind’s natural state is closer to that of the medieval students whose covetous inclinations necessitated the chaining of library books. On the one hand, we have progressed tremendously past such behavior and in doing so created order from chaos; on the other hand, the external signs of progress are now under fire as symbols of privilege. Chillingly, this anti-civilization narrative, because that is ultimately what it is, is being incorporated into an anti-capitalist agenda through the conflation of “civilized” with “privileged,” which in turn is conflated with “rich.”

[1] It is also revealing that the sign-off for these people while the drive lasted was FLiP [school name]. Yes, one must wonder if even the acronym was picked for its stunning vulgarity.

Early 20th century socio-economic commentary: history in the making

Several years ago, I used to watch the television show Bones. The only quote I remember from the show is surprisingly pithy given its origins. Regarding a serial killer the team had finally tracked down and neutralized, the resident psychologist, Dr. Lance Sweets, says: “I was right. He was nobody – angry at history for ignoring him.” Contemplating the second part of the quote, one realizes that the potentially histrionic line holds some alarming applicability today.

Tom Palmer wrote a magnificent article, “The Terrifying Rise of Authoritarian Populism,” in which he examined the way failed individuals and communities turn to a collective identity to bolster their self-esteem, which in turn creates a dynamic conducive to populist ideologies of all stripes. The pressing question is: Why does the majority feel entitled to dictate to the minority, in a form of mob rule wrapped in the husk of democracy? To understand, though never to solve, this question in America, the one country whose founders openly designed it specifically to avoid tyranny, both of the majority and of the minority, one must look to a mixture of factors.

In Ayn Rand’s Atlas Shrugged, there is a snippet of story about a history professor “who couldn’t get a job because he taught that the inhabitants of slums were not the men who made this country.” This is quite literally true: none of the Founding Fathers came from insignificance, outside of Alexander Hamilton, the illegitimate son of a Scottish gentleman who was rather blatantly waiting around for a woman of rank to become available and who didn’t leave his son any of his extensive property.[1] Given that Hamilton’s early promise belied his later invention of the early federal reserve and his apologetics for tariffs, the suspicion of historians – and of Hamilton’s own peers, if private letters among Jefferson, the Adamses, and others are to be believed – that he harbored some bitterness toward the propertied class on the basis of his childhood is justifiable. Benjamin Franklin was very proud of the fact that he made his own fortune, having parted acrimoniously with his solidly middle-class extended family. To be fair, Franklin never claimed to be “self-made,” just to have had to be self-reliant at an unusually young age for a man of his class. There is much to be admired in Franklin’s rigidly honest self-definition, especially today. To return to the quote from Rand, the idea expressed was not a comment upon the literal Founding Fathers but upon the building of identities and the falsity contained therein.

The visualization graphic linked from FEE clearly shows the extent to which incomes have increased over the years. The discontent connected with and displayed through dramatic claims about the “shrinking middle class,” “stagnant wages,” the “1 percent,” etc. was predicted in 1907 by the economist Alvin Saunders Johnson (1874 – 1971) in his study “Influences Affecting the Development of Thrift.” He started with the question:

If it is proposed, through legislation, to liberate a given social class from some of the uncertainties and hardships of the laissez-faire regime, one of the first questions to be raised is: “What will be the effects upon the habits of saving of the class concerned?” 

After laying out in great detail why redistributive policies were bound to fail fiscally and socially, Johnson took direct aim at what he perceived to be the source of the problem:

To-day the working class is rising into an autonomous position. The workingman of to-day repudiates the term “the lower classes.” His position is not the same as that of the property owner, but it is not in his opinion inferior. It follows that any line of conduct rising normally out of his position as a wage earner will be held in honor by him. It is pertinent, therefore, to inquire what attitude toward thrift the exigencies of his situation lead him to adopt. 

It is no part of the workingman’s view of progress that each individual should become the owner of a capital whose earnings may supplement those of his labor. No such supplementary income should, in the laborer’s view, be necessary; and the workman who endeavors to secure it for himself, instead of bending his efforts to the winning of better conditions for labor in general, is likely to be blamed for selfishness rather than praised for self-restraint. […]

Light-handed spending in time of prosperity, mutual aid in time of distress – such appears to be the approved conduct of a permanent body of property-less laborers. And if this is true, we may be quite certain that such practices will in the end be idealized, and that middle-class schemes of cultivating thrift among the working classes will meet with increasing resistance. Already it is easy to find bodies of intelligent workmen who express the greatest contempt for the fellow workman who is “salting down” a part of his earnings.

All of these factors, predicted Johnson, would lead to increased inequality, social and financial, and to anger with the socio-economic system. The inequality would stem not from literal economic inequality but from the loss of the “laborers’” ability to discern genuine investment, in self, family, and business, from mere consumption, leading to a knowledge and know-how gap.

At the time Johnson wrote his study, the Progressive movement and its acolytes were running rampant in the US, promoting what we would today call a “soft” socialist state, and the campaigners were enjoying unusual popularity in response to an agricultural bubble, driven by subsidies that inflated land prices, and a more general move toward socialism among urban workers. While the prototype socialists blamed consumption, eagerly adopting the vocabulary of Thorstein Veblen’s The Theory of the Leisure Class (1899), Johnson rejected the idea completely:

“Conspicuous consumption” is a proof of economic success, and wherever it is the most telling proof, the standard of economic success is likely to be a standard of consumption. But wherever economic success is better displayed in some other way, as for example by increase in one’s visible assets or productive equipment, the standard of consumption exercises little influence upon economic conduct. A standard of conspicuous possession or of productive power takes its place.[2]

Instead, the root problem was a mass loss of will to be capitalists and to engage in and with the capitalist system. This in turn stemmed from a desire for dignity, a pursuit doomed to failure because it was built not on the dignity of work and the worthiness of independence but upon class identity. Exacerbating the situation, as Tom Palmer explored, is the fact that this identity is collective, which fits with a rejection of capitalist pursuit because entrepreneurship is inherently a singular, individual effort.  

Today, we are facing the consequences of the rejection of what Margaret Thatcher called “the strenuous life of liberty and enterprise.” Those who embrace this lifestyle ideal are the ones who have made and continue to determine history. While they may be mappable as a network or a general type of group, all of their achievements lie outside a collective identity. Any set of people can be distilled down to a select set of characteristics that give the impression of a collective unity; for example, one can make a blanket statement along the lines of “the majority of tech billionaires are Ivy Plus dropouts” which would be true in a literal sense and false in its reductionist view. 

The collective view of the social peer must of necessity fail. This is what Johnson meant when he mentioned the derision directed by workingmen at those of their fellows who stepped outside a collective concept of “place” and tried to become capitalists through saving. The policing of the peer in America has failed miserably, as Palmer described when he wrote of individuals seeking solace in the notion that their community is successful even if they are not. The illogic of this position escapes them: it is impossible for a community of individual underachievers to become successful merely by combining into a collective. History shows many times over that such a situation only multiplies failure. And it is the inexorability of history – though not, heeding Karl Popper’s admonition, historicism – that is the source of the anger today. The collective from the slums does not make history, and those who make up the collective are now angry at history for ignoring them.

[1] To be fair to Hamilton Sr., not much is known about the circumstances of his estate. It is perfectly possible that it was entailed and therefore could not be bequeathed at will. Hamilton Sr. did pay for an elite education, by Caribbean-colony standards, and funded his son’s early ventures in New York City. It is also good proof that the bank of mom and dad is NOT a Millennial invention.

[2] Johnson was an American economist who deserves greater recognition. He grew up on the Midwestern plains, and in those fairly isolated circumstances he articulated a theory of economics that he later recognized as part of the Austrian School. He co-founded The New School in NYC and was single-handedly responsible for the university becoming a home to Austrian and other central European scholars forced to flee the Nazis.