When hard work doesn’t equal productive work

In March 2020, David Rubenstein gave an interview in which he lamented the vanishing of a system in which “hard work” guarantees success. While the source of the nostalgia is understandable, there is an epistemological problem with the conjoined assumptions underlying the concept of hard work and what a “system” promises, i.e. if one works hard, then one becomes successful. The issue appears to be one of qualifying and quantifying “hard work.”

My previous article “An aspirational paradox” mentions Abigail Fisher and her failed lawsuit against University of Texas – Austin over her non-acceptance to the institution. The case was a painful example of the disillusionment which must follow when believers in the exceptionality of the commonplace are finally made aware of its mediocrity. The Fisher saga represents the modern tragedy of familial ambition: a child’s parents place her on a systemic path, promised by wise public-school teachers and caring guidance counselors to lead to success, only to discover that the end is the furnace of Moloch. Caveat emptor.

The strange, disembodied entity called “the system” doesn’t fail; what fails is individual and collective concepts of what the system is and what it requires. Mankind has a capacity for filling a void of ignorance with figments of its imagination. In general, such practice is harmless. But when a person believes his own creation and builds his future upon it, that is when the ‘systemic failure’ narrative begins.

Drawing again from my own encounters: for many years I knew a music teacher who believed that one must never listen to repertoire. Yes, you read that correctly: a teacher of an aural art form believed that listening to music is detrimental. He had many long, pseudo-pedagogical explanations for this peculiar belief. His idea was atypical, to put it mildly. Professors at the world’s top conservatories and musicians from major ensembles all emphasize listening as a crucial part of study. Listening as a formal component of music study dates to the invention and mass distribution of the phonograph in the early 20th century; even further back, students attended live concerts.

This teacher had a pedagogical system built around his beliefs, which included that students should learn neither basic keyboard skills nor how to play with accompaniment. Unsurprisingly, students who adhered to his system didn’t progress very well. Problems ranged from poor intonation and lack of ensemble skills to arriving at college auditions with no grasp of appropriate repertoire. Feedback from competitions was kind but completely honest. The more his students failed, the more obstinately he insisted that political maneuverings or class biases were to blame. “The system,” by which he meant auditions, was “broken,” designed not to give people a “fair chance.”

Sadly, this man affected a large number of students, many of whom worked hard – practicing long hours, racking up credits, participating in multiple ensembles – only to discover that their “system” was a fraud. All of their hard work was for naught.

There was one particularly heartbreaking case of a young woman who applied to a fairly prominent private university. By her own account, her audition was catastrophic. In the lead-up to the audition, she did her best to ensure success: she had two lessons a week, increased her daily practice time by an hour, and played along to background recordings. The amount of work she did, measured in terms of effort and time spent, was brutal. But she didn’t pass the audition and was understandably devastated.

A system she had followed religiously since fourth grade had failed her; moreover, her hard work was guaranteed to fail. There was no way for her to succeed based upon her training. In some ways, this girl’s story parallels Abigail Fisher’s history. For years both put in hours of effort only to discover that they had misjudged and misplaced their energies. Bluntly, these young women worked hard but not strategically.

The failure of these girls was unrelated to the broader “system,” whether that system was auditions or college applications. To argue that “the system” is broken because hard work went unrewarded is irrational, albeit emotionally understandable. Before rushing off to denounce “the system” for not rewarding hard work, one should critically examine the foundational premise and ask: Was this hard work, or was it productive work?

An aspirational paradox

In general, contemporary society disparages people who are intensely focused on their careers, labelling them “careerists.” No real objections are presented; instead there is simply a vague, simmering contempt for “careerists.” When an anti-careerist manages to articulate an objection, it is usually couched as a social justice problem: careerists cause unfair societies by getting ahead of everyone else. Yet when practiced appropriately, i.e. as looking after one’s own best interests, “careerist” mentalities and behaviors, such as discipline, planning, and diligence, are necessary for prosperity, both personal and societal.

During the early stages, the opposite of careerism isn’t drifting: it’s aspirational behavior. An illustrative anecdote: a pre-med major in my year repeatedly failed part of her science requirements, mostly due to partying. Despite her inadequate academic record in her major field, she applied to Harvard Medical in her final year. Unsurprisingly, she wasn’t accepted. As a mutual acquaintance said: “one’s going up against people who’ve been working to get into Harvard Medical since middle school; they [Harvard] don’t need someone like her.” Although the result was only to be expected, it came as a complete shock to the girl and her parents, since they had all believed that she was destined to attend Harvard Med. To hear the former pre-med student talk today, she wasn’t allowed to go to Harvard Med, as if some external force had denied her a chance.

As a quick explanation for non-American readers: because the US system requires that students take courses outside of their major field, there is a high tolerance for poor marks in general education requirements; the trade-off is that one is expected to earn reasonably high marks in one’s own field in order to advance to the next level. For an institution such as Harvard Med, below-average marks from a pre-med student in a science course would be unacceptable, unless there was a very good reason explained in the application statement. A good reason would be a family tragedy or some life event beyond one’s control, but not partying.

The plain reality is that my friend was right: Harvard Med receives applications from candidates who have shown single-minded focus in pursuit of a goal since age twelve. In comparison, a twenty-three-year-old woman whose transcript screams “unfocused” is not a prize. Even the act of applying to Harvard would count against her, since assessors would conclude that she hadn’t bothered to read the guidance and fit sections, i.e. the pages where expected grades and MCAT scores are listed, on Harvard Med’s website.

The case of Fisher v. University of Texas (heard twice by the Supreme Court, in 2013 and 2016) is an example of the dichotomy between aspiration and careerism. Abigail Fisher applied to University of Texas – Austin but was turned away, as her grades, though above average for her public school, were below the university’s automatic admission standards. The crux of her suit was that UT – Austin had both failed to take her extracurriculars into account and replaced her with an affirmative action candidate. Eventually, the Supreme Court ruled – both times – that although UT – Austin ought to have been more transparent in its admissions process, there was no evidence that the university had discriminated against Fisher.

The aspirational part of this story was that Fisher’s extracurriculars – Habitat for Humanity, a local youth orchestra without a principal position, and miscellaneous charitable activities – were not genuine distinctions in a candidate pool such as that commanded by UT – Austin. I speak from experience: I was in youth orchestras from middle school until college (counting my first training ensembles, from the age of six up). Should an applicant wish to use music involvement as proof of merit, youth ensemble participation is a simple expectation; unless one was a section leader, membership is not a sign of exceptionality. According to Habitat for Humanity’s North America page, the annual number of volunteers is around two million. Volunteering with the charity is generous and worthy, but doing so does not make the volunteer stand out in a candidate pool.

While society can discuss endlessly the merits and demerits of affirmative action, the Fisher case indicates that the policy has taken on a role no one could have predicted: scapegoat. The policy has become an escape hatch for aspirationalists seeking to avoid facing their own inadequacies and lack of preparation or career focus. Instead of blunt conversations about the real reasons a person didn’t qualify for a desirable professional school or first-choice university, aspirational individuals can offload blame onto the policy. One can hardly blame a policy for being made a scapegoat, but one must acknowledge that such use has the potential to be very damaging to the social fabric.

The importance of biography

There is a now-out-of-print children’s book series entitled “Childhood of Famous Americans,” published as a subdivision of the Landmark Books series between 1950 and 1970. When I was between the ages of six and ten, I was fortunate to be able to read almost all of the books, which were, unsurprisingly, the biographies of prominent Americans written for children. Even when I was little, the books were fairly ancient: the most recent subjects they covered were Eleanor and Franklin D. Roosevelt and Albert Einstein. Despite, or even because of, their relative antiquity, these books had a major impact on my own trajectory.

This is not to say the books weren’t flawed; they were. They were often riddled with historical inaccuracies, the quality of writing varied wildly from author to author, and the content could be outright offensive in regard to religion and race. The series as a whole did a very good job of including biographies of Americans from minority groups, but, depending on the subject, author, and time period, the portrayals of other races could be quite insensitive.

The books all followed the Joseph Campbell theory of story to a T, with the result that they were very good stories. A critic might argue that these biographies lionized or apotheosized individuals in an unrealistic way. While such an accusation would be true, the series was titled “Famous Americans,” not “average Joe Americans.” The important trait of these books, though, was that they all shared a common theme: stature was a choice, and one made in childhood or adolescence.

In Campbellian terms, the moment of awakening was almost invariably an episode in which the subject realized that the people surrounding him or her were stupid, fearful, and conventional – Mark Twain being expelled from multiple schools, Abraham Lincoln denied an education by his illiterate father (as I said, not all of the stories were tremendously accurate), Henry Clay fighting for his inheritance rights against his extended family, Jim Thorpe struggling against racial and social prejudice throughout his sporting career.

On a side note, there was a remarkable absence of American fine arts figures in the series. Mark Twain was one of a handful of writers that included Edgar Allan Poe and Louisa May Alcott; I don’t recall that the more sophisticated writers, such as Washington Irving, Henry James, or Edith Wharton, received the honor. One could say that the absence of fine artists was balanced by a near absence of career military men: Dwight D. Eisenhower had a biography, as did George Armstrong Custer (his most uncomplimentary), and Robert E. Lee and Ulysses S. Grant each received a book, but the list went little further. I’m sure there’s room for analysis of the vision of civil society expressed in whom the series’ editors decided to cover.

The “Childhood of Famous Americans” series only rarely had a specific antagonist. Some combination of self-satisfied parents, authority figures attached to a status quo, and parochial small-mindedness served as the villains. The subject’s daily obstacles were educational and cultural mediocrity, societal complacency, intellectually inferior peers, and timorous, incapable mentors who, by extension, weren’t very good at their jobs.

Fundamentally, the goal of the series was to create role models for young readers. The model proposed was complete rejection of (and a little healthy contempt for) existing systems. The unifying theme among all the people selected was the tradition of “rugged individualism” and the idea that progress was due to the action of individuals, not that of their communities (recall, the village inhabitants were invariably shown as small-minded, poor-spirited morons).

Carl J. Schramm argued in his 2006 book The Entrepreneurial Imperative that the “rugged individual” ethos was an American casualty of post-World War II society. Americans turned toward the concept of the “workforce,” with its communal overtones, and away from individual achievement and success. The peak of statist, stagnant communitarianism came in the 1970s, the decade in which the “Childhood of Famous Americans” series also ceased publication.

Both biography and entrepreneurial spirit speak of a path to personal greatness, a way for individuals to emancipate themselves from their origins if they have sufficient will. The loss of biography and of an entrepreneurial ethos indicates an impoverishment of role models. Without role models of individualistic thought or practice, most people lack the originality to conceive of ways of life beyond their current existence. Discontent and feelings of betrayal by “the system,” society, or the status quo are the ultimate result.

Today, we are confronted by the implosion of the post-WWII status quo. To further complicate matters, the majority of the adult population lacks a blueprint for either challenging what remains of the status quo, or for forging a new path. Without the proper role models of individuality, shown in biography, such people are in thrall to the false promise of communitarianism.

Foundering in academia

For the last couple of weeks, I have been reading and re-reading Gerard Klickstein’s book The Musician’s Way: A Guide to Practice, Performance, and Wellness. Klickstein is a musician and professor who has spent much of his teaching career helping other musicians recover from physical injury or overcome psychological issues, such as performance anxiety. He argues that the vast majority of musicians’ problems, physical and psychological, are a result of poor formation at critical stages of development. Reversing problems engendered by “unqualified,” i.e. incompetent, teachers is an overarching theme of the book. Reading Klickstein’s anecdotes, in which many of his students are recent college graduates, one becomes alarmed at the sheer number of incompetent teachers present in “higher education.”

Several summers ago, at a music festival, I sat with an opera singer friend and we assembled her audition book. An audition book is a selection of opera arias which a singer provides to producers during the audition process. My friend and I were deep into research and consideration when another musician, also a singer, joined us. His contribution was to question why we bothered with the book at all.

He went on to reveal that he wasn’t planning on attempting the opera house and festival audition cycle, nor was he considering trying for a choral ensemble. Instead, he was applying for faculty positions at small colleges. He was a recent doctoral graduate of a university which is, overall, relatively famous but not particularly well regarded for its music school. At the time, the three of us were at roughly the same level, and his experience and education were slightly above average for the types of small, regional institutions he was targeting.

Behind his dismissive behavior lay a mentality of minimal effort. Why should he go to the trouble of researching roles, evaluating musical suitability, and learning parts when his résumé would satisfy the expectations of small, provincial colleges? He lacked the vocabulary to explain his vision, but what he described was a sinecure. Before the festival ended, he had secured a full-time position at an institution in a backwater of the American southwest.

One side of the proverbial coin says that the institution was lucky to have him – his background certainly was above anything the college could expect on the basis of its own reputation and musical standing; the other side of the coin says that it is concerning that someone like him could see academia as a safety net. 

Now American colleges have begun to furlough staff. As you can imagine, many of my Facebook friends are people who attended, and are now staff at, small liberal arts colleges and small state universities throughout the country. In the atmosphere of uncertainty, my own FB feed has filled up with people lashing out against a society which, they insist, doesn’t value them. There is an underlying financial element; few can afford to be furloughed. But there is a deeper issue present: a professional inactivity that has pervaded American small liberal arts academia for the last few decades.

In truth, the financial concerns are more a symptom of professional inactivity than evidence of some overarching truth about poor pay for teachers. I recall how one of my Columbia professors told my class never to rely on a single income stream. He would talk about how all breaks are opportunities to be productive. He told us how, when he was starting his career in the 1960s, he deliberately accepted a part-time position, rather than a full-time one, so that he could finish writing his first book. In terms of his career, the book was more important than his job at a small city college because the book paved the way for the big opportunities. Truthfully, where he taught mattered little beyond gaining some official teaching experience, which he could have obtained by teaching a single class. There’s a difference between being professionally active and simply being busy or being employed.

There is a species of person who follows the same MO as the singer from the music festival. Academia is a safety net, and the goal is to rush into a full-time position and sit there for a lifetime. Their attitude is that of a career teacher, not a professor. They lecture and grade; however, there is no professional contribution or creativity on their part. Such people tend to be barren of original thought and to react with hostility to new ideas or concepts. A quick search of academic databases shows that they don’t write articles, they haven’t written books (their theses don’t count), and they don’t write for think tanks or journals. An egregious example is a college professor who writes movie reviews for popular art enthusiast magazines; he has been passing this activity off as “publishing” and “being published” for years.

There is, I know, a perception of a double standard on some level. For example, Kingsley Amis taught English literature at Swansea and later Cambridge. He published comparatively little on the academic side in contrast to some of his peers, and much of his lighter work took the form of reviews, essays, and opinion pieces for newspapers and magazines. But he averaged close to a book a year. Recognized in his own lifetime as a giant of twentieth-century English literature, no one questioned his publication record or his ability to teach the field.

The subtle stagnation at the liberal arts college level has contributed to a culture of belief in talent and luck, rather than in good decision-making and hard – by which I mean calculated and carefully weighed – work. Many people today would classify my Columbia professor’s story as one of privilege and assume a background of wealth that allowed part-time work. In fact, he did not come from a particularly “privileged” background: he simply settled on his priorities, thought ahead, and made his decisions accordingly.

One thing one learns very quickly in the arts is that one must create without expectation of immediate payment. Singers learn arias, instrumentalists study concerti, filmmakers shoot reels, all so that when the moment is right, they can produce a piece that demonstrates ability and wins a commission. One tidbit my professor shared was that he had to write several critically acclaimed books before he began to receive advances for his work. The principle is the same: create first, then receive a reward. A person who works only within the parameters of payment is a drone, and it is unsurprising that such people do not create new works, make discoveries, or have groundbreaking insights. If one considers that American small colleges have populated themselves largely with professional drones, one must reevaluate their worth to education.

A Queens’ Marxist in the Lions’ Court

When I first walked into the conference room, two other girls were already there. One of them caught my eye and with a friendly nod indicated I should take the seat next to her. I did and then observed the girl on the other side of the table.

She was quite striking, well-dressed in the trendiest fashion, and clearly intelligent, but she exuded an agitation and antagonism that clashed with the sleepy serenity of the room and our own quiet desire for friendship. As our other six classmates trickled in, the Girl across the Table never relaxed and though she responded correctly to any friendly overture, she did so with an attitude of suspicion. Puzzled but too preoccupied to give it much thought, I turned my attention to the department chair who was opening orientation.

For the first couple of weeks I was much too in awe of my new surroundings at this Ivy League university to concern myself with anything more than adjusting as quickly as possible. Only one of us had attended an Ivy as an undergraduate, and she was one of the nicest people in the class. Recognizing how intimidating the new environment could be, she went out of her way to demystify the place for us, and with her help we soon realized that the tranquil, yet demanding, atmosphere of the first day was genuine. We were meant to become our best selves, not to compete insanely with each other. About three weeks in, our entering cohort of nine had settled into a social and academic routine with everyone participating in a cordial, collegial manner – everyone except one: the Girl across the Table, hereafter called GatT.

Her hostility from the first day continued unabated, and now we were its direct target. During lunch, if someone suggested a book, she had a snarky putdown, even if seconds later she would be raving about another book by the same author. One evening a group of the classical music lovers took advantage of free tickets from the school to go to the opera. GatT came with us. Stretching our legs at intermission devolved into standing in a circle and listening uncomfortably as GatT made snide comments about how everyone in the lobby was dressed. As we turned to go back in, I heard her mutter something about “bourgeois” under her breath. A light went on in my head: GatT was a Marxist – puzzle solved! The next morning, GatT publicly avowed her Marxist leanings during a seminar discussion.

The mystery of her hostility solved, we moved on with our social lives and pretty much managed to maintain a state of cautious détente with GatT. She made her desire to lead a jacquerie against us fairly clear a couple of times a week. This became funny once a casual lunch conversation revealed that eight of the nine of us had some familiarity with firearms; I commented to the friendly girl from the first day that this particular jacquerie wouldn’t end the way GatT thought. Eventually we became accustomed to her outbursts, and it took one of extraordinary absurdity to elicit any reaction from us. The closest anyone came to snapping at her was the time she claimed that our completing assignments on time was an act of class oppression against her. 

One of the other students was the daughter of two economists who had become ardent free-marketeers after spending their youths as equally ardent Marxists; consequently her grasp of both arguments was comprehensive. After losing a verbal bout with her, GatT refrained from practical arguments and retreated to social commentary. One day during our daily class coffee gathering, she proclaimed that if she had known our school was an Ivy, she would not have applied, in order to show support for the proletariat. As the “discussion” continued, she branded us as privileged elitists. Meanwhile, we quietly drank our cheap coffee and pondered the fellowships that made this our most affordable option.

The remainder of our graduate studies passed in the pattern of endless writing and studying, intense debates on all sorts of topics, excursions to museums and evenings at the theatre or concerts, and of course simply socializing with each other. We tuned out GatT’s insulting nattering and someone always ensured she received an invitation to whatever activity was scheduled. Despite her clear resentment, she usually came. 

In the final term, when the course load was intentionally light to leave room for writing the Masters thesis, GatT disappeared for a few weeks. We learned through her social media that she was participating in anti-austerity protests in Europe and had been sprayed with tear gas during a raucous demonstration. Soon after she returned to school, I ran into her. She told me that she hadn’t started writing her thesis yet; the submission deadline was three weeks away.

I haven’t seen GatT since that last meeting, but the rest of us stay in touch. During a dinner with some of the gang a few months ago we tallied where everyone is now. GatT was the only one we couldn’t account for; because of her propensity for agitating, we suspect she might be locked away in a third-world prison somewhere. We also wonder if she ever managed to complete her thesis.     

Eco’s ‘How to eat ice cream’

A friend recently gifted me a vintage copy of some of Umberto Eco’s essays translated into English. One of the essays, titled “How to eat ice cream,” opens with an anecdote Eco said was based on his childhood. In the story, an ice cream vendor sold regular cones for two cents and ice cream pie cones for four cents. Eco said his parents and grandmother would buy him whichever type he requested, but there was a limit. Young Eco envied the neighbor children who would parade down the street carrying a regular cone in each hand. But whenever he asked for four cents to buy two cones, the adults would flatly refuse and tell him that he could have a pie cone instead. As an adult, he mused:

[…] I realize that those dear and now departed elders were right. Two two-cent cones instead of one at four cents did not signify squandering, economically speaking, but symbolically they surely did. It was for this precise reason that I yearned for them: because two ice creams suggested excess. And this was precisely why they were denied me: because they looked indecent, an insult to poverty, a display of fictitious privilege, a boast of wealth. […] And parents who encouraged this weakness, appropriate to little parvenus, were bringing up their children in the foolish theatre of “I’d like to but I can’t.” They were preparing them to turn up at tourist-class check-in with a fake Gucci bag bought from a street peddler on the beach at Rimini.

The parenting method must have worked, because he became Umberto Eco. What Eco recognized was that his parents had inoculated him against false consumerist behavior. The preventative measures were directed not against the so-called consumerist society but against ostentatious display, the process of “keeping up with the Joneses.”

Around January 2018, there was a meme floating around social media. It said something along the lines of: “Entrepreneur: someone who lives a few years the way most people won’t so that they can spend the rest of their lives living the way most people can’t.” I very belatedly discovered Carl Schramm’s 2006 book The Entrepreneurial Imperative. Schramm identified the 1950s as the time when American society ceased valorizing business ownership and virtuous risk in favor of material security. As part of the “security first” mentality, children and young people were openly discouraged from seeking independence or from being different in a positive way. The world was one of ossification and stagnation, even as the federal government and media pushed a strong Keynesian message of “consume to grow.” On a side note, now that I think about it, Keynesian economics resembles the children’s video game Snake: one guides the snake to food so that it will grow, but eventually it becomes so big that it bites itself – game over.

Even given the massive propaganda effort put into promoting Keynesian theories, scapegoating “consumerism” or “consumerist society” is a form of escapist thought, a dodging of responsibility. Eco spotted the cause-and-effect nature of being a parvenu. The desire for “fictitious privilege” creates a set of priorities that causes one to spend his wherewithal thoughtlessly. In turn, a “boast of wealth” strategy leads to “I’d like to but I can’t” by ensuring that there is no money when real opportunity arrives. The world becomes one of abundant mediocrity as the effort to possess everything spirals.

The Good Life vs. reality

Recently, a former classmate badgered me into accompanying her on a run to the supermarket. As we were checking out, I, as a person who is very dedicated to the principle of self-interest, used a handful of coupons and a discount card to lower my final tally. My companion had a judgmental reaction to the proceedings: she gave me to understand that she never sought discounts or used coupons because to do so was beneath her station. Oddly, she could see no connection between her attitude and her continuous complaints about being short on funds. It was only much later that I connected her attitude at the cash register with her frequent monologues about a “broken society,” a slight fixation on “inequality,” and an overweening sense of entitlement.  

In 1971, NBC produced a sitcom called The Good Life, not to be confused with the British series of the same name. The American series was unsuccessful in comparison to its competition and was canceled after fifteen episodes. I have never seen the show, as NBC has never rerun it or provided a home release. I first heard of The Good Life in a book whose title I have regrettably forgotten (for a long time I thought the book was Gregg Easterbrook’s The Progress Paradox, but now I can’t find any allusion to the tv show in Easterbrook’s book). The author of the forgotten book treated The Good Life as a watershed moment in tv history for its portrayal of the so-called super-rich – the one bit I remember is that the book described the show as “the most luxurious show [in terms of portrayal of lifestyle]” and connected it to a sudden increase in a broad sense of entitled victimhood throughout society. The Good Life was also, apparently, part of creating the environment conducive to the success of the soap opera Dallas (1978 – 1991).

The plot behind The Good Life is that a middle-class couple become exhausted with the pressures of suburban life and maintaining a lifestyle that’s beyond their means. Consequently, the pair decide to scam their way into the household of an industrialist multimillionaire by disguising themselves as a butler and housekeeper. The theme which (apparently) underlay the show was the idea that there is a class of people who live extravagant, exotic lives (the proverbial good life) and therefore can afford to support some sponging malcontents.

One thing that struck me when researching the show was how prescient it was in foretelling themes present in our current socio-political discourse. The two con artists are reasonably successful college graduates who believe that society promised them the good life as a reward for going to college and having careers; however, when the pair see the lifestyle shown in glossy magazines – mansions, tennis courts, Rolls-Royce cars – they feel that society has reneged on its promise. The logic of the show’s premise is that the couple has been pushed by society – that wicked, amorphous “they” – toward a life of deception because there is no other path to riches open to them.

LitHub ran an article titled “How the well-educated and downwardly mobile found socialism.” The article isn’t worth reading, but the title touches on what began as the fictional premise of The Good Life and has become a full-blown, ideologically fraught issue today. What happens when perception of status is overblown and there is no sense of timeframe to temper expectations?

Thinking of the popularity of AOC or Andrew Yang and the manner in which they have successfully tapped into the tropes of “unjust society” and “inequality,” the modern millennial (my own generation) seems to have embraced the premise of The Good Life. The tv show contained a very subtle, and completely subversive, inversion of the moral order: because “society’s promises” were broken, the dishonesty of the protagonists was not immoral. The extension of such reasoning is that the industrialist was obligated to support the swindlers anyway, owing to his greater wealth.

CapX just ran a terrific article by Jethro Elsden, “Jane Austen, the accidental economist,” in response to the new film version of Emma. One of the interesting tidbits the author found was that, in modern terms, Mr Darcy’s £10,000 per annum income is probably equivalent to £60 million today, which would make his wealth around £3 billion. Even then, Elizabeth Darcy had to “make small economies” once she decided to support her sponging sister and feckless brother-in-law. Granted, the economies might have been the result of not telling her husband, but the point remains: no one can support spongers for long.
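As a back-of-the-envelope check (my own arithmetic, not Elsden’s): if annual income I is treated as the return on invested wealth W at a yield r, then W = I/r, and the article’s two figures imply a yield of roughly two percent:

\[ W = \frac{I}{r} = \frac{\pounds 60{,}000{,}000}{0.02} = \pounds 3{,}000{,}000{,}000. \]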

Elsden alluded to the logic of social pressure and the malignant effect it has on Austen’s characters, who feel compelled to engage in an “arms race.” A major reason the swindlers of The Good Life turn to dishonesty is that they feel pressured to look like successful suburban college graduates. The problem, both for Austen’s characters and for the tv show a century and a half later, was that the definition of “success” in relation to appearances was malleable. Rationally, it is ridiculous for the youngish couple of The Good Life to expect to be in the same place financially and socially as their mark, the middle-aged, widowed industrialist whose lifestyle (but not work ethic) they covet.

To return, finally, to the anecdote of my shopping expedition: the episode is an example of a path that begins with frivolous preconceptions and ends with The Good Life on the comic end and the rise of Andrew Yang, Bernie Sanders, or Elizabeth Warren on the serious one. These politicians have located a demographic which has no sense of the progression of time, stages of development, or realistic expectations. A perfect example is my ex-classmate, who has subjected herself to a fantasy regarding her own realistic expectations and now believes that the social contract has been broken. For such a demographic, the emotional trumps the rational. It is easier for its members to believe themselves wronged than to see themselves as victims of their own imaginations.

The veil of nostalgia

In an article for Worth titled “A new wealth gap is growing – attention inequality,” authors Joon Yun and Eric Yun of the Yun Family Foundation, an institute dedicated to “transforming the way people think,” argued that “attention inequality” is exerting a destructive force on society, and they expressed nostalgia for the days of “monoculture.” They defined this idyllic time as one where all attention was focused on one or two people or groups, e.g. the Beatles, and on no one else. The idea expressed by the Yuns is that the new internet world, where everyone may take his best shot at fame, is unfair: a veil that should have stayed in place has been lifted. In the meantime, everyone – the “heart-broken masses,” as they are described – wanders through the selection at will, as customer and fame-seeker alike. The Yuns’ complaint is very similar to a running theme in the works of Michel Houellebecq: the free market of choice has created winners and losers, and in doing so has destroyed the dreams and self-respect of the losers.

Perhaps the question is whether preserving a world of dreams, one in which a person could feel good about himself using the “might have been” fantasy, is an acceptable burden to thrust upon society. After all, in his short story “The secret life of Walter Mitty” (which the Ben Stiller film butchered), satirist James Thurber’s point was that living in dreams replaces action, allowing people to imagine themselves full of unrecognized abilities. Even Thurber’s picture of the type for whom such an existence is necessary was probably accurate: a passive middle-aged man who had missed opportunities in his youth (an implied WWII vet, with both chances at military heroism and benefits to cash in toward starting a business, further education, etc.) and who resents his wife as the cause and personification of the mediocrity of his existence.

But are we better off with the veil of mediocre monoculture lifted? Is the fact that revelation may not be pleasant for those who discover that they are unappealing to the modern market really a justifiable cause for concern? Is the old world of “monoculture” really something to look back upon with nostalgia?

My former composition and counterpoint teacher was also a concert pianist who trained at The Juilliard School. While still a student, my teacher was signed by a major record label. One of the tidbits I learned from him was that back in “those days” (the mid-20th century), practically the only way a young (classical) artist had of obtaining notice was to be at an elite conservatory, since that is where the scouts went almost exclusively.

The MO for finding the “latest new thing” made perfect sense for the time period. A tremendous amount of label investment went (and still goes) into publicity for and grooming of a young artist. Further, in my teacher’s case, the label handled studio and recording expenses, created and booked concert tours, and covered venue costs. The artist did not have to repay the funding; however, total expenses would be deducted from any royalties should he or she become successful. The investment risk meant that going to places where the already-succeeding were clustered was the safest bet for the big labels. There was very little room in the equation for a person who was not already positioned to join the upper professional echelons, or who had no insider access.
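To make the recoupment mechanics concrete (the numbers here are hypothetical, not my teacher’s actual contract terms): if the label fronts total costs C and the artist earns a royalty share s of sales revenue R, the artist sees no royalty income until sR ≥ C, that is, until

\[ R \;\geq\; \frac{C}{s}, \qquad \text{e.g. } \frac{\$200{,}000}{0.15} \approx \$1{,}333{,}000 \text{ in sales.} \]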

Was a situation where the major labels acted as gatekeepers and considered only people who fit a certain profile really better than the current one, where the internet and digital tools allow artists to perform directly for the audience? The nostalgia for a time of “monoculture” speaks to a yearning for a closed, stratified world. The world where my teacher grew up and worked was one in which someone with big dreams could imagine himself as simply undiscovered, an unrecognized talent whose gifts would never benefit society. There is some security, a perverse comfort, in such a dynamic. A person never has to confront the idea that maybe he has no talent, that maybe his music is not good enough, that maybe what he does is something no one finds interesting, that perhaps there is no market for him to fill.

The breakup of the “monoculture” has forced average Joe dreamer to confront these possibilities. Instead of only playing and dreaming in his garage, he can now release his own albums on iTunes and Prime Music, upload videos to YouTube and Dailymotion; he can have his own website and create his own publicity. He can wait to see if his work is accepted and if there is an audience for it. The Yuns have argued that the process of exposure and competition is cruel, that it breaks up human contact, that it consigns the vast majority who desire to be part of the “culture” to the “heartbroken masses.” But the real question is: How was average Joe dreamer any better off under the old system? Isn’t a situation in which he at least has a chance to be seen, to make it big, better than one in which he is simply locked out?

Contempt for capitalism

Scarcity is merely contempt for that which is easily obtained.

~ Seneca (attributed)[1]

While I am aware that many of the NOL writers do not agree with Jordan Peterson, I think he is correct when he says that many of the contemporary problems faced by society are the product of prosperity. The essence of his argument is that, in an evolutionary sense, mankind has two sets of priorities: 1) the necessities of life and 2) social standing and its attached sense of life meaning. For the first time in history, we are living at a point where the necessities of physically staying alive, e.g. housing, food, medical care, are obtainable, and people are now free to fixate on the second set of priorities. The point where I personally diverge from Peterson and those of a traditional “conservative” inclination is that I do not think it is incumbent upon broader society to provide a sense of meaning or status for others.

In an interview with Christina Hoff Sommers and Danielle Crittenden of AEI for their Femsplainers podcast, Peterson illustrated his argument with an anecdote from his childhood on the Canadian plains. He described how, despite college being free in Canada at the time, he was the only one from his high school class to go on to tertiary education. Some, but not all, of the cause, he said, was ignorance. He observed that much of the reasoning related to an inability to delay gratification. At the time, his hometown was an oil center, flush with money from the boom of the 1980s. Consequently, going to work immediately and making a lot of money was more attractive to high school seniors than attending university. For the next decade this strategy worked. But, as Peterson recounted, when he was in the final year of his PhD, two things happened: 1) the wells supporting the town ran dry and 2) the oil bubble burst.

Unsurprisingly, what was a blip on the radar and a minor hiccup in the global and historical scope was disastrous for Peterson’s hometown. What became apparent in the fallout was a softening of character and mind. Counterintuitively, men who had no problem rising at 4:00 AM and working in subzero temperatures for sixteen hours were shown to be fragile in the face of greater adversity. Having manufactured an identity that dismissed skill acquisition as pointless while placing great importance upon high income, these beneficiaries of the oil bubble were devastated when their incomes disappeared and their lack of relevant skills left them unemployable, and consequently without status. In the following decades, the town became one of many of the rustbelt variety, plagued by unemployment, substance abuse, and apathy. It was this sequence that Peterson said caused him to start thinking in terms of the evolutionary stages of development. For us, the most relevant takeaway is the limited ability of money to confer status.

As Alvin Saunders Johnson (1874 – 1971) wrote in his 1907 article “Influences Affecting the Development of Thrift,” in which he lamented the loss of capitalist mentality among the American people,

It is obvious that the more closely society is knit together, the greater the influence of social standards upon economic conduct. The solitary frontiersman shapes his economic conduct with very little regard for the opinion of other frontiersmen with whom he may occasionally associate. The one rich man in a community of poor men is scarcely at all under the dominance of social economic standards.

Johnson ultimately concluded that increased status anxiety or social dissatisfaction was, however counterintuitively, a sign of progress. After all, the real life of a frontiersman was quite different from the one romanticized in Zane Grey books. Therefore, assuming that no one really wants to live a subsistence existence, a world in which people are materially comfortable, even if subjected to social pressures, is inherently better.

The concerns induced by social pressure are external to material wellbeing, since a positive environment for the latter ensures that people can live above subsistence without difficulty. Social pressure is hardly objective, and because of its variable nature – symbolized in Johnson’s essay by a “suburbanite touring car,” which was both unproductive and quickly superseded by newer vehicles – sweeping claims about feelings of social pressure must be denied validity. Johnson presented the thirst for status symbols he observed in 1907 as both positive and negative: positive because it indicated greater prosperity; negative because it created a different set of societal problems. As society became increasingly removed from a subsistence existence, the human cycle of craving and acquisition – which in a hunter-gatherer environment meant surviving winter – developed into an obstacle, rather than a necessary tool. The contemporary extension of Johnson’s line of reasoning correlates with Peterson’s assertion that broader society’s fixation on status is a natural consequence of achieving a level of prosperity sufficient to address everyone’s basic needs.

While the topic of interest to Johnson was thrift and the capitalist mentality, or rather the American people’s lack of it, his essay contained an interesting and pertinent observation regarding land practices. Remembering that he wrote this particular essay in 1907, he remarked on an agricultural real estate bubble existing in the Midwest, his native region. He considered it important because it demonstrated both poles of his paradigm: status-driven spending versus genuinely capitalist thrift. Property prices and valuations had risen, he argued, past the productive potential, and therefore past the market value, of the real estate in question. This rise was driven, he said, by the farmers themselves, who spent lavishly on land because of an association of ownership with status, even when they were then unable to use the land effectively and therefore unable to recoup the investment. He predicted that land purchased under these circumstances would prove a bad investment. Further, he added that when the bubble collapsed, there would be not only monetary losses but also ego and identity crises, as farmers who saw the value of their holdings diminish would feel they had lost social standing, even if they didn’t lose a single acre. And since many of the farmers he knew, and upon whom he based his hypotheses, had repeatedly mortgaged their current holdings in order to buy additional pieces, the economist predicted property forfeiture as well.
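Johnson’s dollars-and-cents test can be stated as a simple inequality (my formalization of his argument, not his own notation): land priced at P with annual productive income y, financed at a mortgage rate i, destroys wealth whenever its yield falls below the cost of the borrowed money,

\[ \frac{y}{P} \;<\; i, \]

which is precisely the condition created when status-driven bidding pushes P past the land’s productive potential.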

A prototype Austrian economist, though born and raised on the Nebraska plains, Johnson made no moral judgements about the acquisitive instinct or the needless purchase of property. The pettiness exemplified by Thorstein Veblen (1857 – 1929) and his claim that curved driveways were a pointless display of wealth and status because they used space that might be put to better use (The Theory of the Leisure Class, 1899) was simply not a factor in Johnson’s works. Johnson’s conclusion on improper land acquisition and use was based purely on dollars and cents: for a farmer, unprofitable land was not an investment, and no amount of wishful thinking would make it into one. Johnson’s predictions about an impending collapse in property prices, and the hurt egos to follow, came to pass in the 1920s with the agricultural depression, which was in full swing by 1924; the burst of the bubble was possibly delayed by World War I. The Dust Bowl and Great Depression of John Steinbeck’s The Grapes of Wrath were an epilogue to a situation that had begun decades before.

In 1922, shortly before the agricultural bubble burst spectacularly, Johnson wrote another article, “The Promotion of Thrift in America,” in which he warned, again, that the American land-owning farmer was overleveraged. He also identified the existence of a tenant farmer class whose status was due to lending policies which encouraged current owners to become overleveraged through easy mortgages, thereby inflating property prices, while preventing first-time buyers from entering the market. The dispute between NIMBY and YIMBY is hardly new. To exacerbate the situation further, the agricultural sector had begun to receive federal subsidies as part of the war effort. While Johnson didn’t mention this in his article, the subsidies further distorted the connected real estate market, as they made farming appear more profitable than it was. In response to criticism of his refusal to subsidize agriculture in any way, Calvin Coolidge, who took a firm anti-subsidy, anti-price-control approach, said, “Well, farmers never have made much money.” Johnson’s predictions from 1907 came true, but Coolidge’s 1926 decision on subsidies received the blame.

What relevance, one might reasonably ask, does a set of observations from 1907 on the investment attitudes of, primarily Midwestern, American farmers have today? Why should these observations be combined with comments on status, work, and society made by a controversial academic? The reason is best exemplified by Sen. Marco Rubio’s article in the National Review titled “The case for common good capitalism,” an apology for soft socialism swaddled in pseudo-religious sanctimony culled piecemeal from various Catholic encyclicals. Since Kevin Williamson has already dissected the article and its flaws, there is no need to rebut Rubio at this juncture. The overarching thesis of Rubio’s piece is that society has robbed the working man of dignity. The logic is tangled, to say the least, and it crescendos to the conclusion that a shadowy, disembodied society has mugged the working man and taken his dignity (or sense of status). The basic equations appear to be as follows:

  • burst property bubble = conspiracy to deny ownership, at least dignified ownership, to average Joe;
  • loss of jobs for unskilled labor = denial of the right to work;
  • lack of workforce-owned corporations = some vague travesty tied to the denial of ownership.

On this last point, Johnson in “The Promotion of Thrift in America” specifically mentioned such corporate arrangements as complete failures: they created hostility between employers and employees and in fact reduced employee loyalty, since employees believed the arrangements were intended to defraud them of their wages.

In his two articles, Johnson frankly argued that only the middle classes and up practiced thrift and engaged in capital acquisition and investment. He dedicated most of “The Promotion of Thrift in America” to expounding upon the ways that the “wage earners” (polite speak for the working class back in the 1920s), farmers, and the lower-middle class not only refused to pursue capitalist behavior but would respond with active hostility when thrift campaigners (yes, there was a save-and-be-a-capitalist campaign in the aftermath of World War I, right up there with the temperance movement![2]) suggested that they should. The crux of the issue, one which Rubio’s article refuses to recognize, is that, as in the 1920s, the roots of broader societal complaints lie decades in the past. Efforts to create a quick fix are therefore both futile and infantile. Every couple of decades a specific subgroup, be it the overleveraged farmers of the early twentieth century or the unskilled oil workers of Peterson’s youth, discovers that its values are defective and that the signs its members believed to be markers of status are liabilities. Eventually the price of the decisions built on these misplaced values and symbols must be paid. In order for the payment to occur in a way that does not unjustly burden the rest of society, we must recognize that the scarcity experienced by the indebted subgroup lies more in their contempt for the genuine capitalist way of life than in any wrong society has inflicted upon them.


[1] Despite attribution to Seneca, I have been unable to find this aphorism among his works currently in my library, and I would be very appreciative if someone could tell me in which work he wrote this phrase (if indeed he did).

[2] Interesting which of the two movements gained enough political clout to have its agenda inscribed in the Constitution.  

Perspective and riches

Sometimes working in the arts can be quite disorienting, especially in terms of what comes out of the mouths of colleagues. For example, a close friend was in rehearsal and an ensemble member, having spent the first hour staring at her, suddenly demanded: 

“Are those real diamonds [pointing at a simple crystal strand bought at H&M]?” 

“What?! These?! No.”

“Oh, okay. I was trying to figure out how rich you are.” 

There were so many things wrong with this actual exchange that it is hard to know where to start. The main, collective reaction was: “Who openly admits to sitting there thinking things like that?” The episode embarrassed everyone except the person who asked the offensive question. Aside from its immediate disruptive effect, the incident was indicative of a greater socio-cultural problem: a shameless voyeurism that, while not new, has reached fever pitch today.

While one could easily say that reality TV and Instagram are primary causes, there are plenty of examples which predate these media, most memorably Gustave Flaubert’s Madame Bovary and its prescient view of tabloid and celebrity culture. What is new, though, is the idea that the envious and their curiosity have any legitimacy. We have come from Flaubert’s view that Emma Bovary was a colossal idiot to articles published by the BBC lamenting “invisible poverty.” The BBC writer’s examples of “invisible poverty” were an inability to afford “posh coffee,” a qualifier which he declined to define, and neighbors wondering whether a “nice car” was bought on an auto loan or owned outright. As with the question about the diamonds, not only should such matters be outside the concern of others; to think them appropriate, or even a valid source of social strife, is disgusting and disturbing.

In The Road to Wigan Pier, George Orwell complained about being sent to Eton, where he spent his school years feeling as though everyone around him had more material wealth. The essence of his lament was that he wished his parents had sent him to a small grammar school where he could have been the richest student. He also claimed, in a wild generalization, that his feelings on the matter were universal throughout the British upper-middle class. Further, he said that it was his time in secondary school, not, as commonly claimed, his time as an imperial policeman in Burma, which fueled his turn toward Marxism, following the traditional logic of grabbers: “they have so much and therefore can spare some for me.”

The most baffling part for Orwell was the way that the upper-middle class, which included his family, was willing to move to far-flung corners of the globe and live in conditions the lowest British laborer would not accept, in exchange for educational opportunity for their children and a high-status, reasonably wealthy retirement for themselves. For a comprehensive analysis of this phenomenon of self-sacrifice, its role in the development of capitalism, and why the upper and upper-middle classes were the only ones willing to make such exchanges, see Niall Ferguson’s Colossus.

It is important today for us to become more critical regarding complaints about society and anecdotes presented as proof of unfair societal mechanisms that prevent social mobility. An example of why we must be careful is a recent article written for The Guardian by a Cambridge undergraduate who, identifying as working class and citing many problems along those lines, named as her biggest complaint the Cambridge Winter Ball. Her problem was not that she had been unable to attend, but that she had had to work for an hour in order to get into the Ball for free. This is a questionable example of social immobility. Her complaint about the Ball was that there were others who could pay the £100 entrance fee upfront. From this, she assumed a level of privilege that might not necessarily exist, i.e. that the other students could easily part with £100.

Another example of failing to understand the availability of resources and extrapolating a false conclusion of social immobility is the Columbia University FLiP (First-generation, Low-income People) Facebook page, which was, through 2018, the group’s primary platform. In response to Columbia University’s study of its first-generation, low-income population, many of the complaints related to books and the libraries. FLiP students didn’t know that books were available in the library, and so they had purchased study materials while their “wealthier” peers simply borrowed the library copy or spent the necessary number of hours in the library working. The complaint does not hold up, however, given that Columbia runs an immersive orientation in which new students are taken into the libraries and shown the basics of the book search system, card operations, checkout procedure, etc. In response to the publicity surrounding the FLiP drive,[1] the university opened a special library for these students where there is no official checkout; all loans are on the honor system. On a hilarious side note, in the Middle Ages libraries would chain books to lecterns to keep students from walking away with them.

While we may have moved away from a society that encouraged living modestly to avoid arousing the envy of one’s neighbors, we now live in a culture in which our neighbors’ jealousy is too easily aroused. Chaos is the natural resting state of existence, but people have lost the ability to construct order for themselves out of it. It is possible to argue that modern people have not been taught to do so; after all, no one comes into the world knowing the underlying skills that are the foundation of the “invisible poor” complaints, e.g. social interactions, sartorial taste, self-sacrifice, etc. To tell the truth, mankind’s natural state is closer to that of the savages of the Middle Ages, whose covetous inclinations necessitated the chaining of library books. On the one hand, we have progressed tremendously past such behavior and in doing so created order from chaos; on the other hand, the external signs of progress are now under fire as symbols of privilege. Chillingly, the anti-civilization narrative, because that is ultimately what it is, is being incorporated into an anti-capitalist agenda through the conflation of “civilized” with “privileged,” which in turn is conflated with “rich.”


[1] It is also revealing that the sign-off for these students, while the drive lasted, was FLiP [school name]. Yes, one must wonder if even the acronym was picked for its stunning vulgarity.

Early 20th century socio-economic commentary: history in the making

Several years ago, I watched the television show Bones. The only quote I remember from the show is surprisingly pithy given its origins. Regarding a serial killer the team had finally tracked down and neutralized, the resident psychologist, Dr. Lance Sweets, says: “I was right. He was nobody – angry at history for ignoring him.” Contemplating the second part of the quote, one realizes that the potentially histrionic line holds some alarming applicability today.

Tom Palmer wrote a magnificent article, “The Terrifying Rise of Authoritarian Populism,” in which he examined the way that failed individuals and communities turn to a collective identity to bolster their self-esteem, which in turn creates a dynamic conducive to populist ideologies of all stripes. The pressing question is: Why does the majority feel entitled to dictate to the minority, in a form of mob rule wrapped in the husk of democracy? In order to understand, though never to solve, this question in America – the one country whose founders openly designed it specifically to avoid tyranny, both of the majority and of the minority – one must look to a mixture of factors.

In Ayn Rand’s Atlas Shrugged, there is a snippet of story about the history professor “who couldn’t get a job because he taught that the inhabitants of slums were not the men who made this country.” Quite literally, none of the Founding Fathers came from insignificance, outside of Alexander Hamilton, the illegitimate son of a Scottish gentleman – a man who was rather blatantly waiting around for a woman of rank to become available and who didn’t leave his son any of his extensive property.[1] Given that Hamilton’s early promise belied his later invention of the early federal reserve and his apologetics for tariffs, the suspicion of historians – and of Hamilton’s own peers, if private letters among Jefferson, the Adamses, and others are to be believed – that he harbored some bitterness toward the propertied class on the basis of his childhood is justifiable. Benjamin Franklin was very proud of the fact that he managed to make his own fortune, having parted acrimoniously with his solidly middle-class extended family. To be fair, Franklin never claimed to be “self-made,” just to have had to be self-reliant at an unusually young age for a man of his class. There is much to be admired in Franklin’s rigidly honest self-definition, especially today. To return to the quote from Rand, the idea expressed was not a comment upon the literal Founding Fathers but rather upon the building of identities and the falsity contained therein.

The visualization graphic linked from FEE clearly shows the extent to which incomes have increased over the years. The discontent expressed through dramatic claims about the “shrinking middle-class,” “stagnant wages,” “the 1 percent,” etc. was predicted in 1907 by economist Alvin Saunders Johnson (1874 – 1971) in his study “Influences Affecting the Development of Thrift.” He began with the question:

If it is proposed, through legislation, to liberate a given social class from some of the uncertainties and hardships of the laissez-faire regime, one of the first questions to be raised is: “What will be the effects upon the habits of saving of the class concerned?” 

After laying out in great detail why redistributive policies were bound to fail fiscally and socially, Johnson took direct aim at what he perceived to be the source of the problem:

To-day the working class is rising into an autonomous position. The workingman of to-day repudiates the term “the lower classes.” His position is not the same as that of the property owner, but it is not in his opinion inferior. It follows that any line of conduct rising normally out of his position as a wage earner will be held in honor by him. It is pertinent, therefore, to inquire what attitude toward thrift the exigencies of his situation lead him to adopt. 

It is no part of the workingman’s view of progress that each individual should become the owner of a capital whose earnings may supplement those of his labor. No such supplementary income should, in the laborer’s view, be necessary; and the workman who endeavors to secure it for himself, instead of bending his efforts to the winning of better conditions for labor in general, is likely to be blamed for selfishness rather than praised for self-restraint. […]

Light-handed spending in time of prosperity, mutual aid in time of distress – such appears to be the approved conduct of a permanent body of property-less laborers. And if this is true, we may be quite certain that such practices will in the end be idealized, and that middle-class schemes of cultivating thrift among the working classes will meet with increasing resistance. Already it is easy to find bodies of intelligent workmen who express the greatest contempt for the fellow workman who is “salting down” a part of his earnings.

All of these factors, predicted Johnson, would lead to increased inequality, social and financial, and anger with the socio-economic system. The inequality would stem not from literal economic inequality but from the loss of the laborers’ ability to discern genuine investment – in self, family, and business – from mere consumption, leading to a knowledge and know-how gap.

At the time Johnson wrote his study, the Progressive movement and its acolytes were running rampant in the US, promoting what we would today call a “soft” socialist state, and the campaigners were experiencing unusual popularity in response to an agriculture bubble, caused by subsidies that inflated land prices, and a more general move toward socialism among urban workers. While the proto-socialists blamed consumption, eagerly adopting the vocabulary of Thorstein Veblen’s The Theory of the Leisure Class (1899), Johnson rejected the idea completely:

“Conspicuous consumption” is a proof of economic success, and wherever it is the most telling proof, the standard of economic success is likely to be a standard of consumption. But wherever economic success is better displayed in some other way, as for example by increase in one’s visible assets or productive equipment, the standard of consumption exercises little influence upon economic conduct. A standard of conspicuous possession or of productive power takes its place.[2]

Instead, the root problem was a mass loss of will to be capitalists and to engage in and with the capitalist system. This in turn stemmed from a desire for dignity, a pursuit doomed to failure because it was built not on the dignity of work and the worthiness of independence but upon class identity. Exacerbating the situation, as Tom Palmer explored, is the fact that this identity is collective, which fits with a rejection of capitalist pursuit because entrepreneurship is inherently a singular, individual effort.  

Today, we are facing the consequences of the rejection of what Margaret Thatcher called “the strenuous life of liberty and enterprise.” Those who embrace this lifestyle ideal are the ones who have made and continue to determine history. While they may be mappable as a network or a general type of group, all of their achievements lie outside a collective identity. Any set of people can be distilled down to a select set of characteristics that give the impression of a collective unity; for example, one can make a blanket statement along the lines of “the majority of tech billionaires are Ivy Plus dropouts” which would be true in a literal sense and false in its reductionist view. 

The collective view of the social peer must fail of necessity. It is what Johnson meant when he mentioned the derision directed by workingmen at those of their fellows who stepped outside a collective concept of “place” and tried to become capitalists through saving. The policing of the peer in America has failed miserably, as Palmer described when he wrote of individuals seeking solace in the notion that their community is successful, even if they are not. The illogic of this position escapes them: it is impossible for a community of individual underachievers to become successful merely by combining into a collective. History shows many times over that such a situation only multiplies failure. And it is the inexorability of history – though not, heeding Karl Popper’s admonition, historicism – that is the source of the anger today. The collective from the slums does not make history, and those who make up the collective are now angry at history for ignoring them.


[1] To be fair to Hamilton Sr., not much is known about the circumstances of his estate. It is perfectly possible that it was entailed and therefore could not be bequeathed at will. Hamilton Sr. did pay for an education that was elite in a Caribbean-colonies context and funded his son’s early ventures in New York City. It is also good proof that the bank of mom and dad is NOT a Millennial invention.

[2] Johnson was an American economist who deserves greater recognition. He grew up on the Midwestern plains, and in these fairly isolated circumstances he articulated a theory of economics which he later recognized as part of the Austrian School. He co-founded The New School in NYC and was single-handedly responsible for the university becoming a home to Austrian and other central European scholars forced to flee the Nazis.

Bourgeois IV: The economics of difference

Economists Abhijit Banerjee and Esther Duflo studied the root causes of flawed decision-making in their book Poor Economics. While much of the book is an applied-economics redux of Ludwig von Mises’ more cerebral Human Action: A Treatise on Economics, several of its points are particularly applicable to an examination of the difference between the bourgeois and the middle-class as defined only by income. The most important is Banerjee and Duflo’s concept of the S-curve. According to this model, social mobility is not a sequence of steps or a diagonal line; it is shaped like the letter S, and each of the curves represents a significant hurdle on the path from the bottom-left edge to the top-right one. The first curve (obstacle) is a love of pointless material display, and the second is a desire for security and stability.
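For readers who want the shape stated precisely: in development economics the S-curve is conventionally drawn as income tomorrow plotted against income today, with the 45-degree line marking “no movement.” What follows is a minimal sketch of that reading; the logistic functional form and the parameter names are my own illustrative assumptions, not equations taken from Poor Economics:

\[
y_{t+1} = f(y_t), \qquad f(y) = \frac{A}{1 + e^{-k\,(y - m)}}
\]

\[
f(y^{\ast}) = y^{\ast} \quad \text{(long-run resting point)}, \qquad \text{stable if } \lvert f'(y^{\ast}) \rvert < 1
\]

For suitable values of A, k, and m, the curve crosses the 45-degree line three times: a stable low equilibrium (the trap), an unstable middle crossing, and a stable high equilibrium. A family starting below the middle crossing drifts back down; one starting above it keeps rising. The two bends of the S sit on either side of that middle crossing, which is why they function as the decisive hurdles.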

In Poor Economics, a crucial part is understanding the role that family plays in the equation. Banerjee and Duflo discovered that, along the S-curve, only the very top and the very bottom had more children than the national average. The parents on both ends were also more solicitous of their children’s educations and futures than those in the middle. In fact, for the middle segment recently elevated from the lower portion, having fewer children had an adverse effect on parental spending on education and opportunity, with the result that the middle became a place of stagnation. The economists explained that, to some extent, in cultures where having children is the retirement plan, middle parents felt as though they had less to spend because they had fewer children. But this did not explain similar gaps in cultures where reliance on grown children was not normative.

Across the board, though, the top and bottom segments expressed the sentiment that they could not afford not to invest in the very best for their children. For the top, the feeling was tied to the understanding that maintaining their position was contingent upon vast investment in the next generation; for the bottom, the only way having an above-average number of children was worth the time and effort was for all of them to become highly successful. In other words, on both ends the prevailing attitude was “can’t afford to fail.” Conversely, those in the middle of the S-curve aspired to security rather than success, and the parents were only willing to spend as much as necessary to obtain that – the ceiling varied among the countries studied, but high school plus a local college was quite common – even if the parents could afford much more and the children were capable of pursuing more. The correlation between more children and more advanced, better-quality education, regardless of official social class, was a shock to the researchers because it defied popular wisdom, which holds that fewer children means more opportunity and better education for them. Based on Banerjee and Duflo’s findings, parental indifference is more or less the root cause of modern “stagnation” and “inequality.”

Given that today there is quite a bit of complaining in the developed world, particularly the US, about the “shrinking middle-class” and the ills, mostly portrayed as economic, associated with it, it is worth considering, based on the data from Poor Economics, that the middle-class is shrinking in a literal demographic sense as well as a social one. The researchers found that it is common for families on the bottom half of the S to have, on average, four to five children (with as many as nine or twelve being usual in cultures with strong intergenerational dependency dynamics) and for those at the top to have between three and four; middle families never had more than two. The image is that of an hourglass, with the “middle-class” perceived as squeezed by virtue of the larger groups on the top and bottom.

In July 2018, Brookings published a study on the “decline in social mobility,” with the surprise twist that the drop came from within the American professional classes rather than from an income-based general category:

As Aparna Mathur and Cody Kallen of AEI wrote in “Poor rich kids?”: “[P]erhaps the most puzzling – and least commented upon – finding is the large positive correlation between the parent’s income and the decline in absolute mobility over the years. Put more simply, the richer the parents, the larger has been the decline in mobility for their kids.”

While the “poor rich kids” phenomenon might be upsetting from the perspective of the American mythos, given the data collected by Banerjee and Duflo it is completely to be expected. The researchers established that middle-S families experience diminishing returns over the course of multiple generations as a direct result of their priorities. For example, in India, one of the main countries studied, government bureaucratic jobs have been the favored, hereditary domain of middle-S families because of their security; but over the last third of the 20th century and into the 21st, these jobs have experienced wage stagnation, saturated markets, and, with the first two, declining social capital – in other words, they have lost social mobility. Yet middle-S families persist in their established behavioral routines. Hence, Banerjee and Duflo diagnosed love of security as the second ill, the (almost) insurmountable “hump” in the quest for social mobility.

According to the Brookings study, the fallen American middle-class has experienced all of these symptoms as well, and the demagogues have certainly been happy to adopt rhetoric claiming a disappearance of the “middle-class.” Although, as the AEI study cited by Brookings cautioned, income is not a particularly good indicator of mobility, there is no doubt that there has been a sharp decrease in perceived well-being among the children of the American “middle-class”:

But the loss of manufacturing jobs cannot explain what happened to the near-rich and the top 1%. Naturally, it may be difficult to surpass highly successful parents, but that does not explain why mobility rates have declined so sharply at the top income levels, especially if wealth and incomes are becoming more concentrated. Moreover, average incomes for the top 1% have remained at about 4 times the median income over these years. Yet, for the 95th percentile, absolute mobility fell from 84 percent for those born in 1940 to 20 percent for those born in 1984. And for those born in 1984, coming from a top 1% family essentially guarantees earning less than one’s parents, with a mobility rate of 1.2 percent.

The reason this is interesting is that it matches Banerjee and Duflo’s findings regarding the middle-S groups of all the countries they studied, which indicates that their research is applicable to developed and underdeveloped countries alike.

While there are some cultural differences that serve to obscure similarities, if one looks at American educational expenditure, for example, one sees that the average middle-American family spends more on semi-educational activities than comparable families in other cultures. Viewed critically, however, very little of that expenditure goes toward efforts that advance career prospects, or even toward pursuits that hold genuine cultural and intellectual value. This was the issue with the Abigail Fisher vs. University of Texas case. For those who might not be aware of the case, Fisher sued UT-Austin, claiming that the institution had racially discriminated against her through affirmative action; UT-Austin countered that race had not factored into its decision because she simply wasn’t candidate material. After two hearings before the Supreme Court, the justices ruled in favor of the university. Ultimately, though, the case had little to do with affirmative action and everything to do with the fact that all the extracurricular activities she cited as extenuating circumstances for a mediocre academic record, e.g. involvement with Habitat for Humanity, were ultimately worthless. Cutting through the legalese, the lawyers for UT-Austin essentially explained that her “achievements” were not remarkable precisely because every applicant put down something similar on his/her CV. It was a simple case of supply and demand.

From a Poor Economics perspective, the case fell within the bounds of middle-S behavior, the pursuit of security represented by conforming to “everyone else;” from a historical bourgeois view, it is proof that activities, or busyness, are not a replacement for true achievement and accomplishment. It is a classic example of cost not equaling value: the cited extracurricular activities cost real time and money yet conferred almost none of the value attributed to them. To map this firmly to the S-curve and the “squeezing” of the middle-class: whether the refusal to invest is direct, as with the middle-S parents interviewed by Banerjee and Duflo in developing countries, or takes the form of insisting that average activities are equivalent to achievement, as in the Fisher case, the effect is the same – the ersatz middle-class, with its aspirations to and mimicry of the bourgeois, is revealed as simply inadequate, and it is so as a result of its choices.

If there is one thing economist Tyler Cowen has been warning the country about for the last two decades, it is that many of the declines and discontents we face today stem squarely from a mania for stability that afflicted post-war American society. In the same vein, there was Kevin D. Williamson’s infamous, although completely justified, U-Haul article from right before the 2016 presidential election, which elicited anger because his prescription supposedly required people to “abandon their roots,” to reject something integral to themselves.

One of the greatest pieces of wisdom from classical antiquity is that mimesis will ultimately fail; conversely, metamorphosis will ultimately succeed. Rising along the S-curve requires rejection, a demolition of perceptions and, possibly, of values. It demands metamorphosis, not mere imitation. In The Anti-capitalistic Mentality, von Mises remarked that the resentment directed at multi-generational, hereditary prosperity and privilege overlooked that everyone involved went out and recaptured “every day” the ingredients of their own success. Their nice cars, big houses, fine clothes, etc. were simply the reward for their constant, invisible toil, which, Mises pointed out, very specifically embraced the concept of sic transit gloria mundi. It is not an accident that the two hurdles of the S-curve are points of mimesis: the material, and the perceived, which does not even exist. The question that people must ask is: Are we content to pretend, to wrap ourselves in the apparel of success and achievement, or do we wish to become?

Bourgeois III: Values in real life

Today, there is a rising commercialization, a commodification, of the character traits of the bourgeois. This in turn is leading toward a worldview in which bourgeois behavior is a “privilege,” instead of simply an expectation of a civilized society. As part of this change, the bourgeois – the real middle-class, as exemplified by a set of genuine values and behaviors, not just by income or by the terrible Marxist clichés of authors such as Christina Stead (The Man Who Loved Children) – has moved from being the group of idealized role models to being portrayed as a slightly deviant clique. Now, words such as “censorious,” “sanctimonious,” “privileged [again],” “hectoring,” and “judgmental” are routinely thrown around concerning those of bourgeois persuasion by pundits and the commentariat on both sides of the political aisle.

In 1960, psychologist Dr. Walter Mischel led the famous (or infamous, depending on one’s social leanings) marshmallow experiment. A young child, observed by researchers and parent(s), was left alone in a room deliberately devoid of stimulation apart from a candy marshmallow. The researcher who took the child into the room would tell the child he/she could eat the marshmallow but would promise a gift of two marshmallows if, upon the adult’s return two minutes later, the original piece of candy was uneaten. Needless to say, once the researcher left the room, the majority of children promptly ate the thing.

Mischel checked on the subjects when they were teenagers and then located a smaller segment when they were almost 50 – those who had displayed deferred gratification had more successful lives than those who had gobbled the first marshmallow. Starting in 1990, when Mischel published the results of both the original test and the teenage checkup and revealed that those with self-control had achieved higher SAT scores and therefore received admission into the best universities, his work came under fire from the social justice coterie. The reason was that the results were an indictment of the difference between the parenting style of the majority, i.e. the single-marshmallow gobblers, and that of the disciplined minority. In a logical world, the results should have been cause for celebration, because the two-marshmallow children came neither from a racially homogenous group nor from elite backgrounds, though the fact that the majority of them had one professional parent has formed a large part of the “but the experiment is socially discriminatory” argument.

The crux of the rejection is the evidence that the two-marshmallow children entered the experiment room having already learned self-control from their parents, while the candy wolfers had not. In an attempt to debunk Mischel’s study, a group of psychological researchers redid the experiment, publishing the results in 2018, with a larger sample and a thorough study of the home environments of the subjects. Instead of discrediting Mischel, the second study upheld the macro principles of his work: those who did best on the “new and improved” version were children who came from homes with large numbers of books and had very attentive, responsive parents. In other words, two-marshmallow children came from a different type of family than was, or is, standard. Crucially, the study found that the culturally impoverished – those with money and possibly advanced degrees but no books – did not do significantly better than those from financial poverty, with both groups producing marshmallow snatchers. On an interesting side note on the book divide, parental attention did not seem to make much of a difference, though it is possible that this is because the book-readers were also the better parents – the language is a little ambiguous on this point.

The Atlantic, for example, more or less gloated that Mischel and co.’s unconscious snobbishness had finally been unmasked to the world:

There’s plenty of other research that sheds further light on the class dimension of the marshmallow test. The Harvard economist Sendhil Mullainathan and the Princeton behavioral scientist Eldar Shafir wrote a book in 2013, Scarcity: Why Having Too Little Means So Much, that detailed how poverty can lead people to opt for short-term rather than long-term rewards; the state of not having enough can change the way people think about what’s available now. In other words, a second marshmallow seems irrelevant when a child has reason to believe that the first one might vanish. [….]

These findings point to the idea that poorer parents try to indulge their kids when they can, while more-affluent parents tend to make their kids wait for bigger rewards. Hair dye and sweet treats might seem frivolous, but purchases like these are often the only indulgences poor families can afford. And for poor children, indulging in a small bit of joy today can make life feel more bearable, especially when there’s no guarantee of more joy tomorrow.

It is important to note that Mischel’s critics completely distorted his findings. At no point did he claim one group was superior to the other in any sense. Nor did he claim, as is often misrepresented, that “character was everything.” These were all things imputed to him by opponents. All he did was prove that patience and self-restraint at age five (the approximate age of his subjects) tended to translate into disciplined, high-achieving high school students.[1]

The real rub of the marshmallow test is that it forced recognition of the fact that different child-rearing methods convey value sets, and that these values can be a determining factor in success or failure. They have the potential to translate into material, cultural, and social capital, or to reveal the lack of these things to the world. To make things worse, from a social justice standpoint, the experiments appear to demonstrate that the values of the upper echelons are better because they have consistently produced the same results.

The excuse-seeking of The Atlantic’s author – the hair dye and the sweets – is recognizable to anyone who has read George Orwell’s The Road to Wigan Pier, because the logic’s template is his chapter on the dietary habits of the northern English working-class in the 1930s. What isn’t shown is Orwell’s own honesty. While he gave his subjects a pass on spending eight pounds[2] on sugar, as the only affordable indulgence, he excoriated them for spending on tinned corned beef and dried milk when fresh meat and dairy were both readily available and much, much cheaper. For him, it was one of the little ironies of the social fabric that the middle-class, who preferred fresh products, lived more cheaply and better than the working-class, whom he suspected of buying canned meat and desiccated milk out of some misplaced notion that this was what the middle-class ate, precisely because cans and boxes were more expensive.

In his book The Wealth of Humans: Work and Its Absence in the Twenty-First Century, journalist-economist Ryan Avent wrote on the subject of social capital:

Wealth has always been sociable. The long process of cultural development that eventually yielded the industrial revolution was in many ways the process by which humanity learned ever better ways of structuring society in order to foster the emergence of complex economic activity. Wealth creation in rich economies is nurtured by a complex system of legal institutions (such as property rights and the courts that uphold them), economic networks (such as fast and efficient transportation and access to scientific communities and capital markets) and culture (such as conceptions of the ‘good life’, respect for the law, and the status accorded to those who work hard and become rich). No individual can take credit for this system; it was built and is maintained by society.

Avent is absolutely correct, especially in identifying this system as the essence of a capitalist society. To return to the marshmallow test and its subjects: if one divides society into snatchers and waiters and then looks at Avent’s behavioral traits, such as “respect for the law,” or values – “property rights” – one immediately sees that one of the two groups is going to fare much better than the other. This is not because one group is inherently superior to the other; the difference is that one group received the value system, and therefore the social capital, necessary to function acceptably in a capitalist system. The other did not.

Instilling the values needed to function in a capitalist system has nothing to do with society at large and everything to do with the family. That said, society does play a role in upholding bourgeois values, but modern American society has, as previously indicated, not only disincentivized such values but come to stigmatize them as unacceptable. Up next: examples of the bourgeois family!


[1] Officially, he never directly published the results of his check-up with the available subjects at age 49; however, the APA report, hyperlinked above, indicates that the two-marshmallow children honored their potential and became successful, fulfilled adults.

[2] He either meant that they spent eight pounds in money or that they bought eight pounds in weight; he was not clear in his wording.

Bourgeois II: place in the world

Quite recently, I was reading musicologist Martha Feldman’s book The Castrato: Reflections on Natures and Kinds, which is, unsurprisingly, a study of the castrato and the music written for the voice type during the 17th and 18th centuries. The concept is exactly what it sounds like – a male singer whose physical development was surgically ended in order to preserve his access to the high soprano range. The surgery in theory created an ideal singer: his head and ribs continued to grow to normal male size, producing someone with tremendous lung capacity and a large head space, which created greater resonance.

For anyone who might be wondering, no, castrati did not sing female roles; the type was still male in identity and was often associated with nobility or demi-gods in character casting. Incidentally, the practice eventually led to the operatic custom, beginning in the late 18th century, of mid-range female singers (mezzo-sopranos) singing the roles of young males: as religious and civil laws cracked down on the creation of castrati, this particular type of singer gradually disappeared, even as the music written for them increased in popularity. By the time Mozart wrote Le Nozze di Figaro (1785 – 86) and La Clemenza di Tito (1791), the roles of the junior males, Cherubino and Annio respectively, were written for women singers from the outset.

What was, for me, most interesting about the castrati was the extent to which a concept that musicians more or less take for granted stemmed from larger social, legal, and cultural changes in Europe in the late 1500s. Explaining the castrato’s history in Italy, where the practice originated, Feldman mentioned philosopher and sociologist Jürgen Habermas and his work, particularly on the “weakening of industrialization and the refeudalization of Europe” following the Renaissance. More directly related to the trend of castrati, Feldman wrote:

Most of the time first sons were excluded [from castration in order to make a castrato] because primogeniture was the rule in Italy, hence first sons were heirs, breeders, and eventual legatees, though very poor or very ambitious families sometimes did have first sons castrated, including the family of Handel’s principal castrato Senesino (Francesco Bernardi), whose older brother was a castrato, and first son Gaetano Berenstadt (1687 – 1734), of Tyrolean descent, who ended up caring extensively for his family’s needs (Feldman, 13).

And further explained,

If some form of patriarchy had long been the rule, patriliny by contrast took root in a historically precise way only around 1570, as we have glimpsed above [a preceding paragraph on the combination of increased lifespan and the introduction of estate entail in Italy]. Prior to that time the ideal had been to marry off all or most sons to increase a family’s power not just vertically but horizontally, within a wider network of kin, with the goal of fortifying the clan as a whole. With the marrying off of first sons only, a situation arose in which younger sons were typically consigned to military or ecclesiastical careers and thus formally speaking to legal or effective celibacy at the same time as most upper-class daughters entered convents. Both strategies intensified with the severe economic crisis of the seventeenth-century, but the practice continued afterward, albeit with increasing tendencies toward diversification (45).

Before 1570, the law of entail was not prevalent in continental Europe, which also tended to include females in the line of succession – Salic law applied only to the throne in the case of France, so noble women could and did inherit their parents’ property. Since one of the central points of the Counter-Reformation was ending the abuse of Catholic religious facilities, either as retirement homes for dowagers or as cold-storage for spare heirs once their elder brother fulfilled his duty, convents, monasteries, and the priesthood quickly became unviable career options, at least for the aristocracy. 

This little tweak to Canon Law had two effects: 1) the Catholic clergy gradually ceased being a profession as such, which resulted in an increased number of non-elites joining voluntarily and rising to high places, and 2) the performing arts, particularly music, exploded as the young men enrolled in ecclesiastical preparatory schools and originally destined for careers in the Church had to find new avenues for their skills. On a side note, the struggle to enforce the new regulation took centuries, was closely related to the battle for the separation of church and state, and is a story for another time.

The point of this tale is the response of the younger sons to their change in fortunes and status. Having been educated in cathedral schools – and, even more impractically, in music-specialist cathedral schools – these young men at first glance had skills of little use in the secular world. They were fluent in Latin, usually had a good command of Greek, frequently had a solid understanding of modern European languages and literature in general, and were competent musicians. In a world that not only was still largely agrarian but was also “refeudalizing” into a system in which they were, on the one hand, very much locked into the expectations of their caste – an impoverished younger son was still an aristocrat – and, on the other, locked out of any claim to family property, the position of these men appeared hopeless.

Instead of giving in to the circumstances, though, these men went out and turned their skills into an industry – classical music as we know it. They taught it, wrote it, and developed it into a dominant art form. Some found manifold uses for their “irrelevant” skills. For example, the castrato Carlo Broschi (1705 – 1782) didn’t use his real name out of respect for his aristocratic family, performing instead under the name Farinelli. His birth and skill with languages also led to his appointment as a diplomat-at-large, and it was not uncommon for him to be in a city such as London or Madrid for an opera engagement and be suddenly called upon to go to the royal court and help sort out a diplomatic issue. When he died at the unusually old age of 77 (a perfect example of Jonah Goldberg’s point about Second Sons as both victims and beneficiaries of the upper classes’ better medicine), he left behind a fortune, which in a delightfully ironic twist bailed out his elder brother’s family.[1] What is remarkable is not that he did this, but that he was only slightly unusual in terms of his financial success.

In her book Bourgeois Dignity, Deirdre McCloskey argued, rather controversially, that all movement, no matter how organic, comes top-down in terms of the social pecking order. In the case of capitalism, the movement occurred, in part, because the group whom Goldberg termed “Second Sons” and McCloskey “bourgeois” had a particular knack for both recognizing and creating markets, even in very negative situations. The resilience evinced in the story of the castrati and their role in the history of music is a type of proof that McCloskey’s thesis is correct.


[1]In fairness to the elder Broschi, he was a well-regarded musician in his own right and had a strong career up until he inherited the estate and, following convention, retired to it to become a penniless landed gentleman, rather than a wealthy performer, like his younger brother. The suspicion is that Farinelli covered most of his brother’s family’s living expenses; it is known that he paid for the education of his nephews and niece. 

Middle-class: questioning the definitions

  Vestis virum (non) reddit

                                                                                    ~ Latin proverb

As Brexit drags on, a reassessment of Margaret Thatcher and her legacy is under way. After all, in addition to being a stalwart defender of free markets, free enterprise, and free societies, she helped design the European single market, the core of what became the EU. So, in the process of leaving the EU, there is a lingering question of whether the referendum is a rejection of Thatcher’s legacy. Those who equate the referendum with a repudiation of Thatcher offer explanations along the lines of: “Thatcher destroyed the miners’ way of life [NB: this is a retort spat out in a debate on the freedom-vs.-welfare split in the Leave faction; it has been edited for grammar and clarity.].” Sooner or later, Thatcher-themed discussions wend their way to the discontents of capitalism in society, especially the laissez-faire variety, and the complaints are then projected onto issues of immigration, sovereignty, or globalism.

On the other side of the Atlantic, Tucker Carlson’s January 2 polemical monologue contained similar sentiments and provoked a response proving that the same issues have divided America as well. Many of Carlson’s complaints were predicated on the concept of a “way of life,” and here lies a dissonance which has meant that swaths of society talk without understanding one another: merely possessing a way of life is not equal to having values. In all the discussion of the “hollowing out of the middle-class,” one question is never raised: What is the middle-class?

It is only natural that no one wants to ask this question; modern society doesn’t want to discuss class in a real sense, preferring to rely on the trite and historically abnormal metric of income. To even bring up the C-word outside the comfortable confines of money is to push an unspoken boundary. There is one exception, though: if one is speaking of high-income or high-net-worth individuals, it is completely acceptable to question an auto-definition of “middle-class.” Everyone can agree that a millionaire describing himself as middle-class is risible.

It is the income-dependent definition that is the source of the confusion. The reality is that modernity has conflated having a middle income with being middle-class. The two are not one and the same, and the rhetoric surrounding claims of “hollowing out” is really more linked to the discovery that the one does not equal the other.

In his Suicide of the West, Jonah Goldberg wrote on the back history of the Founding Fathers:

[….] British primogeniture laws required that the firstborn son of an aristocratic family get everything: the titles, the lands, etc. But what about the other kids? They were required to make their way in the world. To be sure, they had advantages – educational, financial, and social – over the children of the lower classes, but they still needed to pursue a career. “The grander families of Virginia – including the Washingtons – were known as the ‘Second Sons,’” writes Daniel Hannan [Inventing Freedom: How the English-Speaking Peoples Made the Modern World (HarperCollins, 2013)]. [….]

In fact, Hannan and Matt Ridley [The Rational Optimist: How Prosperity Evolves (Harper, 2010)] suggest that much of the prosperity and expansion of the British Empire in the eighteenth century can be ascribed to an intriguing historical accident. At the dawn of the Industrial Revolution, the children of the affluent nobility had a much lower mortality rate, for all the obvious reasons. They had more access to medicine, rudimentary as it was, but also better nutrition and vastly superior living and working conditions than the general population. As a result, the nobility were dramatically more fecund than the lower classes. Consequently, a large cohort of educated and ambitious young men who were not firstborn were set free to make their way in the world. If you have five boys, only one gets to be the duke. The rest must become officers, priests, doctors, lawyers, academics, and business men.

Goldberg concisely summarized the phenomenon which Deirdre McCloskey argues created the original middle-class. More importantly, Goldberg’s history lesson emphasizes that the origins of America, from an idea to a reality, have a deep socio-cultural angle – tied to status, property rights, and familial inheritance – that is often lost in the mythology.

As Goldberg explained, the complex, painful history of disenfranchisement due to birth order was the real reason the Founding Fathers opposed the law of entail while simultaneously focusing almost myopically on private property rights in relation to land. As Hernando de Soto recounted in his The Mystery of Capital, George Washington, who held land-grant patents for vast, unsurveyed tracts in Ohio and Kentucky, was horrified to discover that early settlers had established homesteads in these regions before lawful patent holders could stake their claims. Washington found this a violation of both the rule of law and the property rights of the patent holders and was open to using military force to evict the “squatters,” even though doing so would have been against English common law, which gave the property to the person who cleared the land. (Washington was no longer president at that point, so there was no risk that the aggressive attitude he expressed in his letters would lead to real action.)

Before repudiating Washington for his decidedly anti-liberty attitude on this score, we should turn to Deirdre McCloskey and her trilogy – The Bourgeois Virtues: Ethics for an Age of Commerce, Bourgeois Dignity: Why Economics Can’t Explain the Modern World, and Bourgeois Equality: How Ideas, Not Capital or Institutions, Enriched the World – for an understanding of why this might be important, and why conflating a way of life with social standing and identity is both fallacious and dangerous.

Part II coming soon!