I am happy to report that I have survived the 2020-2021 political science academic job market. I will be a postdoctoral research associate at Princeton University starting September 2021. It is a dual appointment between the Department of Politics and the Center for the Study of Democratic Politics. I also have a few additional appointments, but I am waiting to finalize the paperwork before formally announcing them. Given that the 2020-2021 job market was one of the worst in recent memory due to COVID-19's impact on university budgets, I think I did well enough.
I am writing this post in the hope that it can be of some help to others entering future political science job markets. The information is provided as-is, and I make no promises that it will get you a job.
My Statistics:
This was my first time on the job market. I come from a top-50 program. I had about 7 peer-reviewed articles at the time. My publications in Political Analysis and Legislative Studies Quarterly got me considerable attention.
I applied to approximately 70-80 academic jobs, counting both tenure track and postdoc positions.
I got initial interviews for about a quarter of them. During these interviews (all on Zoom), I was asked about my research and teaching. Most of these meetings lasted between 30 minutes and an hour.
I ended up getting job talks for tenure track positions at four research universities (two R1s, two R2s) and one teaching-oriented university. I was also offered job talks for postdoc-to-tenure track positions at two additional research universities. Most of the campus visits took approximately a day and consisted of a job talk (1 hour including Q&A), meetings with faculty and graduate students, and a teaching demonstration.
I ended up getting job offers from three of the above. Additionally, I was the second choice for at least two of the other positions.
I also applied to a few industry jobs – mainly government and think tank research positions.
I highly encourage future candidates to join the Slack: http://supportyourcohort.com/. Candidates on the Slack keep each other updated about the progress of searches. Shout out to Alexis Lerner for organizing the Slack channel this past cycle.
Types of Jobs:
There are six major types of jobs in the political science job market.
The first major category is tenure track jobs. These are the golden goose most of us are chasing. If you get a tenure track job, you will be employed full-time and granted full benefits (healthcare, retirement, etc.). Salaries are in the 60-80k range. There is a lot of heterogeneity within tenure track jobs, but they can be broadly subdivided between research- and teaching-oriented universities.
Research-oriented universities have salaries at the upper end of that range. Teaching loads tend to be around "2/2," meaning you'll be expected to teach about 2 classes per semester. There isn't a magic number for tenure, but I was given a ballpark estimate of needing 7-10 articles minimum by the time I went up for tenure.
Teaching-oriented universities have salaries at the lower end of that range. Teaching loads fluctuate widely. I mostly saw "3/3" positions, but I saw a few that were 4 courses a semester. Tenure expectations were 2-5 articles minimum.
The second major category is postdocs. These are usually appointments of 1-2 years and their primary function is to give candidates a chance to spend more time applying for tenure track jobs. Salaries are in the 50-60k range. Most of these positions have minimal teaching obligations.
The third major category is postdoc-to-tenure track positions. These positions start out as postdocs but have the potential to convert to tenure track positions. Similar to postdocs, these positions have minimal teaching obligations. They are increasingly common at mid-ranking universities. Their purpose, as I was told, is to try to win over candidates who show potential. I think they're also a clever way to solve the lemon problem. When a department hires a candidate, it has minimal information about how they'll fit in with the department's culture. By hiring candidates as postdocs, the department has the option not to extend the tenure track offer to candidates who turn out to be lemons after they show up. Salaries are in the 50-60k range.
The fourth major category is adjunct/VAP positions. These are similar to postdocs in that they are usually short-term appointments. Unlike postdocs, they carry high teaching obligations. I have minimal information about these types of jobs, so I defer to others. My sense is to avoid these positions if you plan to go on the market again, because their high teaching obligations eat up your time.
The fifth major category is community college jobs. Similar to adjunct/VAP positions, I have minimal information about these, so I defer to others with more information. In a few states, including my home state of California, some of these positions come with full benefits and are eligible for tenure. If you can get a tenure track community college job, the initial salary range is 70-80k. Research obligations are minimal. I actually think these are really good jobs if your passion is teaching. They also offer a high degree of control over your location.
The last major category of jobs is industry. For political scientists this mostly means jobs in government, think tanks, and non-profits. I applied to a few industry jobs and had modest success. The salary range for these jobs seems to be 70-120k with full benefits. These jobs are really tempting because they give you a high degree of control over your location.
Once upon a time there lived a scholar named Andrei Illarionov, a prominent free market economist who became a senior economic advisor to the Russian government at the end of the 1990s. Yet, in the early 2000s, he broke with the new Putin regime. Illarionov became disgusted with the growing authoritarianism of his boss, who was slowly but surely squashing private businesses, increasing the powers of the secret police (the untouchable ruling elite in today's Russia), and enlarging the governmental bureaucracy.
The place that gave Illarionov a chance to pursue his scholarship and to continue his criticism of the Putin regime was the Cato Institute, a libertarian think tank in Washington, DC, which hired him in 2006. Hiring a prominent dissident scholar who quit a lucrative governmental position and who was raising his voice against the autocratic regime is very commendable and very libertarian. Furthermore, after securing his position, Illarionov returned to Russia a few times, where he took part in antigovernment street protests, firmly supporting anti-Putin opposition forces.
The professor also became active in social media and on YouTube, drawing millions of viewers on various Russian-language channels and sites. Besides, he regularly published pieces in his personal LiveJournal. I liked and agreed with some of his assessments, especially the ones that analyzed the 1990s reforms in Russia and Putin's crony capitalism. I also became drawn to his insights into existing threats to the values of Western civilization coming from the current US and European woke mainstream, which increasingly breeds intolerance, "tribalism," and racial animosity, erodes the rule of law, and undermines constitutional values. At the same time, some of his other assessments aroused my skepticism.
As a popular social scholar, Illarionov became part of current debates in the Russian-speaking internet community, speaking on topics ranging from the Putin regime to the notorious corona and the woke cancel culture that currently suffocates American political, intellectual, and cultural life. Some people agreed with him, whereas others rebuked him – a normal process in a normal democratic republic. And everything was OK in the life of the scholar until January 6, when the "storming" of the Capitol building took place and the Cato Institute suddenly decided to quickly get rid of him.
Now I must expand on what Illarionov said about the January 6 event. This is not to convince the reader that he was right or wrong but to give some context to the story that will unfold below. For this reason, I ask you to bear with me. First, the scholar dared to question the validity of the voting in the five swing states and suggested in his Russian-language blog that the whole "storming" business and the passivity of Washington DC officials and the Capitol police, several of whom let the "insurrectionists" in, somewhat reeked of the so-called Reichstag fire – an incident that opened the doors to the ascension of totalitarianism in Germany in 1933.
Then, in the same posting, he dared to make a few other "uncomfortable" statements. For example, Illarionov remarked that if one went by the one person-one vote rule, the winner of the US elections was clearly Joe Biden; yet if one went by the constitution (the electoral college), the results of the elections in the swing states were rather murky, considering the lax corona mail-in voting rules that were railroaded into our society at the last moment. By refusing even to consider Trump's election lawsuits, the US court system failed to play the role of an independent umpire and missed the opportunity to validate the quality of the presidential elections in the eyes of the people. Illarionov stressed that, since about 40% of American voters, including 73% of Republicans, questioned the results of the swing states' elections, it was essential to take extra legislative and judicial steps to check and verify those results and regain popular trust in the US electoral, judicial, and political system, rather than simply rushing to announce the winner of the 2020 elections.
The scholar also shared his personal experience of being in downtown Washington on January 6 among tens of thousands of protesters who were walking along Pennsylvania Avenue toward the Capitol and who were insulted by tiny groups of BLM supporters shouting obscenities at them. The demonstrators either responded with phrases such as "Join us" or warned each other, "No violence," and "Don't touch them, they want to instigate a fight."
Illarionov also drew our attention to another "uncomfortable" fact: none of the "insurrectionists" who broke into the Capitol building used weapons against police. Later, the FBI confirmed that among all those arrested for "storming" the Capitol, no one faced firearms-related charges, and no arms were recovered. It was in fact the Capitol police who shot one of the protestors: an air force veteran named Ashli Babbitt. Illarionov nevertheless found it necessary to add that a Capitol police officer had reportedly been hit in the head with a fire extinguisher, a claim that turned out to be false information spread by the mainstream media, including the New York Times. In reality, the man died after the incident, and the cause of his death was completely different. Yet, for a whole month, the mainstream media and Democratic legislators cynically exploited the original "fact" of his death to amplify the "insurrectional" dimension of the January 6 break-in and present the officer as a martyr for the cause of democracy.
A few days after Illarionov came up with those and other LiveJournal remarks, the popular Politico condemned him, distorting his utterances and ascribing to him things he never said. Politico insisted that the scholar denied the results of the US elections and argued that the January 6 event was a trap set by police following a deliberate provocation by BLM activists with the silent agreement of Democrats. Ironically, after Cato rushed to ditch Illarionov, it was revealed that, besides the Trump supporters who rushed into the building, there was indeed an Antifa and BLM sympathizer, a provocateur named John Earle Sullivan, who also took part in the "storming" of the Capitol. Sullivan was arrested for vandalism and for directly inciting violence while inside the building. Dressed in "Trump garb," he was caught on tape encouraging those on the right to be more assertive and aggressive. But the story does not end there. An additional and cruel irony was a publication in Time magazine, a mainstream liberal outlet, which literally bragged about how Democrats, "decent" Republicans, Big Tech, and their radical Antifa and BLM informal allies worked together in a united "shadow campaign" to orchestrate changes to election rules, purge media of "wrong" opinions, and enhance mass street protests to "fortify" the elections in the "correct" direction for the greater cause of "saving democracy."
What was stunning in that situation was not Politico's public condemnations, nor the content of the scholar's utterances, but the reaction of the Cato Institute, Illarionov's employer, which also denounced the scholar and began an internal investigation of what he said about the 2020 elections and how he said it, leading to his dismissal a few days later.
The whole Illarionov incident reveals how quickly our intellectual mainstream has degenerated over the past year, moving fast forward toward the elimination of the constitution's first amendment. What is most appalling here is that it was the administration of a libertarian think tank that, instead of dismissing outright any attempt to penalize the scholar for what he said and how he said it, followed the lead of an academic snitch, initiating an investigation and purging him at lightning speed. It is hard to figure out what drove the minds of the Cato scholar-bureaucrats when they made an instant decision to crucify Illarionov for saying things they did not agree with. Did Cato fear that, sitting in the den of the DC "deep state" area, where 91% of people vote Democrat, it could be cut off from the publicity venues and networks it had established in the Washington area? Or could it be simple opportunism: a fear that Cato could become a target of the woke mob if it did not throw a sacrificial lamb to the pack of cancel culture wolves?
I never heard about progressive scholars losing their jobs over calling Trump an illegitimate president and insisting that the 2016 elections were a fraud perpetrated by massive Russian interference; in fact, that was acceptable mainstream discourse for the past four years. Moreover, no academic or state or federal bureaucrat was penalized with the loss of his or her job for endorsing BLM, whose mass rallies last summer were responsible for urban pogroms (destructive property damage amounting to more than $1 billion, 25 people killed, and 2,037 police officers wounded). No academic or politician was disciplined for raising funds to bail out "racial justice" rioters who were looting and burning stores and courthouses, smashing statues, and intimidating people. Both the left and the right know full well that, except for the time spent, public posturing as a "civil rights activist" or "minority advocate" hardly costs a person anything morally, politically, or financially. In fact, in government and especially in academia, which is currently held in the tight grip of left hegemony (I have borrowed the latter word from the leftist jargon), such posturing can be an excellent career booster. The first analogy that comes to my mind in this case is the officially endorsed and politically correct activism that motivated millions of opportunists and would-be young apparatchiks in the "good" old socialist countries of Eastern Europe and the Soviet Union.
I am sure many in academia wear the abovementioned activities on their sleeves as a badge of honor. In fact, as early as 2011, left-leaning instructors in several universities began incentivizing students by trying to make participation in demonstrations for "progressive causes," including the 2018 Kavanaugh hearings, part of their social science course work, offering students brownie points in the form of extra credit. By the way, among the leftist protesters who "stormed" the Capitol that year, 227 were arrested for obstructing the hearings and harassing congresspeople they did not like. Nobody (and rightly so) ever thought about treating them as insurrectionists, and their only penalty was meager fines of $35 to $50. Better than anything else, this state of things tells us who currently represents the power elite in the country and who really calls the shots in our political, intellectual, academic, and cultural mainstream.
The Illarionov incident is not something extraordinary. It is, unfortunately, a manifestation of the systemic (another favorite word of choice among the current left) impact of cancel culture or, putting it simply, of an ideological witch hunt in our cultural and intellectual mainstream. The cancel culture flourished in earnest last summer, when people who refused to endorse urban pogroms and false BLM claims about thousands of unarmed black people being murdered by white police were routinely silenced, ostracized, and fired. The National Association of Scholars recorded 128 cases in US and Canadian universities where people with predominantly conservative and libertarian views (along with several leftist academics!) were silenced by their schools when they expressed "incorrect" opinions on various racial and political issues. Among them one can find, for example, Professor Gordon Klein of UCLA, who "incorrectly" responded to a request to postpone final exams for black students in his online course (in the wake of George Floyd's death) by saying that he intended to treat everybody equally irrespective of their skin color. Legal scholar John Eastman of Chapman University made the mistake of speaking at the January 6 rally, which prompted the university to force him into retirement. In its turn, Duquesne University immediately fired its professor of educational psychology, Gary Shank, for using the N-word in his class simply to illustrate how the word was used in the past: the professor wanted to make a point about the progress of race relations in the US! Even though several of the fired people were reinstated, the draconian McCarthyism-like message is very clear: toe the line or else.
Petrified of potential accusations of racism and bigotry, corporations, universities, and state institutions bow down, just in case, to the aggressive woke mob, morally disarming themselves and the whole society and planting in our midst an atmosphere of fear and self-censorship that reeks of Stalin's Russia and Mao's China during their "best" days. Not infrequently over the past months, we have witnessed communist-style practices in which "progressive" people routinely denounce their colleagues and relatives (on many occasions retroactively, for past "sins") as "reactionaries" and "racists." Moreover, most recently, the cancel culture practice has reached grotesque proportions, with family members reporting their relatives to the FBI if they happened to be participants in the Trump January 6 rally.
The Cato Institute that shut down Illarionov should remember the famous "First they came…" confession of the German Lutheran pastor Martin Niemöller (1892–1984), who referred to the cowardice and compliance of German intellectuals and clergy during the time of national socialism. And sure enough, we already have radical voices among Democrats who have suggested "phasing out" and "deprogramming" not only conservatives but also libertarians. If the current woke censorship and self-censorship escalates further, tomorrow there might be nobody left to protect Cato. This liberty institute claims that it works to enlighten our society to "better understand and appreciate the principles of government that are set forth in America's Founding documents." Something tells me that the woke mob, which does not care about these founding documents, will not spare the cautious and politically correct preachers of constitutionalism.
Andre Van Doren is a humanities scholar of Polish-American extraction who is interested in issues of political economy and culture. He can be contacted at borismoriarti@gmail.com; the list of his publications can be found at https://muckrack.com/andre-van-doren-1
I recently learnt that Harvard philosophy professor Michael Sandel has become the Wilt Chamberlain of the anti-commodification discussion circuit. He commands academic-salary-sized fees for single webinar appearances. Although I disagree with some of his views, I appreciate the fact that he brings enormous value to whoever purchases his services.
These claims are empirically suspect. Virgil Storr, Ginny Choi, Meagan Teague, and Rosemary Fike have shown that markets improve social relations and moral behaviour on most dimensions we can measure. In particular, societies with more economic freedom are less obsessed with material wealth than those with more regulation. Why might that be? In Basic Economic Liberties: John Rawls and Adam Smith Reconciled and my forthcoming book Neoliberal Social Justice, I argue that participation in voluntary market relations improves people's sympathy for strangers and ultimately encourages them to weigh those strangers' interests in their personal moral calculus. Markets themselves generally make us more moral, not less.
But Armen Alchian and Lester Telser could offer a much simpler explanation: the law of demand (I am much indebted to Brian Albrecht's and Josh Hendrickson's Economic Forces Substack for this insight). Presume that people generally care about fairness, among other values. How much would they be willing to sacrifice to be fair in practice? It depends on how expensive being fair is. When the personal cost of being fair is low, they'll be fairer; when it's high, they won't be. Presuming diminishing marginal utility of wealth and income, a society composed of the relatively well-off will start to care more about non-material values. People will be slow to sacrifice money needed for food or rent for the sake of fairness, but they will be willing to give up the prospect of a second Tesla for the sake of benefiting the less advantaged.
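To make that logic concrete, here is a minimal worked example; the logarithmic utility function is my illustrative assumption, not anything Alchian or Telser specify. With $u(w) = \ln w$, the utility cost of sacrificing an amount $x$ for fairness is

$$\Delta u = \ln w - \ln(w - x) = -\ln\!\left(1 - \frac{x}{w}\right) \approx \frac{x}{w} \quad \text{for } x \ll w,$$

so the same dollar sacrifice costs a person with wealth $w = \$1{,}000{,}000$ roughly one-hundredth of the utility it costs a person with $w = \$10{,}000$. As wealth rises, the effective price of being fair falls, and by the law of demand people "buy" more fairness.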
Mark Carney is a living example. He is a multi-millionaire who has worked hard most of his life in finance. Even when acting as a public servant, he has benefitted enormously, and unequally in his favor, from living in a commercial society. But now he can afford to worry about bigger-picture issues like fairness and what we will offer to future generations (although I guess he still likes to earn a bit of money on book sales while thinking big). So, if you want people to care about fairness, the lesson is not to bash markets: it is to make everyone as rich and comfortable as Mark Carney.
It is fascinating how foreign societies that loathe someone else's nationalist movements come, over time, to amplify in their mass culture the traditions such movements revitalized, and to celebrate a parodied version of them. A case in point is the St. Patrick's Day celebration.
Ireland's vibrant folklore tradition was revitalized in the late 19th and early 20th centuries by a vigorous nationalist movement. Following this movement, a tiny rural Irish folklore tradition carried over to the Irish American context: leprechauns decorating St. Patrick's Day cards, for example. The communal identity of the pastoral Irish homeland revitalized by the nationalist movement remained vital to Irish group consciousness in America over many generations, even for new generations who never visited the old sod.
Today the Irish group has to remind itself of its kinship and traditions as it continues to fade into the American mass culture. As an outsider, what I get from the American mass culture (which supposedly celebrates multiple cultures) is not the rich Irish folklore and tradition that arrived on the scene but mere stereotypes of it. The 19th-century WASP anti-Irish caricature can be seen even today. Implicit in every Irish joke is either the image of a drunken Irishman devoid of any cultural sophistication or a fighting Irishman who is endlessly combative. Had the Irish been a brown or black community, would such a depiction, in a less mean-spirited form or not, carry forward in today's hypersensitive, race-obsessed American society?
Did you know that Ireland contributes significantly to global science and technology?
St. Patrick's Day, for the most part a quiet religious holiday in Ireland, started as an occasion for those living in the United States to demonstrate Irish heritage. Instead, one primarily receives an American mass-culture-induced Irish self-parody: a holiday associated with alcoholic excess. How much of the self-parody was consciously nurtured by early Irish Americans is debatable. But I recognize that the Irish have resisted being entirely consumed by the American mass culture in several ways, for example, by retaining traditional Irish names such as Bridget, Eileen, Maureen, Cathleen, Sean, Patrick, and Dennis.
If you think you know the Irish, then think again.
As a Hindu immigrant myself, I realize it is essential for immigrant groups to assimilate in several ways, like speaking English, which the Irish did effortlessly. Isn't it the American mass culture's duty to comprehend a certain authentic Irishness or Hinduness in popular culture without caricatures? As a Hindu, I have faced disdainful "holy cow" jokes from Muslims and Christians in the United States, but of course, Hinduphobia isn't a politically dominant thing, you see. In this light, when just about anything goes in the name of a "melting pot," I don't see cultural salad bowls as regressive but protective. Interestingly, folks who sermonize on the blending of cultures, the dilution of conserved cultural traits like names, etc., as the only form of progressive new beginnings in the social setting also shore up conserved group identities for certain communities in politics.
Not everything is healthy among immigrants who wish to conserve their authentic identity, either. There is a bias among such immigrants toward regarding the United States as lacking a unique, respectable culture. Such immigrant-held prejudices get magnified when one-half of the country goes about canceling all the faces on Mount Rushmore and actively devalues every founding document and personality of the United States. The impact of immigration and assimilation is complex. It requires the appreciation of the traditions maintained by immigrants, and it requires immigrants to appreciate the culture of the land they have entered.
Over time, however, colorful cultural parades that aim to link immigrant folklore and traditions to public policy and popular culture via song, dance, and merriment also dilute the immigrants' bond with their authentic cultural heritage. In a generation or two, immigrants embody the projected, simplistic self-image parading in the mass culture. Soon, the marketed superficial traits become a deeply "authentic" pop-cultural heritage worthy of conservation in a melting pot. A similar phenomenon is taking over diaspora Hindus and some communities back home in India, where a certain "Bollywood-ization" of Indic rituals and culture is apparent. I call this a careless collective self-objectification.
When a critical mass of people recognize a weakening of valuable cultural capital, reviving it is natural. If not for the revivalist Irish cultural nationalism, would there be a sense of pride, a feeling of collaboration among the Irish Americans in the early years? Would there be a grand sweep of Irish heritage for the melting pot to—no matter how superficially—celebrate?
Former Brazilian president Luiz Inácio Lula da Silva is currently considered innocent and can run for president in 2022 if he wishes. Lula was arrested in April 2018 under Operation Car Wash, conducted by judge Sérgio Moro in the southern Brazilian city of Curitiba, in the state of Paraná. In November 2019, the Supreme Federal Court ruled that incarcerations with pending appeals were unlawful, and Lula was released from prison as a result. Yesterday, March 8, 2021, Supreme Court Justice Edson Fachin ruled that all of Lula's convictions must be nullified because Lula was tried by a court that did not have proper jurisdiction over his case. This is so complicated that I had to check Wikipedia to make sure I was getting the basic facts straight.
Now I wonder: what changed between April 2018 and November 2019? And what changed yesterday? I don't know! Was Lula illegally arrested in April 2018? What kind of country is this, in which people are arrested unlawfully?! Why did it take Edson Fachin almost three years to realize that Sérgio Moro had no jurisdiction in this case?! Is Brazilian law really so complicated that it takes even a supreme court justice three years to realize that something is wrong? What is going to happen to Lula now? After all, he was in jail unlawfully for more than a year! But mind this: Edson Fachin didn't say that Lula is innocent! He said that Sérgio Moro had no jurisdiction to judge him. Theoretically, Lula can be tried by a new court, with the same evidence, and be convicted… again. You know, Seinfeld was right:
“What are lawyers, really? To me a lawyer is basically the person that knows the rules of the country. We’re all throwing the dice, playing the game, moving our pieces around the board, but if there’s a problem, the lawyer is the only person that has read the inside of the top of the box. I think one of the fun things for them is to say, ‘objection.’ ‘Objection! Objection, Your Honor.’ Objection, of course, is the adult version of, ‘’fraid not.’ To which the judge can say two things, he can say, ‘overruled’ which is the adult version of ‘’fraid so,’ or he could say, ‘sustained,’ which is the adult version of ‘Duh.’”
I’m afraid that in the case of Brazil, if the supreme court judges don’t quite understand the rules of the game, neither can I.
I have been reading a lot on federation lately, for an article I would like to contribute to Brandon's special issue of Cosmos + Taxis. I am going back to the debate about federalizing (parts of) the democratic world, which was very lively in the 1930s and 1940s. Reading the texts, for example the best-selling Union Now! (1939) by the American journalist Clarence Streit, you can feel the fear of the authoritarian rulers and their nationalistic and militaristic policies. As an antidote, Streit proposed a federation of all the mature democracies in the world at that time, 15 in total, spread over the globe. This Union of the North Atlantic had to include a union citizenship, a union defense force, a union customs-free economy, union money, and a union postal and communications system. After the war broke out, Streit published a new version, now calling for a union between Britain and the USA. Needless to say, none of these or other proposals went anywhere. Still, some interesting perpetual questions remain.
Ludwig von Mises and Friedrich Hayek also wrote on federation during this period, as I described in Classical Liberalism and International Relations Theory (2009). I have now gone back to their writings, which is a treat. It is nice to take a fresh look, and I also have deeper insights now (at least – I think!) than I had about 15 years ago when I first encountered these ideas.
One of the divides between Mises and Hayek (which they never openly discussed, as far as I am aware) revolved around the alleged pacifying effect of federations. Mises made the point that joining a federation would entail a larger loss of sovereignty than was commonly recognized in the debate. It was not just about pooling some powers at the federal level. In an interventionist world, Mises argued, the number of policies dealt with from the center, or the capital, continually rises. After all, calls for intervention will be made from all corners of the federation, all the time. This leads to calls for equal treatment, which in turn lead to a larger number of policies and regulations administered from the capital. Consequently, the member states increasingly lose sovereignty and eventually end up as mere provinces. This would be a new cause of division, especially when the member states of the new federation used to be powerful countries in their own right. Hence, a federation divides; it does not unite. Mises therefore proposed a much more radical solution in his plan for Eastern Europe: not a federation but a strict central union (administered by foreigners, in a foreign language, he even once suggested) in which the members would have basically no say at all over the important legislation normally associated with sovereignty. The laws and regulations would be limited, ensuring maximum economic and political freedom for the individual citizen.
This post is not meant to discuss the merits of Mises' ideas. It solely aims to point to a divide between Mises and Hayek. Hayek, and most thinkers on federation with him, Streit included, had different expectations about the political effects of federation. They expected that federation would be a force of unity. In a federation, you arrange the most difficult and divisive policies at the center (for example defense, foreign policy, and foreign trade), while leaving all other policies to the constituent parts. This allows room for different policies in those states, while taking away their instruments for starting violent conflict. Yes, this would mean less sovereignty, but also less trouble, while the freedom within the federation still allowed as many or as few additional policies as the individual states saw fit. Hayek favored this idea for the rest of his life, also proposing it for the Middle East, for example.
Who was right? That is impossible to say, I think. There are elements of both Misesian and Hayekian arguments in the real-life experiences of federations around the globe. For some, federation is indeed a good way to pool the core of sovereignty while remaining as diverse as possible. Although most of them do not disintegrate into violent conflict, the increase of all kinds of policies at the federal center has certainly happened. However, this is not unique to federations, and most importantly, it is not a question of formal legal organization. It is a question of the mentality of both politicians and populations. This is another reason to keep fighting 'the war of ideas', because ideas have the power to change societies.
In the early twentieth century, cancer assumed a more prominent place in the popular imagination as the threat of contagious diseases receded and Americans lived longer. In this context, the American Society for the Control of Cancer (ASCC), founded in 1913, had identified three goals: education, service, and research. However, until midcentury, largely due to its limited budget, the society contributed little to cancer research.
Enter Mary Woodard Lasker
One of the most powerful women in mid-twentieth-century New York City, and perhaps North America, Lasker demonstrated that women could command transformations in medical institutions. Born in Wisconsin in 1900, Mary Lasker, at the age of four, found out that the family's laundress, Mrs. Belter, had undergone cancer treatment. Lasker's mother explained, "Mrs. Belter has had cancer and her breasts have been removed." Lasker responded, "What do you mean? Cut off?" Decades later, Mary Lasker would cite this early memory as the one that inspired her to engage in cancer work.
By the time Mary Lasker inquired about the role of the ASCC in 1943, she had established herself as an influential New York philanthropist, businesswoman, and political lobbyist. She learned that the organization had no money to support research. Somewhat astounded by this discovery, she immediately contemplated ways to reorganize the society. She wanted to recreate it as a powerful organization that prioritized cancer research.
Well-versed in public relations and connected to the country's financial and political circles, Lasker played a central role in the society's mid-1940s transformation. Despite notable opposition, she convinced the ASCC to change the composition of its board of directors to include more lay members and more experts in financial management. She urged the council to adopt a new name, the American Cancer Society. She also convinced them to earmark one-quarter of the budget for research. This financial reorganization allowed the ACS to sponsor early cancer screening, including the early clinical trials. The newly formed American Cancer Society articulated a mission that explicitly identified research funding as its primary goal.
By late 1944, the American Cancer Society had become the principal nongovernmental funding agency for cancer research in the country. In its first year, it directed $1 million of its $4 million in revenue to research. Lasker's ardent advocacy for greater funding of all the medical sciences contributed to increased funding for the National Institutes of Health and the creation of several NIH institutes, including the Heart, Lung, and Blood Institute.
Mary Lasker continued to agitate for research funds but resisted any formal association. As she explained it, “I’m always best on the outside.” Undoubtedly, Mary Lasker’s influence and emphasis on funding cancer research contributed to promoting the Pap smear in U.S. culture. As a permanent monument to her efforts, in 1984, Congress named the Mary Woodard Lasker Center for Health Research and Education at the National Institutes of Health.
A portion of a letter ACS administrative director Edwin MacEwan wrote to Lasker encapsulates her contribution to our society. He wrote, “I learned that you were the person to whom present and potential cancer patients owe everything and that you alone really initiated the rebirth of the American Cancer Society late in 1944.”
“If you think research is expensive, try disease.” —Mary Woodard Lasker (November 30, 1900 – February 21, 1994)
Yesterday, I came across this scoop on Twitter; the New York Post and several other blogs have since reported it.
Regardless of this scoop’s veracity, the chart of Eight White identities has been around for some time now, and it has influenced young minds. So, here is my brief reflection on such identity-based pedagogy:
As a non-white resident alien, I understand the history behind the United States' racial sensitivity in all domains today. I also realize how zealous exponents of diversity have dedicated schools and university campuses in the US to ridding society of prevalent racial power structures. Further, I appreciate the importance of people being self-critical; self-criticism leads to counter-cultures that balance mainstream views and enable reform and creativity in society. But I also find it essential that critics of mainstream culture not feel morally superior enough to enforce just about any theoretical concept on impressionable minds. Without getting too much into the right vs. left debate, there is something terribly sad about being indoctrinated at a young age (regardless of the goal of social engineering) to accept an automatic moral one-'downmanship' for the sake of the density gradient of cutaneous melanin pigment. Even though I'm a brown man from a colonized society, this kind of extreme 'white guilt' pedagogy leaves me with a bitter taste. And in this bitter taste, I have come to describe such indoctrination as the "Affirmative Guilt Gradient."
You should know there is something called the Overton Window, under which concepts grow larger as their actual instances and contexts grow smaller. In other words, well-meaning social interventionistas view each new instance of a problem, in its decreasingly problematic context, through the same lens as the more significant problem they focus on. This leads to an unrealistic enlargement of academic concepts, which are then shoved down the throats of innocent, impressionable school kids who will take them as objective realities instead of subjective conceptual definitions overlaid on one legitimate objective problem.
I find the scheme of Eight White identities a symptom of the shifting Overton Window.
According to Thomas Sowell, there is a whole class of academics and intellectuals of social engineering who believe that when the world does not conform to their pet theories, that shows something is wrong with the world, not with their theories. If we project Sowell's observation onto this episode of the "Guilt Gradient," it is perfectly reasonable to expect many white kids and their parents to refuse to adopt these theoretically manufactured guilt-gradient identities. We can then, applying Sowell's observation, predict that academics will declare opposition to the "Guilt Gradient" to be evidence of the many covert white supremacists in society who will not change. Such stories may then get blown up in influential op-eds, leading to the magnification of a simple problem, soon to be misplaced in the clutter of naïve supporters of such theories, the progressive vote-bank, and hard-right polemics.
We should all acknowledge that attachment to any identity, be it majority or minority, is by definition NOT hatred for an outgroup. Ashley Jardina, Assistant Professor of Political Science at Duke University, concludes in her noted research on the demise of white dominance and threats to white identity: "White identity is not a proxy for outgroup animus. Most white identifiers do not condone white supremacism or see a connection between their racial identity and these hate-groups. Furthermore, whites who identify with their racial group become much more liberal in their policy positions than when white identity is associated with white supremacism." Everybody has a right to associate with their identity, and association with an ethnic majority identity is not automatically toxic. I feel it is destructive to view such identity associations as inherently toxic because it is precisely this sort of warped social engineering that results in unnecessary political polarization; the vicious cycle of identity-based tinkering is a self-fulfilling prophecy. Hence, recognizing the Overton Window at play in such identity-based pedagogy is a must if we are to make progress. We shouldn't be tricked into assuming that non-acceptance of the Affirmative Guilt Gradient is a sign of our society's lack of progress.
Finally, I find it odd that ideologues who profess "universalism" and international identities choose schools and universities as places to keep structurally confined, relative identities going, adding excessive nomenclature so they can apply interventions that are inherently reactionary. But isn't 'reactionary' a pejorative these ideologues use on others?
Every great civilization has simultaneously made breakthroughs in the natural sciences, in mathematics, and in the investigation of that which penetrates beyond the mundane, beyond external stimuli, beyond the world of solid, separate objects, names, and forms, to peer into something changeless. When written down, these esoteric precepts have a natural tendency to decay over time because people tend to accept them too passively and literally. Consequently, people come to value the conclusions of others over clarity and self-knowledge.
Talking about esoteric precepts decaying over time: I recently read about the 1981 Act the state of Arkansas passed, which required public school teachers to give "equal treatment" to "creation science" and "evolution science" in the biology classroom. Why? The Act held that teaching evolution alone could violate the separation between church and state, to the extent that this would be hostile to "theistic religions." Therefore, the curriculum had to concentrate on the "scientific evidence" for creation science.
As far as I can see, industrialism, rather than Darwinism, has led to the decay of virtues historically protected by religions in the urban working class. Besides, every great tradition has its own equally fascinating religious cosmogony—for instance, the Indic tradition has an allegorical account of evolution apart from a creation story—but creationism is not defending all theistic religions, just one theistic cosmogony. This means there isn’t any “theological liberalism” in this assertion; it is a matter of one hegemon confronting what it regards as another hegemon—Darwinism.
So, why does creationism oppose Darwinism? Contrary to my earlier understanding from the scientific standpoint, I now think creationism looks at Darwin’s theory of evolution by natural selection not as a ‘scientific theory’ that infringes the domain of a religion but as an unusual ‘religion’ that oversteps an established religion’s doctrinal province. Creationism, therefore, looks to invade and challenge the doctrinal province of this “other religion.” In doing so, creation science, strangely, is a crude, proselytized version of what it seeks to oppose.
In its attempt to approximate a purely metaphysical proposition in practical terms, or to exoterically prove every esoteric precept, this kind of religious literalism takes away from the purity of esotericism and from the virtues of scientific falsification. Literalism forgets that esoteric writings enable us to cross the mind's tempestuous sea; they do not have to sink in that sea to prove anything.
In contrast to the virtues of science and popular belief, esotericism forces us to be self-reliant. We don't have to stand on the shoulders of others, and thus within a history of progress; instead, standing on our own two feet, we seek with the light of our inner experience. In this way, both science and the esoteric flourish in separate ecosystems but within one giant sphere of human experience, like prose and poetry.
In a delightful confluence of prose and poetry, Erasmus Darwin, the grandfather of Charles Darwin, wrote about the evolution of life in poetry in The Temple of Nature well before his grandson contemplated the same subject in elegant prose:
Organic life beneath the shoreless waves
Was born and nurs'd in Ocean's pearly caves;
First forms minute, unseen by spheric glass,
Move on the mud, or pierce the watery mass;
These, as successive generations bloom,
New powers acquire, and larger limbs assume;
Whence countless groups of vegetation spring
And breathing realms of fin, and feet, and wing.
The prose and poetry of creation — science and the esoteric; empirical and the allegorical—make the familiar strange and the strange familiar.
In 2009, the film An Education came out. It was a Bildungsroman of sorts, but it went beyond a coming-of-age story. It showed the life of the aspiring, post-World War II nouveau riche middle class. The protagonist is a schoolgirl whose aspirationally minded parents send her to a private school. They have no notion of what a good school is, how to tell if one is good, or why a person should attend one; they only know that "all the best" send their children to fee-paying schools. In the process, they're swindled. The film's characters descend into wallowing self-pity as nothing works out quite right for the protagonist. The story showcases a reality that we're dealing with today. A fiction created by and for the post-War nouveau riche is collapsing. Those caught in its net do not recognize that what is collapsing is a fantasy, believing instead that the social order is declining, that a social contract has been broken.
Adam Smith wrote in his Theory of Moral Sentiments (1759),
The rich man glories in his riches, because he feels that they naturally draw upon him the attention of the world, and that mankind are disposed to go along with him in all those agreeable emotions with which the advantages of his situation so readily inspire him. At the thought of this, his heart seems to swell and dilate itself within him, and he is fonder of his wealth, upon this account, than for all the advantages it procures him. The poor man, on the contrary, is ashamed of his poverty.
In the twenty-first century, achievement replaces riches as a font of glory. And that is good. It means that we have reached a point where being monied by itself is no longer a distinction. Achievement is the currency of the realm now (if anyone wants to imagine me saying that in the voice of Cutler Beckett from Pirates of the Caribbean, please feel free). What we face today is a society of rich men whose hearts have dilated but who haven't reinvested their wealth to procure advantage. And herein lies the rub: procuring advantage requires savoir-faire, and those who built and benefitted from the post-War nouveau riche society for the most part don't have it.
The savoir-faire needed is a type of street smarts, but for life and careers rather than the literal streets (though it's good to have that as well). This knowledge is not particularly secret, yet when caught out, the nouveau riche middle class squawks and cries foul. Take, for example, Abigail Fisher and her lawsuit against UT-Austin: the young woman required the Supreme Court to validate UT-Austin's assertion that graduating above average from a public school, playing in a section of a youth orchestra, and being one of two million Habitat for Humanity volunteers did not make her remarkable. The Abigail Fisher story is not one of racism or affirmative action. It's a story of a lack of savoir-faire. A person who thinks that any of the extracurriculars entered into evidence at the trial were résumé-enhancing is as deluded as the parents from An Education who thought that merely forking out money ensured social mobility, a better life, the prospect of great things.
Making unfounded assumptions is part of the nouveau riche middle class' lack of savoir-faire. A lawyer I know made it well into adulthood without knowing that advanced degrees from Ivy League schools are fully funded, i.e. free. Believing both that he could not afford an academic advanced degree and that it would not pay off professionally, he pursued law at a school where he received in-state tuition. He suffers from a stagnant career and fears that he doesn't have a vocation for law. In contrast, this lawyer has already been surpassed by another attorney I know who is still in her twenties. In addition to being fiercely intelligent overall, she is worldly-wise. When she decided on law school, she pursued only Ivy Plus schools precisely because they were the ones with the best funding and scholarships; name recognition was an aside. She received merit-based full tuition funding, a stipend, and major professional opportunities because she was a prize winner.
Nor is it an anomaly that the highest tiers of education are free, or less expensive than people assume. Those who have read George Eliot's Daniel Deronda know that Daniel was on track to win a fellowship which would have refunded his Cambridge tuition to his father, Sir Hugo Mallinger. This was a huge honor which had almost nothing to do with money. Sir Hugo was fabulously wealthy; he didn't need the money refunded – though of course it would have been nice. The honor was the primary attraction for him. "Daddy paid all my tuition" can't be put on a résumé; "___ Fellow" or "winner of ___ Scholarship" can be. To be clear, I fully support parents saving for their children's college. But I have had genuine conversations with people who told me that even though their children qualified for merit scholarships or grants, they wouldn't apply for them because "those are for poor people." The middle-middle class' false pride in refusing to apply for scholarships and grants, euphemistically called "paying their way," has only disadvantaged middle-middle class children as they arrive at the ages of twenty-two, twenty-four, twenty-six without plums, without proof of their abilities, without signs that someone, some institution, took a bet on them and that they held up their end of the bargain. Further, as Daniel Deronda's story shows, this false pride has never been the way of the upper class.
A lack of savoir-faire affects career shaping as well. I know two artists, one of whom is quite young, just over thirty, and highly successful; the other has had an unnecessarily disappointing career. Artistically, both are remarkable. The difference between the two is that since her undergraduate years the first has submitted her work to art journals, competitions, galleries, any opportunity where her art could be seen. Acceptance rates are less than one percent, and it is entirely standard for artists to apply multiple times to a single opportunity. The first artist has learned to take rejection on the chin, get up, and reapply. She said to me once, "all grant applications want to know what journals you've been published in, what shows you've had and where; all the journals and competitions want to know what grants you've won, where you've been published, and what shows you've had. The only thing to do is to keep having shows, keep applying, keep building." Her persistence shows as she wins more and more prizes, which in turn lead to bigger opportunities. She has already developed a name and reputation within the professional community.
In contrast, the second artist submitted her work to a handful of opportunities when she graduated but gave up when everything ended in rejection. Now, decades later, she understands how the "numbers game" works, and she's submitting her work for consideration again. Simply because she "didn't know," she has lost decades she could have spent building her professional reputation and career. She lives in a kind of disappointed daze, wondering why things never quite worked out. She has reached the point where she can support herself by art alone, but she feels as though somehow some vague, inchoate rules of the game have been broken.
This is not about having money. The artist who has drifted and the first lawyer both come from money and have enough of it to do anything they might desire. This is about knowing how things work. In the age of the internet, such ignorance and lack of savoir-faire are inexcusable, and it is only right for them to be treated as a flaw. Back in 1905, my great-grandfather, who had grown up in unfathomable, though genteel, poverty, managed to figure out that Harvard doctorates were fully funded. He pursued one, to the benefit of himself and his children, grandchildren, and great-grandchildren. If he could gather information and take action in 1905, in a part of the country without telephone, running water, electricity, or proper roads, what excuse is there for people today?
Gathering information and then taking action has another name: meritocracy. Both the left and the right have united against meritocracy, as predicted by Michael Dunlop Young, the man who coined the term. Both sides agree that meritocracy is bad because it is unfair and breaks social bonds, however fictitious those bonds may be. The rub with meritocracy is that it favors those who have savoir-faire over those who do not. And perhaps that is unfair. For example, the artists are equal in that each is technically strong and expresses interesting concepts with her art. Subjectively speaking, I would pay to go see the work of either of them in galleries. Yet one is ahead of the other in her career because she knew what was needed to cultivate her professional reputation and promote her work.
The second lawyer, the first artist, and the UT-Austin students who had superb applications all earned their laurels because they took the extra steps of gathering information and learning how the world works. We owe nothing to people who won't inform themselves, who create fictions about how the world around them works and then become angry when their ideas are shown to be fantasy. Saying that we are in a social contract with such people is coercive because such a contract compels those with savoir-faire to be the custodians of those without it. It would mean that Abigail Fisher would have to be admitted to the school of her choice because she "didn't know" that all her extracurriculars were ordinary; it would mean that the first lawyer would have to be elevated to equality with the second one, even though he is literally less knowledgeable about the law, because he "didn't know" that better quality legal training was within his grasp; it would mean that we would have to hold the artist who "didn't know" to submit to journals and gallery competitions as equal to the one who has a strong career and recognition because she submits her work regularly. How is this world, a society that excuses ignorance and panders to affluent but uninformed, uninquiring people, more fair than a meritocracy?
This fascinating isochrone map (showing how many days it took to get anywhere in the world from London in 1914) at first blush evokes the cliché that the world has now shrunk. Obviously it hasn't shrunk. While the distance between London and New Delhi is still 4,168 miles, what has shrunk is time, and this has had profound effects on our lives.
One of the great ironies of the remarkable proliferation of time-saving inventions is that they haven’t made life simple enough to give us more time and leisure. By leisure, I don’t mean some kind of virtuous inertia, but a deliberate organization of life based on a definite view of its meaning and purpose. In the urban world, the workweek hasn’t shortened. We still don’t have large swaths of time to really enjoy the good life with our families and friends.
In 1956, Sir Charles Darwin, grandson of the great Charles Darwin, wrote an interesting essay on the forthcoming Age of Leisure in the magazine New Scientist in which he argued: “The technologists, working for fifty hours a week, will be making inventions so the rest of the world need only work twenty-five hours a week. […] Is the majority of mankind really able to face the choice of leisure enjoyments, or will it not be necessary to provide adults with something like the compulsory games of the schoolboy?”
He was wrong in the first part: boredom is not our problem. The world may have shrunk, but cities have magnified. Travel technologies have incentivized us to live farther away, travel longer distances to work, and suffer anxiety attacks in peak-hour traffic (Google “Marchetti’s constant”). So, rather than being bored to death, our actual challenge is to avoid the psychotic breakdowns, heart attacks, and strokes that come from being accelerated to death.
Nonetheless, Sir Charles Darwin was right about “compulsory games” for adults. What else are the social media platforms, compulsive eating, selfies, texts, Netflix bingeing, and 24/7 news media that dominate our lives? They are the real opiates of the masses. We have been so conditioned to search for happiness in these anodyne pastimes that defying the urge appears to be a denial of life itself. It is not surprising that we can no longer confidently tell the difference between passing pleasure and abiding joy. Lockdown or no lockdown, we are all unwittingly playing these compulsory games with unwritten rules, believing that we are now that much closer to the good life of leisure. But are we?
“Time is the wealth of change, but the clock in its parody makes it mere change and no wealth.” — Rabindranath Tagore
What’s the first question in the field of public policy? According to the Indian economist Ajay Shah, it is: “What should the state do?” He says, “A great deal of good policy reform can be obtained by putting an end to certain government activities and by initiating new areas of work that better fit into the tasks of government.”
This question is especially essential for a weak state like India. But what if people prefer government subsidies, assertive intermediaries, and a weak state? I don’t know the answer. The story of the Indian farm protest is an illustrative example: it is a rebellion to stay bound to the old status quo, fearful of free choice.
Protest Timeline
04 June 2020: The Union Cabinet clears three ordinances meant to reform the Indian agricultural sector. These reforms upgrade farmers from being mere producers to free-market traders. Agriculture is a state subject in India, but state governments have had no political will to usher in these reforms, so the union government followed constitutional means to do so. China reformed its agriculture sector first and its other industries afterward; India is doing it the other way around, and thirty years late.
04 December 2020: The union government offers a workaround on the dilution of the MSP (minimum support price). The MSP sets an unnaturally high price and cuts out competition, so the middlemen club in the farmers’ associations of Punjab, Haryana, and U.P. will accept nothing less than the scrapping of these reforms.
Bottom line: A) The ordinances aim to liberalize agricultural trade and increase the number of buyers for farmers. B) Deregulation alone may not be sufficient to attract more buyers.
Almost every economist worth his salt acknowledges the merit of point A) and welcomes these essential reforms; thirty years late, but better late than never. Ajay Shah says, “We [Indians] suffer from the cycle of boom and bust in Indian agriculture because the state has disrupted all these four forces of stabilization—warehousing, futures trading, domestic trade and international trade. The state makes things worse by having tools like MSP and applying these tools in the wrong way. Better intuition into the working of the price system would go a long way in shifting the stance of policy.”
The middlemen, however, argue point B), which acts as a broad cover for their real fear: losing their upper hand in the current APMC/MSP system. Nobody denies that a sudden opening of the field to competition will threaten the middlemen’s income, but such uncertainties do not justify violent protests and slander campaigns that look to derail the entire process of upgrading the lives of the great majority of poor farmers in the country.
Even worse, these events get branded in broad strokes as state violence and human rights abuses through pre-planned Twitter and street campaigns and unnecessary road blockades. Everybody questions the internet outages during these protests, but no one questions the ethics of protesters blocking essential city roads. A section of Indian society and the diaspora hates Prime Minister Modi, for sure. I have no qualms with that, but reckless hate shouldn’t erase all nuance in analyzing perfectly sane reforms. Social justice warriors legitimize this vicious cycle of dissent without nuance: they don’t take the trouble even to read the farm bills, but make it a virtue to reason from their “bleeding hearts.”
Ordinarily, the Indian state works inadequately: it is confused when faced with a crisis, communicates a policy package that addresses the problem in a short-term way, and retreats into indifference. There are two aspects to this incompetence: first, a lack of political will, because special interest groups persuade the government toward the wrong objectives; and second, state capacity so weak that it fails to achieve the goal. The farm protest presents a hideous third kind of difficulty: a special interest group of assertive, influential middlemen wants a strong-willed, long-term-thinking Indian government policy (a rare entity) to sway toward short-termism under the pretext of human rights abuse. The hard left is actually helping the Indian state remain weak, and it will be the first to blame the state when it comes off as weak in the next debacle.
The most important historical question for understanding our rise from the muck to modern civilization is: how did we go from linear to exponential productivity growth? Call that question “who started modernity?” People often point to the industrial revolution, which certainly accelerated growth, but it is hard to say it caused the growth, because it came centuries after the initial uptick. Historians also bring up the Renaissance, but this is misleading, a product of the “written bias” of focusing on books rather than actions; the Renaissance was more like window dressing on the Venetian commercial revolution of the 11th and 12th centuries, which is, in my opinion, the answer to “who started modernity.” However, despite being the progenitor of modern capitalism (a subject worth a blog post in itself), Venice’s growth was localized and did not spread immediately across Europe; instead, Venice was the regional powerhouse that served as the example to copy. The Venetian model was also still proto-banking and proto-capitalism, with no centralized balance sheets, no widespread retail deposits, and a focus on Silk Road trade. Perhaps the next question is, “who spread modernity across Europe?” The answer to that question is far easier, and in fact centers to a huge degree on a single man, possibly the richest man of all time: Jakob Fugger.
Jakob Fugger was born to a family of textile traders in Augsburg in the 15th century. After training in Venice, he revolutionized banking and trading (the foundations on which investment, comparative advantage, and growth were built), reshaped relationships between commoners and aristocrats and the church’s view of usury, and even funded the exploration of the New World. He was the only banker alive who could call in a debt on the powerful Holy Roman Emperor, Charles V, mostly because Charles owed his power entirely to Fugger. Strangely, he is perhaps best known for his philanthropic innovations (founding the Fuggerei, one of the earliest recorded philanthropic housing projects and still in operation today); this should be easily overshadowed by:
His introduction of double-entry bookkeeping to the continent
His invention of the consolidated balance sheet (bringing together the accounts of all branches of a family business)
His invention of the newspaper as an investment-information tool
His key role in the pope allowing usury (mostly because he was the pope’s banker)
His transformation of Maximilian from a paper emperor with no funding, little land, and no power to a competitor for European domination
His funding of early expeditions to bring spices back from Indonesia around the Cape of Good Hope
His position as the only banker whom the Electors of the Holy Roman Empire would trust to fund the election of Charles V
His complicated, mostly adversarial relationship with Martin Luther that shaped the Reformation and culminated in the German Peasants’ War, when Luther dropped his anti-capitalist rhetoric and his hatred of Fugger to join Fugger’s side in crushing a modern-era messianic figure
His involvement in one of the earliest recorded anti-trust lawsuits (where the central argument was around the etymology of the word “monopoly”)
His dissemination, for the first time, of trustworthy bank deposit services to the upper middle class
His funding of the military revolution that rendered knights unnecessary and bankers and engineers essential
His invention of the international joint venture in his Hungarian copper-mining dual-family investment, where marriages served in the place of stockholder agreements
His 12% annualized return on investment over his entire life, beating index funds for almost five decades without the benefit of a public stock market (see the sketch after this list), dying the richest man in history.
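To put that last figure in perspective, here is a minimal compounding sketch. The 12% rate and the roughly five-decade span come from the list above; the unit starting stake and the exact 50-year count are assumptions for illustration only.

```python
# Minimal sketch: what a 12% annualized return compounds to over ~50 years.
# The 12% rate is from the post; the 50-year span and unit stake are assumed.
start = 1.0                        # arbitrary starting stake
rate = 0.12                        # annualized return
years = 50                         # roughly five decades
final = start * (1 + rate) ** years
print(f"growth multiple over {years} years: {final:,.0f}x")  # ~289x
```

A sustained 12% return multiplies a stake nearly 300-fold over fifty years, which is why the figure is so striking even without a public market to mark against.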
The story of Fugger’s family–the story, perhaps, of the rise of modernity–begins with a tax record of his family moving to Augsburg, with an interesting spelling of his name: “Fucker advenit” (Fugger has arrived). His family established a local textile-trading family business, and even managed to get a coat of arms (despite their peasant origins) by making clothes for a nobleman and forgiving his debt.
As the seventh of seven sons, Jakob Fugger was given the least important trading post in the area by his older brothers: Salzburg, a tiny mountain town about to have a change in fortune, as miners there hit the most productive vein of silver Europeans would find until the Spanish reached Potosí, the Silver Mountain, in Peru. He then began his commercial empire by taking a risk that no one else would.
Sigismund, the lord of Salzburg, was sitting on top of a silver mine but still could not turn a profit, because he was trying to compete with the decadence of his neighbors. He took out loans to fund huge parties and then, to expand his power, made the strategic error of attacking Venice, the most powerful trading power of the era. This was an era when sovereigns could void debts, or any contracts, within their realms without major consequences, so lending to nobles was a risky endeavor, especially without the backing of a powerful noble to force repayment or address breach of contract.
For exactly this reason, no other merchant or banker would lend to Sigismund for the venture, but where others saw only risk, Fugger saw opportunity. He saw that Sigismund was short-sighted and would constantly need funds; he also saw that Sigismund would sign any contract to get the money to attack Venice. Fugger fronted the money, collateralized by near-total control of Sigismund’s mines, if only he could enforce the contract.
Thus, the Fugger empire’s first major investment was in securing (1) a long-term, iterated credit arrangement with a sovereign who (2) had access to a rapidly-growing industry and was willing to trade its profits for access to credit (to fund cannons and parties, in his case).
What is notable about Fugger’s supposedly crazy risk is that, while it depended on enforcing a contract against a sovereign who could nullify it with a word, he still set himself up for a consistent, long-term benefit that could be squeezed from Sigismund so long as he continued to offer credit. Sigismund could not nullify the earlier contracts and instead had to recognize them in return for ongoing loan services; Fugger defused the urge toward betrayal by iterating the prisoner’s dilemma of default. He did not demand immediate repayment but set up a consistent revenue stream, establishing himself as Sigismund’s crucial creditor. Sigismund kept wanting finer things, and kept borrowing from Fugger to get them, which meant he could not default on the original loan that gave Fugger control of the mines’ income. Fugger countered an asymmetrical social relationship with asymmetric contract terms, and countered the desire to default by becoming essential.
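The logic here is the standard one for repeated games, and a toy calculation makes it concrete. Every number below is invented for illustration; only the structure mirrors the argument above.

```python
# Toy model of the iterated prisoner's dilemma Fugger set up.
# All payoffs are invented; none come from the historical record.
gain_from_default = 100   # one-time gain: void the contract, keep the mine income
value_per_loan = 30       # value Sigismund gets from each future round of credit
expected_rounds = 10      # how long he expects to keep needing Fugger

value_of_honoring = value_per_loan * expected_rounds  # the stream of future credit
# Defaulting pays once; honoring preserves the stream. With these numbers,
# 300 > 100, so even a purely self-interested sovereign keeps the contract.
print("honor" if value_of_honoring > gain_from_default else "default")
```

The design choice is the point: by remaining the only source of credit, Fugger kept the value of each future loan high and the expected number of rounds long, so default never paid.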
Eventually, Fugger met Maximilian, a disheveled, religion-and-crown-obsessed nobleman who had been elected Holy Roman Emperor specifically because of his lack of power. The Electors wanted a paper emperor in order to preserve the freedom of their principalities; Maximilian was so weak that a small town once arrested and beat him for trying to impose a modest tax. Fugger, unlike others, saw opportunity: he recognized that aligning paper trails (contracts, election outcomes) with real power relationships could align interests and set him up as the banker to emperors. When Maximilian came into conflict with Sigismund, Fugger refused any further loans to Sigismund, and Maximilian forced Sigismund to step down. Sigismund’s surrender and Maximilian’s new treaty included recognition of Fugger’s ongoing rights over the Salzburg mines, a sure sign that Fugger had found a better patron and solidified his claim through political maneuvering, by denying a loan to Sigismund and offering money to Maximilian instead. Once he had secured this cash cow, Fugger certainly found himself in risky scenarios, but he did not seek out risk, and he saw consistent yearly returns of 8% for several decades, followed by 16% in the last 15 years of his life.
From this point forward, Fugger was effectively creditor to the Emperor for the rest of Maximilian’s life, and he built a similar relationship: Maximilian paid for parties and military campaigns, and bought off Electors, with Fugger funds. As more of Maximilian’s assets were collateralized, Fugger’s commercial empire grew; he gained not only access to silver but also property. He was granted a range of fiefs, including Arnoldstein, a critical trade juncture where modern Austria, Italy, and Slovenia meet; his manufacturing and trade led the town to be renamed, for generations, Fuggerau, the Place of Fugger.
These activities, all of which depended on lending to sovereigns, raise a major question: how did Fugger get the money he lent to the Emperor? Early in his career, he noticed that deposit services with branches in different cities were a huge boon to the rising upper-middle class; property owners and merchants did not have access to reliable deposit services. So Fugger created a network of small branches, all offering deposits at low interest rates, on which he could grow his services through the dependability of moving and holding money for those near, but not among, society’s elites. This gave him a deep well of dispersed depositors, providing stable and dependable capital for his lending to sovereigns and for his expanding mining empire.
Unlike modern financial engineers, who seem to focus on creative ways to go deeper into debt, Fugger directed his creativity at new ways to offer credit; he was most powerful when he was the only reliable source of credit to a political actor. So long as the relationship was ongoing, default risk was mitigated, and through this Fugger could control the purse strings of a wide range of endeavors. For instance, early in their relationship (after Maximilian deposed Sigismund and, as part of the arrangement, made Fugger’s interest in the Salzburg mines more permanent), Maximilian wanted to march on Rome as Charlemagne reborn and demand that the pope personally crown him; he was rebuffed dozens of times not by his advisors but by Fugger’s denial of the credit needed to hire the requisite soldiers.
Fugger also innovated in information exchange. Because he ran a broad trading and banking business, he stood to lose a great deal if a region had a sudden shock (like a run on his banks) and to gain if new opportunities arose (like a shift in silver prices). He took advantage of the printing press, less than 40 years after Gutenberg and in a period when most printed matter was religious, to create the first proto-newspaper, which he used to gather and disseminate investment-relevant news. Thus, while he operated a network of small branches, he vastly improved the information flow among these nodes and also standardized and centralized their accounting (including producing the first centralized, combined balance sheet).
With this broad base of depositors and a network of informants, Fugger proceeded to change how war was fought and to redraw the maps of Europe. Military historians have debated for decades when the “military revolution” that shifted the weapons, organization, and scale of war began, often centering on the Swedish armies of the 1550s. I would counter that the Swedes simply continued a trend the continent had begun in the late 1400s, in which:
Knights’ training became irrelevant as gunpowder took over
Logistics and resource planning were professionalized
Early mechanization of shipbuilding, arms manufacturing, and mining shifted war from labor-centric to a mix of labor and capital
Multi-year campaigns became possible thanks to better information flow, funding, and professional organization
Armies, especially mercenary groups, ballooned in size
Continental diplomacy became more centralized and legalistic
Wars were decided by access to creditors more than by access to trained men, because credit could multiply wartime recruitment and production far beyond tax receipts
Money mattered in war long before Fugger: Roman usurpers always seized the mints first, and Alexander’s campaigns showed that logistics and supply mattered more than pure numbers. But the 15th century saw a change: armies became a matter of guns, mercenaries, technological development, and investment, and above all credit, and Fugger was the single most influential creditor of European wars. After a trade dispute with the aging Hanseatic League over its monopoly on key trading ports, Fugger manipulated the League’s cities into betraying each other, culminating in a war in which those funded by Fugger broke the League’s monopolistic power. Later, because he had a joint venture with a Hungarian copper miner, he pushed Charles V into an invasion of Hungary that resulted in the creation of the Austro-Hungarian Empire. These are but two examples of Fugger destroying political entities; every Habsburg war fought from the rise of Maximilian through Fugger’s death in 1525 was funded in part by Fugger, giving him the power of the purse over such seminal conflicts as the Italian Wars, in which Charles V fought on the side of the Pope and Henry VIII against Francis I of France and Venice, culminating in a Habsburg victory.
Like the Rothschilds after him, Fugger gained hugely from a reputation for being “good for the money”; while other bankers did their best to take advantage of clients, he provided consistency and dependability. Like the Iron Bank of Braavos in Game of Thrones, Fugger was the dependable source for ambitious rulers, backed by the constant threat of denying credit to, or even making war on, any defaulter. His central role in manipulating political affairs through his banking is well attested by the election of Charles V in 1519. The powerful rulers of Europe (Francis I of France, Henry VIII of England, and Frederick III of Saxony) all offered huge bribes to the Electors. Because these sums exceeded half a million florins, the competition rapidly became one not for the favor of the Electors but for access to capital. The Electors in fact stipulated that they would not take payment based on a loan from anyone except Fugger; since Fugger chose Charles, so did they.
Fugger also inspired great hatred among populists and religious activists; Martin Luther, a contemporary, called Fugger out by name as part of the problem with the papacy. The reason? Fugger was the personal banker to the Pope, who was pressured into rescinding the church’s previously negative view of usury. He also helped arrange the scheme to fund the construction of the new St. Peter’s Basilica; in fact, half of the indulgence money putatively raised for the basilica actually went to pay off the Pope’s huge existing debts to Fugger. To Luther, then, Fugger was greed incarnate, and Fugger’s name became known to the common man not for his innovations but for his connection to papal extravagance and greed. This hatred culminated in the German Peasants’ War of 1525, which saw an even more radical reformer and messianic figure lead hordes of hundreds of thousands against Fuggerau and many other fortified towns. Luther himself inveighed against these mobs for their radical demands, and Fugger’s funding brought swift military action that put an end to the war, but not to the Reformation or to the hatred of bankers, which would explode violently across Germany over the next 100 years.
This brings me to my comparison: Fugger against all the great wealth creators in history. What makes him stand head and shoulders above the rest, to me, is that his contributions cross so many major facets of society. Like Rockefeller, he used accounting and technological innovations to expand the distribution of a commodity (silver rather than oil), and he was also one of the OG philanthropists. Like the Rothschilds, with their development of the government bond market and reputation-driven trust, Fugger’s balance-sheet inventions and trusted name improved the infrastructure for the flow of capital, trust in banks, and the literal tracking of transactions. But no other capitalist had as central a role in religious change, both as the driving force behind the allowance of usury and as an anti-Reformation leader. Similarly, few others had as great a role in the Age of Discovery: Fugger funded Portuguese spice traders in Indonesia, possibly bankrolled Magellan, and funded the expedition that founded Venezuela (named in honor of Venice, where he trained). Lastly, no other banker had as influential a role in political affairs; from dismantling the Hanseatic League, to deciding the election of 1519, to building the Habsburgs from paper emperors into the most powerful monarchs in Europe within two generations, Fugger was the puppeteer of Europe, and such an effective one that you have barely heard of him. Hence, Fugger was not only the greatest wealth creator in history but among the most influential people in the rise of modernity.
Fugger’s legacy can be seen in his balance sheet of 1527. He essentially developed the method of using a balance sheet for central management; its only liabilities were widespread deposits from the upper-middle class (his asset-to-debt ratio was around 7-to-1, leaving an astonishingly large amount of equity for his family); and every important leader on the continent was literally in his debt. It also showed him to hold over 1 million florins in personal wealth, making him one of the world’s first recorded millionaires. The title of this post was adapted from a self-description Jakob wrote as his own epitaph. As my title suggests, I think it is fairer to credit his wealth creation than his wealth accumulation, since he revolutionized multiple industries and changed the history of capitalism, trade, European politics, and Christianity, above all through his contribution to the credit revolution. The man himself, however, worked until the day he died and took great pride in being the richest man in history.
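For readers who want the 7-to-1 figure made concrete, here is a minimal sketch. Only the ratio comes from the paragraph above; the florin amounts are hypothetical placeholders.

```python
# Minimal sketch of what a 7-to-1 asset-to-debt ratio implies about equity.
# Only the ratio comes from the post; the unit amounts are hypothetical.
liabilities = 1.0              # deposits owed to upper-middle-class savers
assets = 7.0 * liabilities     # asset-to-debt ratio of 7-to-1
equity = assets - liabilities  # what the Fugger family itself owned
print(f"equity share of assets: {equity / assets:.0%}")  # ~86%
```

By comparison, a modern bank typically operates with equity around a tenth of assets or less, which is what makes an equity share of roughly 86% so astonishing.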
If two politicians are equal in every other respect but one was better at basketball… I guess go with that one? I mean, all else equal they’re maybe a better team player or something. But that line of thinking doesn’t mean we should only ever vote for ex-NBA stars.
There are plenty of similar, potentially attractive signals: veteran status, success in business (or being a fake billionaire), academic success, acting, and so on. Some signals are stronger, and some imply a smaller pool of candidates. If there are more successful businesspeople in the world, we should expect to see more of them transition into politics than, say, world-class bowlers. Likewise, if the signal is more relevant (a law degree versus a paleontology degree), it makes sense to see more of them in the wild.
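A toy base-rate sketch makes the point; all of the pool sizes and transition rates below are invented for illustration.

```python
# Toy base-rate sketch: expected politicians = pool size x transition rate.
# Every number here is invented; only the logic mirrors the argument above.
pools = {
    "successful businesspeople": (1_000_000, 0.002),  # big pool, generic signal
    "world-class bowlers":       (5_000,     0.002),  # tiny pool, same rate
    "law degree holders":        (1_300_000, 0.010),  # big pool, relevant signal
}
for signal, (size, rate) in pools.items():
    print(f"{signal}: ~{size * rate:,.0f} expected in politics")
```

With a pool two hundred times larger at the same transition rate, businesspeople should outnumber bowlers in politics two hundred to one, before any difference in signal relevance even enters the picture.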
That 18% of German politicians have PhDs seems wild to me. Maybe I’m biased because I work in an organization full to the brim with PhDs. But that many politicians with doctorates seems about as reasonable, and about as likely, as half of Congress being elite athletes.