The case for Constructivism in IR

Part one: The fourth debate and the origins of Constructivism

Recently there has been a surge in blog posts dealing with International Relations (IR) theory on this blog. Dr. Rosi has stated that he thinks the paradigm of Realism best explains world politics. On the other side, Dr. van de Haar has proven to be an expert in the liberal tradition of IR, putting forward nuanced explanations of different subcategories of liberalism and even making a distinction between liberal and libertarian IR theory (as an IR undergrad, I can say this is not a very common thing to do). Although it may seem like a mismatch at first glance that an undergrad tries to argue against two scholars who have spent a significant part of their lives doing research in this particular subject, I at least want to try to make a convincing case for Constructivism in world politics. However, I do not want to boil down such a diverse and heterogeneous tradition of thought into a few hundred words, which is why I am presenting it as a series. After describing the historical circumstances of its emergence, I’d like to summarize the key points of constructivist philosophy and then conclude with how this school of thought has developed since.

Constructivism – The origins

While the origins of Liberalism and Realism can be traced back to the beginning of the 20th century, Constructivism is a relatively new school of thought, emerging after the end of the Cold War around 1990. Looking back, this was one of the most interesting and turbulent times for IR scholars. The peaceful collapse of the Soviet Union left structural realists and neorealists puzzled: how could the security dilemma be resolved peacefully? The end of the Cold War caused the theoretical hegemony of neorealism to wane. By contrast, Liberalism as an IR theory regained interest, culminating in the liberal manifesto of Francis Fukuyama, who proclaimed a bright future for liberal democracy by conflating Hegel’s historical dialectics with democratic peace theory.

Simultaneously, the inter-paradigm debate (arguably the third big debate in IR) had gained momentum and showed that IR scholars saw the neorealist–neoliberal distinction as a significant barrier inhibiting actual research progress. Instead, it became clear that scholars were looking for a via media approach that would focus on predictions and results rather than sharp theoretical distinctions, leading to the so-called Neo-Neo synthesis. Keeping the end of the third big debate in mind, it becomes clearer how the fourth big debate unfolded.

Since the 1980s, scholars had begun to question the positivist research agenda as well as its limited methods for explaining world politics. Yosef Lapid, Friedrich Kratochwil and Richard K. Ashley published excellent works that directly attacked the determinism of traditional IR theory. Over the next ten years, the criticism grew harsher and harsher. Scholars feared a fallback into another sharp distinction between radical constructivist and traditional IR schools – the kind of rift that had once sparked the inter-paradigm debate.

In 1992 Alexander Wendt published his well-known paper “Anarchy Is What States Make of It”, whose title eventually became a (or maybe the first?) catchphrase in IR. If I had to choose one essay for understanding IR after 1990, it would certainly be this one. With this essay, Wendt effectively tries to build a bridge between the newly emerging tradition (radical constructivism) and the traditional schools of thought (namely, liberal institutionalism).

In order to differentiate Constructivism from other IR schools of thought, it is useful to recall how each conceptualizes anarchy. Although we might first associate anarchy with total chaos, it simply describes the absence of a centralized authority – a condition that virtually all main schools agree characterizes world politics today. The conceptualization of anarchy, however, is the key to distinguishing these schools of thought.

Power and the construction of anarchy

Contrary to the deterministic construction of anarchy in Realism, Liberalism and Marxism, Constructivism introduced anarchy as a dynamic variable in the international system. As the title states, “Anarchy is what states make of it” refutes anarchy as an axiom from which we can derive theories. Instead, Alexander Wendt emphasizes the vital moment of the “first contact” between states, from which, in a process of ego–alter construction, an anarchic international system is constituted. The key is that the intersubjective perception of the “other” determines whether we are friendly, hostile or neutral towards another state. Hence, anarchy does not always operate in the same way. Wendt instead distinguishes at least three kinds of anarchy:

  • Hobbesian anarchy – states perceive each other as predators; no international cooperation
  • Lockean anarchy – states perceive each other as neutral; constant adjustment of the balance of power operates as a regulatory principle
  • Kantian anarchy – states perceive each other as beneficial allies; international cooperation becomes possible and international organizations emerge.

In the real world, we can observe (at least in Europe) how the international system went through all of these stages: pre-1648 Europe was essentially a Hobbesian playground until the Peace of Westphalia established a respected international system of sovereign nation-states, ushering in Lockean anarchy. In the early 20th century, globalization began to gain momentum, and suddenly international organizations and institutions began to flourish, heralding Kantian anarchy. This development, of course, does not prevent states from self-interested behaviour, but it puts effective constraints on such acts through complex interdependent relations. Although I do not share the optimism of scholars for whom this development indicates a future transcendence into a Kantian-like world state, it is nonetheless remarkable how humankind has “tamed” (or at least constrained) anarchy in the international system.

This development goes hand in hand with the conceptualization of power. In the eyes of realists, power is determined by brute factors revolving mostly around military force. It was not until Joseph Nye and Robert Keohane introduced complex interdependence theory that soft factors such as culture, ideology and the identity of states were considered influential in international politics. This wider concept of power is subject to constant change through a process of “complex learning”: state identities are not stable but fluid, dependent on institutional changes. Thus, states are able to learn and adapt to their political environment. Early (or moderate) constructivist thought sought to build a bridge between liberals, who were convinced that state identities could change through institutional changes, and early radical constructivists, who clearly despised the positivist research agenda of traditional IR schools.

When we analyze power in the real world today, it becomes clear that a narrow focus on military force fails to capture real power politics. What is more dangerous for the US: 50 nuclear missiles in North Korea, or 50 nuclear missiles in Canada? What is perceived to undermine western values more significantly: 50 Buddhist preachers, or 50 imams? Power cannot be measured only in quantitative terms; one has to take soft factors into consideration and put brute facts into their cultural context.

To conclude my first takes:
Traditional schools of thought in IR fail to recognize anarchy and power as heterogeneous and dynamic variables. Constructivism points out this blind spot and seeks to connect new concepts with a post-positivist ontology, epistemology and methodology.

In the next part of the series, I want to differentiate moderate Constructivism from Radical Constructivism, Postmodernism and Poststructuralism and demonstrate what new fields of research have opened up in the course of a constructivist approach.

Relicts of the past? The current challenges for diplomacy

The last few weeks have been quite a blast for me: I’ve been interning at the German embassy in Rome. A new job in a new city. I thought I would process my experiences here in one (or a few?) articles.

It’s been quite a rough month for Germany’s Federal Foreign Office. First, Daniel Kriener, the German ambassador to Venezuela, was forced to leave the country after welcoming interim president Guaidó at the airport in Caracas. Interestingly, although plenty of other diplomats joined him, he was the only one declared “persona non grata” for interfering in Venezuela’s internal affairs. A few weeks later, a deputy speaker of the German Bundestag (who is also a member of the liberal party) demanded the expulsion of US ambassador Grenell for the same offence. Prior to this, the US diplomat had criticized Germany’s plan to renege on its promise to contribute more to NATO’s defence budget. Although I politically agree with the actions of both diplomats in these cases, they delineate the ongoing structural changes in the diplomatic sector. To illustrate this, I will first provide a theoretical framework for analyzing current diplomatic challenges before trying to examine the role of diplomacy in the future.

Principal-Agent Theory and decreasing relevance

I conceive of diplomacy mostly as a principal-agent problem. I believe that many problems in diplomatic negotiations can be traced back to the classic effects of asymmetric information. Since two principals – in this case two states – cannot in most cases negotiate with each other directly, these negotiations are carried out between various agents. Those agents are of course not always the ambassadors. Broadly understood, the principal-agent paradigm applies to every negotiating process initiated by the state.

Through the lens of the principal-agent paradigm, I perceive the main task of diplomacy to be achieving a good negotiating position, for example through an informational advantage. However, due to globalization, state-to-state diplomacy has been drastically weakened. The negotiating game is now mostly carried out within other institutions with lower transaction costs. Two countries want a new trade deal? Just follow WTO rules. Want to sue another country? Call the International Court of Justice. A few voices have made reasonable arguments even for abolishing unnecessary embassies and keeping only the crucial ones. The Trump administration, for example, seems in no hurry to fill the roughly 18 vacant ambassadorships.

Certainly, globalization combined with the expansion of robust institutions leaves little space for traditional diplomacy as a driving force in interstate relations. This is not necessarily a bad development: as Paul W. Meerts points out, it can be a huge chance for weaker states, since negotiating in multilateral rather than bilateral constellations tends to weaken the position of stronger states. Thus, playing out its trump cards in negotiations becomes harder for the hegemon. We can currently witness this in the Brexit debate: even though the strong states, Germany and France, have a vast repertoire of power resources to use as leverage against Britain in the negotiations, they can hardly deploy them through the EU’s multilateral negotiating structure.

Conversely, there are also recent examples of bilateral traditional diplomacy being deployed successfully. China’s initiation of Italy’s accession to the Belt and Road Initiative (see Tridivesh Singh Maini’s great article here for a quick overview) is a prime example. But no other case shows the weaknesses of bilateral diplomacy more drastically: China was able to transpose its tremendous power resources into a deal which heavily favours the Chinese economy. The very ambiguous agreement laid down a strategy of “closer economic collaboration.” The opposition’s criticism of the deal, coming from the far left and the right, is based on economic nationalism and thus misses the important point. The Chinese government exerts immense influence on key enterprises like Tencent, Alibaba and Baidu: fundamental digital research topics such as AI were distributed to these firms not through competition but through the state (I highly recommend Amy Webb’s EconTalk episode if you want to dig deeper into this). Once China builds sufficient digital infrastructure here in Europe, network effects and technological advantage will come into effect and engender high entry barriers and exit costs. This will make it easy for China to enforce its regulations rather than obey European ones. Although it is hard to determine conclusively whether multilateral negotiations would have secured a politically better deal, I favour the higher short-term transaction costs of multilateral negotiations over the long-term threat described above.

Embassies as service providers

Of course, securing a good interstate negotiating position is not the only task of an embassy. A popular counterargument is that the principal-agent perspective neglects the vital daily business of embassies: helping their citizens abroad. Speaking of large and prestigious embassies, though, I expect that their role as service providers for citizens living abroad will further decline. Most of their routine work will become redundant due to technological progress and further institutionalization. Renewing passports, issuing visas and transporting coffins back home (yep) are frequent tasks, but easy to outsource to private actors in the future.

But what is the role for ambassadors and embassies then?

This question is where it gets interesting, in my opinion. Deeply rooted in international conventions and customary international law, discreet and silent work has been a prerequisite for an ambassador. Carefully collecting small pieces of information and building bridges to local actors was the key to a good negotiating position. But as elaborated above, international institutions now do the job more efficiently. A new role for ambassadors as advocates of concrete policy measures would be diametrically opposed to international conventions. Based on the premise that legality creates legitimacy, a further politicization of diplomacy does not currently command majority support and is thus unlikely to be buttressed by legal means.

However, if we fall back into a narrative of nationalism, bilateral diplomacy will regain relevance. Otherwise, it will continue to slowly lose importance and eventually wane. Hence, the main challenge nowadays is to find the right niche for traditional diplomacy – and it seems that it has not been found yet.

Libertarianism and Neoliberalism – A difference that matters?

I recently saw a thoroughgoing Twitter conversation between Caleb Brown, whom most of you presumably know from the Cato Daily Podcast, and the Neoliberal Project, an American project founded to promote the ideas of neoliberalism, regarding the differences between libertarianism and neoliberalism. For those who follow the debate, it is nothing new that the core of this contention goes way beyond the etymological dimension – it concerns one of the most crucial topics in liberal scholarship: the relationship between government and free markets.

Arbitrary categories?

I can understand the aim of further structuring the liberal movement into subcategories representing different types of liberalism. Furthermore, I often use these subcategories myself to distance my political ideology from liberal schools I do not associate with, such as paleo-libertarianism or anarcho-capitalism. However, I do not see such a distinct line between neoliberalism and libertarianism in practice.

As described by Caleb Brown (and agreed on by the Neoliberal Project), neoliberalism wants to aim the wealth generated by markets at specific social goals through government mechanisms, whilst libertarianism focuses on letting the wealth created by free markets flow where it pleases, so to speak. In my opinion, the “difference” between these schools is rather a spectrum of trust in government measures, with libertarianism on one side and neoliberalism on the other.

I’ve often reached a certain point in the same discussion with fellow liberals:

Neoliberal: I agree that free markets are the most efficient tool to create wealth. They are just not very good at distributing it. By implementing policy X, we could help to correct market failure Y.

Libertarian: Yeah, I agree with you. Markets do not distribute wealth efficiently. However, the government has also done a poor job trying to alleviate the effects of market failures, especially when we look at case Z… (Of course, libertarians bring forth arguments other than public choice, but it is a suitable example.)

After reaching this point, advocating for governmental measures to fix market failures often becomes a moral and personal objective. My favourite example is emissions trading. I am deeply intrigued by the theoretical foundation of the Coase theorem and how market participants can still find a Pareto-efficient equilibrium just by negotiating. Based on this theoretical framework, I would love to see a global market for trading carbon emissions.

However, various mistakes were made during the implementation of emission allowances. First, there were far too many allowances on the market, which caused the price to drop dangerously low. Additionally, important markets such as aviation and shipping were initially left out. All in all, a policy buttressed by solid theory had a more than rough start due to poor implementation.
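To see why an oversupplied cap collapses the price, consider a toy permit market (a sketch with made-up numbers, not a model of the actual EU scheme): firms abate their cheapest tons first, so the permit price settles at the marginal abatement cost of the last ton the cap forces them to cut. If the cap meets or exceeds actual emissions, nobody needs a permit and the price falls to zero.

```python
# Toy allowance market: the permit price equals the marginal abatement
# cost of the last ton that the cap forces firms to abate.
# All numbers are hypothetical.

def permit_price(abatement_costs, baseline_emissions, cap):
    """abatement_costs: cost per ton of each abatable ton.
    Firms abate the cheapest tons first; the market price settles at
    the cost of the marginal abated ton. A cap at or above baseline
    emissions means no abatement is needed and permits are worthless."""
    required = baseline_emissions - cap
    if required <= 0:
        return 0.0  # oversupplied cap: allowances trade at zero
    costs = sorted(abatement_costs)
    return float(costs[required - 1])

costs = [5, 8, 12, 20, 35, 60, 90, 150]  # EUR/ton for 8 abatable tons

tight = permit_price(costs, baseline_emissions=8, cap=4)  # -> 20.0
loose = permit_price(costs, baseline_emissions=8, cap=8)  # -> 0.0
```

The same mechanism is at work when regulators hand out allowances close to (or above) what firms emit anyway: the marginal allowance becomes nearly worthless, and the price signal disappears.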

At this point, neoliberals and libertarians diverge in their responses. A libertarian sees another failure of the government to implement a well-intended policy, whereas a neoliberal sees a generally good policy which just needs a bit further improvement. In such cases, the line between neoliberals and libertarians becomes very thin. And from my point of view, we make further decisions based on our trust in the government and on our subjective-moral relation to the topic as well.

I have too often seen government fail at things (e.g. industrial policy) that should be left almost entirely to free markets. However, I have also seen the same government struggling to find an adequate response to climate change. Accordingly, I believe that officials should carry on with their endeavours to counteract climate change, whereas they should stay out of industrial policy.

Furthermore, in the recent past there has been a tremendous number of libertarian policy proposals put forward that remodel the role of government in a free society: a libertarian case for mandatory vaccination? Alright. A libertarian case for UBI? Not bad. A libertarian case for a border wall? I am not so sure about that one.

Although these examples may define libertarianism in their own context, the general message remains clear to me: libertarians are prone to support governmental measures if they rank the value of a specific end higher than the risk of a failed policy. Since an article like this is not the right framework for gathering a robust amount of data to prove my point empirically, I rely on the conjecture that the core question of where the government must interfere is heavily driven by subjective moral judgements.

Summary

Neoliberals and Libertarians diverge on the issue of government involvement in the economy. That’s fine.

Governmental policies often do not fully reach their intended goals. That’s also fine.

The distinction between neoliberals and libertarians merely marks a threshold of how much trust one puts in the government’s ability to cope with problems. Neither school should place too much value on this distinction, since it is an incredibly subjective issue.

Evidence-based policy needs theory

This imaginary scenario is based on an example from my paper with Baljinder Virk, Stella Mascarenhas-Keyes and Nancy Cartwright: ‘Randomized Controlled Trials: How Can We Know “What Works”?’ 

A research group of practically minded military engineers is trying to work out how to effectively destroy enemy fortifications with a cannon. They are going to be operating in the field in varied circumstances, so they want an approach that has as much general validity as possible. They understand the basic premise of pointing and firing the cannon in the direction of the fortifications. But they find that the cannonball often fails to hit the target. They have some idea that varying the vertical angle of the cannon seems to make a difference. So they decide to test-fire the cannon in many different cases.

As rigorous empiricists, the research group runs many trial shots with the cannon raised, and also many control shots with the cannon in its ‘treatment as usual’ lower position. They find that raising the cannon often matters. In several of these trials, they find that raising the cannon produces a statistically significant increase in the number of balls that destroy the fortifications. Occasionally, they find the opposite: the control balls perform better than the treatment balls. Sometimes they find that both groups work, or don’t work, about the same. The results are inconsistent, but on average they find that raised cannons hit fortifications a little more often.

A physicist approaches the research group and explains that rather than just trying to vary the height the cannon is pointed in various contexts, she can estimate much more precisely where the cannon should be aimed using the principle of compound motion with some adjustment for wind and air resistance. All the research group need to do is specify the distance to the target and she can produce a trajectory that will hit it. The problem with the physicist’s explanation is that it includes reference to abstract concepts like parabolas, and trigonometric functions like sine and cosine. The research group want to know what works. Her theory does not say whether you should raise or lower the angle of the cannon as a matter of policy. The actual decision depends on the context. They want an answer about what to do, and they would prefer not to get caught up testing physics theories about ultimately unobservable entities while discovering the answer.
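The physicist's offer can be sketched in a few lines (idealized projectile motion, ignoring the wind and drag corrections she mentions; the numbers are purely illustrative): from the range formula d = v²·sin(2θ)/g, the required launch angle follows directly from muzzle speed and target distance.

```python
import math

# Idealized projectile motion on level ground (no drag):
# range d = v^2 * sin(2*theta) / g, so theta = 0.5 * asin(g*d / v^2).

def launch_angle(v, d, g=9.81):
    """Return the low-arc launch angle in degrees that lands a shot
    at distance d for muzzle speed v, or None if d is out of range."""
    s = g * d / v**2
    if s > 1:
        return None  # target beyond maximum range (even at 45 degrees)
    return math.degrees(0.5 * math.asin(s))

angle = launch_angle(v=100.0, d=500.0)  # roughly 14.7 degrees
```

The point is not that the formula is exotic; it is that the theory answers "raise or lower, and by how much?" for any distance, whereas the trial-and-error program can only report noisy average treatment effects across contexts.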

Eventually the research group write up their findings, concluding that firing the cannon pointed with a higher angle can be an effective ‘intervention’ but that whether it does or not depends a great deal on particular contexts. So they suggest that artillery officers will have to bear that in mind when trying to knock down fortifications in the field; but that they should definitely consider raising the cannon if they aren’t hitting the target. In the appendix, they mention the controversial theory of compound motion as a possible explanation for the wide variation in the effectiveness of the treatment effect that should, perhaps, be explored in future studies.

This is an uncharitable caricature of contemporary evidence-based policy (for a more aggressive one see ‘Parachute use to prevent death and major trauma related to gravitational challenge: systematic review of randomised controlled trials’). Metallurgy has well-understood, repeatedly confirmed theories that command consensus among scientists and engineers. The military have no problem learning and applying this theory. Social policy, by contrast, has no theories that come close to that level of consistency. Given the lack of theoretical consensus, it might seem more reasonable to test out practical interventions instead and try to generalize from empirical discoveries. The point of this example is that without theory empirical researchers struggle to make any serious progress even with comparatively simple problems. The fact that theorizing is difficult or controversial in a particular domain does not make it any less essential a part of the research enterprise.

***

Also relevant: Dylan Wiliam’s quip from this video (around 9:25): ‘a statistician knows that someone with one foot in a bucket of freezing water and the other foot in a bucket of boiling water is not, on average, comfortable.’

Pete Boettke’s discussion of economic theory as an essential lens through which one looks to make the world clearer.

The Impossible Trinity of Liberal Democracy

In the first part of my series on democracy, published a few years ago, I made a distinction between four senses in which the term “democracy” is used. To briefly recap, they were: a) a term of empty political praise for policies which partisans like, b) an institutional decision-making process emphasizing the primacy of majoritarian opinion, c) a generic term for the type of procedures which have been prevalent in the west, and d) an overarching term for the ethical commitments of liberals. In that series, I focused on the tension between b) and d), mostly ignoring a) and c). (For present purposes, my highly speculative musings on anarchism are irrelevant.)

In a recent episode of the Ezra Klein Show (which I highly recommend) discussing his book The People vs. Democracy: Why Our Freedom Is in Danger and How to Save It, Harvard political theorist Yascha Mounk debated with Ezra Klein how pessimistic we should be about the prospects for the future of American democracy. I don’t really wish to comment on whether we should be pessimistic or not, but I want to make a further distinction that clarifies some of the disagreements and points towards a deeper issue in the workings of democratic institutions. I will argue that democracy consists of a liberal, a majoritarian, and a procedural dimension, and that these dimensions are not reconcilable for very long.

Mounk makes a similar distinction to the one I made between democratic majoritarianism and liberalism as a reason to be pessimistic. Klein tended to push back, focusing on the ways in which modern American political culture is far more ethically liberal than it has ever been, as seen through the decline in racism since the middle of the twentieth century and decline in homophobia since the 1990s. Mounk, however, emphasized how respect for procedure in the American political process has declined during the Trump Era, as evidenced by Trump’s disrespect for the political independence of courts and agencies like the Department of Justice.

However, throughout Klein’s and Mounk’s debate, it became clear that there was another distinction which needed to be made explicitly, and one which I have tended to heavily under-emphasize in my own thinking on the feasibility of democracy. It seems to me there are at least three dimensions by which to judge the functioning of democracies which are important to distinguish:

  1. Majoritarianism—the extent to which a democracy is sensitive to majority public opinion. Democracy, in this dimension, is simply the tendency to translate majority opinion to public policy, as Mounk puts it.
  2. Liberalism—this refers to the ethical content towards which democracies in the west try to strive. This is the extent to which citizens are justly treated as moral equals in society; whether minority religious freedoms are respected, racial and ethnic minorities are allowed equal participation in society (economically and politically), and the extent to which general principles of liberal justice (however they may be interpreted) are enacted.
  3. Legal proceduralism—the extent to which political leaders and citizens respect the political independence of certain procedures. This dimension heavily emphasizes the liberal belief in the rule of law and the primacy of process. This can include law enforcement agencies such as the Department of Justice or the FBI, courts, and respect for the outcomes of elections even when partisan opponents are victorious.

It seems that there are reasons why one would want a democracy to retain all three features. Majoritarianism could be desirable to ensure stability, avoiding populist revolutions and uprisings, and perhaps because one thinks it is just for government to be accountable to citizens. Liberalism, clearly, is desirable to ensure the society is just. Proceduralism is desirable to maintain the stability of the society given that people have deep political and philosophical disagreements.

Klein and Mounk’s debate, considering this explicit triadic distinction, can be (crudely) seen as Mounk initially emphasizing the tension between majoritarianism and liberalism in modern democracies. Klein pushes back saying that we are more liberal today than we’ve ever been, and perhaps the current majoritarian populist turn towards Trump should be put in context of other far more illiberal majoritarian populist impulses in the past. Mounk’s response seems to be that there’s also been a decline in respect for legal procedure in modern American politics, opening a danger for the instability of American democracy and a possible rise of authoritarianism.

First, it seems to me that both Mounk and Klein overemphasize respect for procedure in the past. As John Hasnas has argued, it has never been the case that anyone treats the law as independent, simply because “the law is not a body of determinate rules that can be objectively and impersonally applied by judges” and therefore “what the law prescribes is necessarily determined by the normative predispositions of the one who is interpreting it.” There is always an ethical, and even a partisan political, dimension to how one applies procedure. In American history, this can be seen in the ways courts have very clearly interpreted law in motivated fashion to justify partisan, often illiberal, political views, as in Bowers v. Hardwick. There has always been a tendency for procedures to be applied in partisan ways, from the McCarthyite House Un-American Activities Committee to the FBI’s persecution of civil rights leaders. Indeed, as Hasnas argues, the idea that procedures and laws can be entirely normatively and politically independent is a myth.

It is true, however, that Mounk presents reason to believe that populism makes disrespect for these procedures explicit. Perhaps one can say that while procedural independence is, in a pure sense, a myth, it is a constructive myth that maintains stability. When people come to believe that elections are not independent, or when Trump disrespects the independence of courts and the justice system, those institutions can disintegrate into nothing but a Carl Schmitt-style, zero-sum war for power that undermines the stability of political institutions.

On the other hand, it seems worth emphasizing that there is often a tension between respect for procedure and the ethics of liberalism. Klein points out that there was great respect for legal procedure at moments in American history that heavily undermined ethical liberalism, such as when southerners filibustered anti-lynching laws. Indeed, the justification for things such as the Fugitive Slave Act was respect for the political independence of the legal right to property in slaves. All the examples of procedure being applied in politically biased and illiberal ways given a moment ago support this point. There is nothing in the notion that legal and electoral procedures are respected that guarantees those procedures will respect liberal principles of justice.

I remain agnostic as to whether we should be more pessimistic about the prospects for democracy in America today than at any other point in American history. However, at the very least, this debate reveals an impossible trinity, akin to the impossible trinity in monetary policy, between these three dimensions of democracy. If you hold majority opinion as primary, that includes populist urges to undermine the rule of law. Further, enough ink has been spilled on the tensions between majoritarianism and liberalism or effective policy. If you hold respect for procedure as primary, that includes the continuation of procedures which are discriminatory and unjust, as well as procedures which restrict and undermine majority opinion. If you hold the justice of liberalism as primary, that will generate a tendency for morally virtuous liberals to want to undermine inequitable, unjust procedures and electoral outcomes, and to want to restrict the ability of majorities to undermine minority rights.

The best a conventional democrat can do, it seems to me, is to pick two. A heavily majoritarian democracy where procedures are respected, which seems to be the dominant practice in American political history, is unlikely to be very ethically liberal. An ethically liberal and highly procedural government, something like a theoretically possible but practically unfeasible liberal dictator or perhaps a technocratic epistocracy (for which Jason Brennan argues), is a possible option but might be unstable if majorities see it as illegitimate or ethically unpalatable to procedural democrats. An ethically liberal but majoritarian democracy seems unworkable, given the dangers of populism to undermine minority rights and the rational ignorance and irrationality of voters. This option also seems to be what most western democracies are currently trending towards, which rightly worries Mounk since it is also likely to be extremely unstable. But if there’s a lesson to be learned from the injustice of American history and the rise of populism in the west it’s that choosing all three is not likely to be feasible over the long term.

The Dictator’s Handbook

I recently pointed you towards a book that has turned out to be a compelling and interesting read.

At the end of the day, it’s a straightforward application of public choice theory and evolutionary thinking to questions of power. Easy to understand theory is bundled with data and anecdotes* to elucidate the incentives facing dictators, democrats, executives, and public administrators. The differences between them are not discrete: they all face the same basic problem of compelling others’ behavior while facing some threat of replacement.

Nobody rules alone, so staying in power means keeping the right people happy and/or afraid. All leaders are constrained by their underlings. These underlings are necessary to get anything done, but they’re also potential rivals. For Bueno de Mesquita and Smith the crucial facts of a political order are a) how big a coalition (how many underlings) the ruler is beholden to and b) how replaceable the members of that coalition are.

The difference between liberal and illiberal orders boils down to differences in those two parameters. In democracies, with a larger coalition and less replaceable coalition members, rulers behave better.

 

I got a Calculus of Consent flavor from Dictator’s Handbook. At the end of the day, collective decision making will reflect some version of “the will of the people… who matter.” But when we ask about the number of people who matter, we run into C of C thinking. Calling for bigger coalitions is another way of calling for something approaching an effective unanimity rule (at least at the constitutional stage).

In C of C the question of the optimal voting rule (majority vs. super majority vs. unanimity) boils down to a tradeoff between the costs of organizing and the costs of externalities imposed by the ruling coalition. On the graph below (from C of C) we’re comparing organization costs (J) against externality costs (I) (the net costs of the winning coalition’s inefficient policies). The idea is that a unanimity rule would prevent tyranny of the majority (i.e. I is downward sloping), but that doesn’t mean unanimity is the optimal voting rule.
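That tradeoff can be made concrete with a toy model. This is my own sketch, not Buchanan and Tullock’s: the cost curves below are invented illustrative functions, chosen only so that external costs (I) fall with the required share of voters while organization costs (J) rise steeply toward unanimity.

```python
def external_costs(k):
    """Costs a winning coalition can impose on outsiders (the I curve).
    Falls as the required voting share k (0..1) grows; zero at unanimity,
    since then nobody can have costs imposed on them without consent."""
    return 100 * (1 - k) ** 2

def organization_costs(k):
    """Costs of assembling agreement (the J curve).
    Rises steeply as the rule approaches unanimity."""
    return 100 * k ** 3

def optimal_rule(step=0.01):
    """Search over voting shares for the one minimizing total costs."""
    shares = [i * step for i in range(int(1 / step) + 1)]
    return min(shares, key=lambda k: external_costs(k) + organization_costs(k))

print(f"cost-minimizing voting share: {optimal_rule():.2f}")
```

With these made-up curves the minimum lands at a supermajority short of unanimity, which is the book’s point: unanimity eliminates external costs but is not thereby the optimal rule once organizing costs are counted.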

Figure 18 (from The Calculus of Consent).

But instead of asking “what’s efficient?” let’s think about what we can afford out of society’s production, then ask who makes what decisions. In a loose sense, we can think of a horizontal line on the graph above representing our level of wealth. If we** aren’t wealthy enough to organize, then the elites rule and maximize rent extraction. We can’t get far up J, so whichever coalition is able to rule imposes external costs at a high level on I.

But I’s height is a function of rent extraction. Rulers face the classic conundrum: take a smaller piece of a larger pie, or a larger piece of a smaller one.

The book confirms what we already know: when one group can make decisions about what other groups can or must do, expect a negative-sum game. But by throwing in evolutionary thinking it sheds light on why we see neither an inexorable march of progress nor universal tyranny and misery.

As you travel back in time, people (on average) tend to look more ignorant, cruel, and superstitious. The “default state” of humanity is poverty and ignorance. The key to understanding economics is realizing that we’ve bootstrapped ourselves out of that position and we aren’t done yet.

The Dictator’s Handbook helped me realize that I’d been forgetting that the “default state” of political power is rule by force. The liberalization we’ve seen over the last 500 years has been just the first part of a bootstrapping process.

Understanding the starting point makes it clear that more inclusive systems use ideas, institutions, capital, and technology to abstract upward to more complex levels. Something like martial honor scales up the exercise of power from the tribe (whom the Chief can beat up) to the fiefdom (now the Chief has sub-chiefs). Ideology and identity can tie fiefdoms into nation-states (now we’ve got a king and nobility). Wealth plus new ideologies create more inclusive and democratic political orders (now we’ve got a president and political parties). But each stage is built on the foundation set before. We stand on the shoulders of giants, but those giants were propped up by the non-giants around them.

Our world was built by backwards savages. The good news is that we can use the flimsier parts of the social structure we inherited as scaffolding for something better (while maintaining the really good stuff). What exactly this means is the tricky question. Which rules, traditions, organizations, and processes are worth keeping? How do we maintain those? How/when do we replace the rest? And what does “we” even mean?

Changing the world involves uncertainty. There are complex interrelations between every part of reality. And the knowledge society needs is scattered through many different minds. To make society better, we need buy-in from our neighbors (nobody rules alone). And we need to realize that the force we exert will be countered not by an equal and opposite force, but by some plural, imperfectly identifiable, maybe-but-probably-not-equal, and only-mostly-opposite forces. There are complex and constantly shifting balances between different coalitions vying for power and the non-coalitions that might suddenly spring into action if conditions are right. Understanding the forces at play helps us see the constraints to political change.

And there’s good news: it is possible to create a ruling coalition that is more inclusive. The conditions have to be right. But at least some of those conditions are malleable. If we can sell people on the right ideas, we can push the world in the right direction. But we have to work at it, because there are plenty of people pitching ideas that will concentrate power and create illiberal outcomes.


*I read the audiobook, so I’m basically unable to vouch for the data analysis. Everything they said matched the arguments they were making, but without seeing it laid out on the page I couldn’t tell you whether what they left out was reasonable.

**Whatever that means…

Adam Smith: a historical detective?


Adrian Blau at King’s College London has an ongoing project of making methods in political theory more useful, transparent, and instructive, especially for students interested in historical scholarship.

I found his methods lecture, which he gave to Master’s students and went on to publish as ‘History of political thought as detective work’, particularly helpful for formulating my approach to political theory. The advantage of Blau’s advice is that it avoids pairing technique with theory. You can be a Marxist, a Straussian, a contextualist, anything or nothing, and still apply Blau’s technique.

Blau suggests that we adopt the persona of a detective when trying to understand the meaning of historical texts. That is, we should acknowledge

  • uncertainty associated with our claims
  • that facts of the matter will almost certainly be under-determined by the available evidence
  • that conflicting evidence probably exists for any interesting question
  • that interpreting any piece of evidence through any exclusive theoretical lens is likely to lead us to error

To make more compelling inferences in the face of these challenges, we can use techniques of triangulation (using independent sources of evidence together). This could include arguing for an interpretation of a thinker’s argument based on a close reading of their text, while showing that other people in the thinker’s social milieu deployed language in a similar way (contextual), and also showing how helpful that argument was for achieving a political end that was salient in that time and place (motivation).
