It’s been a heck of a year. Thanks for plugging along with Notes On Liberty. Like the world around me, NOL keeps getting better and better. Traffic in 2019 came from all over the place, but the usual suspects didn’t disappoint: the United States, United Kingdom, Canada, India, and Australia (in that order) supplied the most readers, again.
As for the most popular posts, I’ll list the top 10 below, but such a list doesn’t do justice to NOL or the Notewriters’ contribution to the Great Conversation, nor does it reflect the fact that some of NOL’s classic pieces from years ago were popular again.
Nick’s “One weird old tax could slash wealth inequality (NIMBYs, don’t click!)” was in the top ten for most of this year, and his posts on John Rawls, The Joker film, Dominic Cummings, and the UK’s pornographer & puritan coalition are all worth reading again (and again). The Financial Times, RealClearPolicy, 3 Quarks Daily, and RealClearWorld all featured Nick’s stuff throughout 2019.
Joakim had a banner year at NOL, and four of his posts made the top 10. He got love from the left, the right, and everything in between this year. “Elite Anxiety: Paul Collier’s ‘Future of Capitalism’” (#9), “In Defense of Not Having a Clue” (#8), and “You’re Not Worth My Time” (#7) all caused havoc on the internet and in coffee shops around the world. Joakim’s piece on Mr Darcy from Pride and Prejudice (#2) broke – no, shattered – NOL’s records. Beyond that, he also had excellent stuff on financial history, Richard Davies, and Nassim Taleb, and he is beginning to bud as a cultural commentator, as you can probably tell from his sporadic notes on opinions. Joakim wants a more rational, more internationalist, and more skeptical world to live in. He’s doing everything he can to make that happen. And don’t forget this one: “Economists, Economic History, and Theory.”
Tridivesh had an excellent third year at NOL. His most popular piece was “Italy and the Belt and Road Initiative,” and most of his other notes have been featured on RealClearWorld’s front page. Tridivesh has also been working with me behind the scenes to unveil a new feature at NOL in 2020, and I couldn’t be more humbled about working with him.
Bill had a slower year here at NOL, as he’s been working in the real world, but he still managed to put out some bangers. “Epistemological anarchism to anarchism” kicked off a Feyerabendian buzz at NOL, and he put together well-argued pieces on psychedelics, abortion, and the alt-right. His short 2017 note on left-libertarianism has quietly become a NOL classic.
Mary had a phenomenal year at NOL, which was capped off with some love from RealClearPolicy for her “Contempt for Capitalism” piece. She kicked off the year with a sharp piece on semiotics in national dialogue, before producing a four-part essay on bourgeois culture. Mary also savaged privileged hypocrisy and took a cultural tour through the early 20th century. Oh, and she did all this while doing doctoral work at Oxford. I can’t wait to see what she comes up with in 2020.
Aris’ debut year at NOL was phenomenal. Reread “Rawls, Antigone and the tragic irony of norms” and you’ll know what I’m talking about. I am looking forward to Dr Trantidis’ first full year at NOL in 2020.
Rick continues to be my favorite blogger. His pieces on pollution taxes (here and here) stirred up the libertarian faithful, and he is at his Niskanenian best on bullshit jobs and property rights. His notes on Paul Feyerabend, which I hope he’ll continue throughout 2020, were the centerpiece of NOL’s spontaneity this year.
Vincent only had two posts at NOL in 2019, but boy were they good: “Interwar US inequality data are deeply flawed” and “Not all GDP measurement errors are greater than zero!” Dr Geloso focused most of his time on publishing academic work.
Alexander instituted the “Sunday Poetry” series at NOL this year and I couldn’t be happier about it. I look forward to reading NOL every day, but especially on Sundays now thanks to his new series. Alex also put out the popular essay “Libertarianism and Neoliberalism – A difference that matters?” (#10), which I suspect will one day grow to be a classic. That wasn’t all. Alex was the author of a number of my personal faves at NOL this year, including pieces about the Austro-Hungarian Empire, constructivism in international relations (part 1 and part 2), and some of the more difficult challenges facing diplomacy today.
Edwin ground out a number of posts in 2019 and, true to character, they challenged orthodoxy and widely-held (by libertarians) opinions. He said “no” to military intervention in Venezuela, though not for the reasons you may think, and argued that free immigration cannot be classified as a right under classical liberalism. He also poured cold water on Hong Kong’s protests and recommended some good reads on various topics (namely, Robert Nozick and The Troubles). Edwin has several essays on liberalism at NOL that are now bona fide classics.
Federico produced a number of longform essays this year, including “Institutions, Machines, and Complex Orders” and “Three Lessons on Institutions and Incentives” (the latter went on to be featured in the Financial Times and led to at least one formal talk on the subject in Buenos Aires). He also contributed to NOL’s longstanding position as a bulwark against libertarian dogma with “There is no such thing as a sunk cost fallacy.”
Jacques had a number of hits this year, including “Poverty Under Democratic Socialism” and “Mass shootings in perspective.” His notes on the problems with higher education, aka the university system, also garnered plenty of eyeballs.
Michelangelo, Lode, Zak, and Shree were all working on their PhDs this year, so we didn’t hear from them much, if at all. Hopefully, 2020 will give them a bit more freedom to expand their thoughts. Lucas was not able to contribute anything this year either, but I am confident that 2020 will be the year he reenters the public fray.
Mark spent the year promoting his new book (co-authored by Noel Johnson) Persecution & Toleration. Out of this work arose one of the more popular posts at NOL earlier in the year: “The Institutional Foundations of Antisemitism.” Hopefully Mark will have a little less on his plate in 2020, so he can hang out at NOL more often.
Derrill’s “Romance Econometrics” generated buzz in the left-wing econ blogosphere, and his “Watson my mind today” series began to take flight in 2019. Dr Watson is a true teacher, and I am hoping 2020 is the year he can start dedicating more time to the NOL project, first with his “Watson my mind today” series and second with more insights into thinking like an economist.
Kevin’s “Hyperinflation and trust in ancient Rome” (#6) took the internet by storm, and his 2017 posts on paradoxical geniuses and the deleted slavery clause in the US constitution both received renewed and much deserved interest. But it was his “The Myth of the Nazi War Machine” (#1) that catapulted NOL into its best year yet. I have no idea what Kevin will write about in 2020, but I do know that it’ll be great stuff.
Bruno, one of NOL’s most consistent bloggers and one of its two representatives from Brazil, did not disappoint. His “Liberalism in International Relations” did exceptionally well, as did his post on the differences between conservatives, liberals, and libertarians. Bruno also pitched in on Brazilian politics and Christianity as a global and political phenomenon. His postmodernism posts from years past continue to do well.
Andrei, after several years of gentle prodding, finally got on the board at NOL and his thoughts on Foucault and his libertarian temptation late in life (#5) did much better than predicted. I am hoping to get him more involved in 2020. You can do your part by engaging him in the ‘comments’ threads.
Chhay Lin kept us all abreast of the situation in Hong Kong this year. Ash homed in on housing economics, Barry chimed in on EU elections, and Adrián teased us all in January with his “Selective Moral Argumentation.” Hopefully these four can find a way to fire on all cylinders at NOL in 2020, because they have a lot of cool stuff on their minds (including, but not limited to, bitcoin, language, elections in dictatorships, literature, and YIMBYism).
Ethan crushed it this year, with most of his posts ending up on the front page of RealClearPolicy. More importantly, though, was his commitment to the Tocquevillian idea that lawyers are responsible for education in democratic societies. For that, I am grateful, and I hope he can continue the pace he set during the first half of the year. His most popular piece, by the way, was “Spaghetti Monsters and Free Exercise.” Read it again!
I had a good year here, too. My pieces on federation (#3) and American literature (#4) did waaaaaay better than expected, and my nightcaps continue to pick up readers and push the conversation. I launched the “Be Our Guest” feature here at NOL, too, and it has been a mild success.
Thank you, readers, for a great 2019 and I hope you stick around for what’s in store during 2020. It might be good, it might be bad, and it might be ugly, but isn’t that what spontaneous thoughts on a humble creed are all about? Keep leaving comments, too. The conversation can’t move (forward or backward) without your voice.
I had a mentor at BYU, Prof. James McDonald, who tried to convince us that
- Econometrics is Fun.
- Econometrics is Easy.
- Econometrics is Your Friend.
One of his classes made a bronze plaque out of it for him. He also tried to convince us that Economics is Romantic because this one guy took a girl to his class on a date and she married him anyway. Because he was one of the economists I’ve tried to model my life after, I’ve always been on the lookout for ways to convince people that econometrics is, in fact, fun, friendly, easy, and romantic.
A while back, Bill Easterly blogged about how searching for a spouse is like development, and in the process showed just how unromantic economists can be:
I recently helped one of my single male graduate students in his search for a spouse.
First, I suggested he conduct a randomized controlled trial of potential mates to identify the one with the best benefit/cost ratio. Unfortunately, all the women randomly selected for the study refused assignment to either the treatment or control groups, using language that does not usually enter academic discourse.
With the “gold standard” methods unavailable, I next recommended an econometric regression approach. He looked for data on a large sample of married women on various inputs (intelligence, beauty, education, family background, did they take a bath every day), as well as on output: marital happiness. Then he ran an econometric regression of output on inputs. Finally, he gathered data on available single women on all the characteristics in the econometric study. He made an out-of-sample prediction of predicted marital happiness. He visited the lucky woman who had the best predicted value in the entire singles sample, explained to her how he calculated her nuptial fitness, and suggested they get married. She called the police.
He goes on from there to describe how he eventually did find a mate, and draws a comparison with development and over-reliance on econometric methods. As popular as it is in libertarian circles to bash econometrics, I’d like to defend empirics by pointing out that his regression advice was simply not sound:
1 – The suitor’s regressions ignored self-selection bias. Regressions only tell us what the ‘average’ effects are, that is, the effect for the ‘average’ person. Making the average guy happy is only relevant if he is the average guy. Economists being the strange lot we are, it likely takes a special kind of person to marry one of us. He ought to have found a bunch of guys very similar to himself and examined the qualities that made a difference among (and this is key) the population of women willing to marry guys like him – the women who self-select into our group. If he then approached a woman who was not in that group, no wonder he was rejected! I knew I had my work cut out for me since junior high: a Latter-day Saint economist-in-embryo who read Shakespeare “in the original Klingon” and carried a briefcase to school? Small sample sizes indeed!
2 – He ignored endogeneity. Instead of trying to convince her that research showed she would make him happy, he needed to present research demonstrating that he would make her happy – the other half of the regression: marital happiness regressed on male qualities. No wonder she rejected him: his regressions didn’t answer her question!
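The self-selection point can be illustrated with a small simulation. This is a purely hypothetical sketch with invented numbers (the trait, groups, and coefficients are my own stand-ins): happiness responds to a compatibility trait much more strongly inside the self-selected group of women willing to marry guys like him, so a regression on the whole married population misstates the slope that actually matters for his decision.

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented population: a compatibility trait and a self-selected group
# (only high-trait women would ever marry "guys like him").
n = 10_000
trait = rng.normal(size=n)
in_group = trait > 0.5

# Happiness responds strongly to the trait inside the group,
# weakly outside it (all coefficients made up for illustration).
happiness = np.where(in_group, 2.0 * trait, 0.2 * trait) + rng.normal(size=n)

# Naive regression on the whole sample recovers an 'average' slope...
slope_all = np.polyfit(trait, happiness, 1)[0]
# ...while the slope in the relevant self-selected group is much larger.
slope_group = np.polyfit(trait[in_group], happiness[in_group], 1)[0]

print(f"slope, full sample:         {slope_all:.2f}")
print(f"slope, self-selected group: {slope_group:.2f}")
```

With these made-up numbers, the full-sample slope understates the in-group slope by roughly half, which is exactly the suitor’s mistake: predicting from the wrong population.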
Personally, I took more of a Bayesian approach. Bayesians believe that a lot of things in life (like regression coefficients) are random and over time we get better and better signals about where the truth is, but we only ever approach it by degrees. First, by trying to become a friend, I identified if a woman was in the group of people who might marry someone like me. Each interaction gave me more information about the error term and the regression coefficients about fostering a happy, loving friendship that could endure. After any failed relationship, I had a new variable or two to add to my equations and I understood the ‘relationships’ between relationship variables better. That might be about finding out different things I needed (hunh, so her political affiliation isn’t as important as I thought and her willingness to smile at me is vital) or about learning more and better policies over time that I could enact to make her happier (tips for being a better listener or learn to identify her love languages and feed them to her regularly).
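That updating process can be sketched as a toy Bayesian model. This is purely illustrative (the Beta-Bernoulli setup and the signal values are my own stand-ins, not anything from the post): treat each interaction as a noisy yes/no signal of compatibility, and watch the posterior sharpen by degrees.

```python
from fractions import Fraction

# Flat Beta(1, 1) prior on "compatibility"; each interaction is a
# Bernoulli signal (1 = good interaction, 0 = bad). Values invented.
alpha, beta = 1, 1
signals = [1, 1, 0, 1, 1, 1, 0, 1]

for s in signals:
    # Standard Beta-Bernoulli conjugate update
    alpha += s
    beta += 1 - s

posterior_mean = Fraction(alpha, alpha + beta)
print(f"posterior mean compatibility: {float(posterior_mean):.2f}")
# Each additional signal moves the estimate by less: we only ever
# approach the truth by degrees, as the paragraph above describes.
```

Note how a failed relationship here is just another observation: it shifts the posterior rather than restarting it, which is the spirit of adding “a new variable or two” after each one.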
One of the most important regression-related romance tips I learned was to control the variables I could control, and leave the residual in God’s hands. I recall a graduate labor economics research seminar where the presenter claimed that the marriage market always cleared. I complained that I was willing to supply a great deal more marriage than had ever been demanded at prevailing prices. I was reassured that the marriage market clears in equilibrium, and I might not have found my equilibrium yet. The presenter’s prediction was, thankfully, prescient: I found a buyer a year later, and last week we celebrated 5250 days of married bliss.
Today, I’m reviving an old series I attempted to start last year that never came to fruition: the midweek reader, a micro-blogging series in which I link to stories that are related to each other to provide deeper insight into an issue. This week, we’re looking at the relationship between the opioid crisis and the drug war, and at the academic debate around a controversial paper finding moral hazard in policies that try to increase access to Naloxone.
- At Harper’s Magazine, Brian Gladstone has a fantastic long-form piece looking into how attempts to crack down on opioid addiction by targeting prescription pain meds have left many patients behind, and questioning the mainstream narrative that the rise of opioids was driven primarily by pain prescriptions. A slice:
Yet even the most basic elements of this disaster remain unclear. For while it’s true that the past three decades saw a staggering upsurge in the prescribing of opioid medication, this trend peaked in 2010 and has been declining since: high-dose prescriptions fell by 41 percent between 2010 and 2015. The question, then, is why overdose deaths continue to skyrocket, rising 37 percent over the same period — and whether restricting access to regulated drugs is actually pushing people toward more lethal, unregulated ones, such as fentanyl, heroin, and carfentanil, a synthetic opioid 10,000 times stronger than morphine.
- Similarly, at the Cato Institute, Jeffrey A. Singer has a good piece exploring the relationship between America’s War on Drugs and the rise of opioid addiction. He concludes:
Meanwhile, President Trump and most state and local policymakers remain stuck on the misguided notion that the way to stem the overdose rate is to clamp down on the number and dose of opioids that doctors can prescribe to their patients in pain, and to curtail opioid production by the nation’s pharmaceutical manufacturers. And while patients are made to suffer needlessly as doctors, fearing a visit from a DEA agent, are cutting them off from relief, the overdose rate continues to climb.
- At Vox, philosopher Brendan de Kenessey of Harvard has a piece exploring the philosophy of the self and of rational choice to argue that it’s wrong to treat drug addiction as a moral failure. A slice:
We tend to view addiction as a moral failure because we are in the grip of a simple but misleading answer to one of the oldest questions of philosophy: Do people always do what they think is best? In other words, do our actions always reflect our beliefs and values? When someone with addiction chooses to take drugs, does this show us what she truly cares about — or might something more complicated be going on?
- An econometrics working paper released earlier this month by Jennifer L. Doleac of the University of Virginia and Anita Mukherjee of the University of Wisconsin, which sparked spirited discussion, investigated the link between opioids and laws increasing access to Naloxone. They found the laws increased measures of opioid use but did not reduce mortality, which they theorize is because Naloxone creates moral hazard for addicts by reducing the potential costs of an overdose. However, they conclude:
Our findings do not necessarily imply that we should stop making Naloxone available to individuals suffering from opioid addiction, or those who are at risk of overdose. They do imply that the public health community should acknowledge and prepare for the behavioral effects we find here. Our results show that broad Naloxone access may be limited in its ability to reduce the epidemic’s death toll because not only does it not address the root causes of addiction, but it may exacerbate them. Looking forward, our results suggest that Naloxone’s effects may depend on the availability of local drug treatment: when treatment is available to people who need help overcoming their addiction, broad Naloxone access results in more beneficial effects. Increasing access to drug treatment, then, might be a necessary complement to Naloxone access in curbing the opioid overdose epidemic.
- Alex Gertner, a PhD candidate at UNC-Chapel Hill, published a criticism of Doleac and Mukherjee at Vox, pointing out that their data linking Naloxone and opioid-related hospital visits are not necessarily due to a causal story involving moral hazard:
The authors find that naloxone access laws lead to more opioid-related emergency department visits, the premise being that naloxone access laws increase opioid overdoses. But there’s a far more likely explanation: People are generally instructed to seek medical care for overdose after receiving naloxone.
Overdose is a general term to describe experiencing the toxic effects of drugs. People can overdose, and often do, without either dying or seeking medical attention. If people who would otherwise overdose without medical attention are instead using naloxone and going to emergency rooms, that’s a good thing.
- The widest-ranging and most thorough critique of Doleac and Mukherjee comes from Frank, Pollack, and Humphreys at Health Affairs. They argue that the original authors (1) assume more immediacy in the effect of changes in Naloxone laws than is probably warranted, and (2) ignore a variety of exogenous variables, like Medicaid expansion. They conclude:
We believe the best interpretation of Doleac and Mukherjee’s findings is that their main treatment variable—naloxone laws—thus far have had little impact on naloxone use or nonmedical opioid use during the period studied. This disappointing pattern commands attention and follow-up from both public health practitioners and public health researchers.
Recently, I stumbled on this piece in the Chronicle by Jerry Muller. It made my blood boil. In it, the author argues that, in the world of education, we are fixated on quantitative indicators of performance, and that this fixation has led us to miss (or forget) some important truths about education and the transmission of knowledge. I wholeheartedly disagree, because the author is conflating two things.
We need to measure things! Measurement is crucial to our understanding of causal relations and outcomes. Like Diane Coyle, I am a big fan of a “dashboard” of indicators to get an idea of what is broadly happening. However, I agree with the author that statistics very often lose their meaning entirely. And that’s when we start targeting them!
Once a variable becomes a target, we act in ways that increase that variable. As soon as it is selected, we modify our behavior to hit the fixed target, and the variable loses some of its meaning. This is known as Goodhart’s law, whereby “when a measure becomes a target, it ceases to be a good measure” (note: it also looks a lot like the Lucas critique).
Although Goodhart made this point in the context of monetary policy, it applies to any sphere of policy – including education. When an education department decides that this is the metric it cares about (e.g. completion rates, minority admissions, average grade point, completion times, balanced curricula, ratio of professors to pupils, etc.), it induces a change in behavior that alters the significance carried by that variable. This is not an original point. Just go to Google Scholar and type “Goodhart’s law and education” and you end up with papers such as these two (here and here) that make exactly the point I am making here.
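The mechanism is easy to simulate. This is a minimal, made-up sketch (the numbers and the “completion rate” stand-in metric are invented): before targeting, the metric is a noisy but informative signal of true quality; once funding hangs on it, departments push the metric up regardless of quality, and the metric-quality correlation collapses.

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented data: 500 departments with an unobserved "true quality".
n = 500
quality = rng.normal(size=n)

# Before targeting: the metric (say, completion rate) tracks quality plus noise.
metric_before = quality + rng.normal(scale=0.5, size=n)

# After targeting: everyone games the metric toward the benchmark,
# nearly independently of underlying quality.
metric_after = 0.05 * quality + rng.normal(scale=0.5, size=n) + 2.0

corr_before = np.corrcoef(quality, metric_before)[0, 1]
corr_after = np.corrcoef(quality, metric_after)[0, 1]
print(f"correlation with quality, before targeting: {corr_before:.2f}")
print(f"correlation with quality, after targeting:  {corr_after:.2f}")
```

The metric rises across the board after targeting, yet it now tells us almost nothing about quality – Goodhart’s law in two dozen lines.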
In his Chronicle piece, Muller actually makes note of this without realizing how important it is. He notes that “what the advocates of greater accountability metrics overlook is how the increasing cost of college is due in part to the expanding cadres of administrators, many of whom are required to comply with government mandates” (emphasis mine).
The problem he is complaining about is not metrics per se, but rather the effect of having policy-makers decide which metric is relevant. This is a problem of selection (which metric gets targeted), not of measurement. If statistics are collected without the intent of serving as a benchmark for the attribution of funds or special privileges (i.e. if there are no incentives to change the behavior that affects the reporting of a particular statistic), then there is no problem.
I understand that complaining about a “tyranny of metrics” is fashionable, but in that case the fashion looks like crocs (and I really hate crocs) with white socks.
Yesterday, here at Notes on Liberty, Nicolas Cachanosky blogged about the minimum wage. His point was fairly simple: criticisms of certain research designs that use limited samples can be economically irrelevant.
To put you in context, he was blogging about one of the criticisms made of the Seattle minimum wage study produced by researchers at the University of Washington, namely that the sample was limited to “small” employers. This criticism, Nicolas argues, is irrelevant, since the researchers were looking for those likely to be most heavily affected by the minimum wage increase: the effects will be heavily concentrated among the least efficient firms. In other words, what is the point of looking at Costco or Walmart, which are more likely to survive than Uncle Joe’s store? That is Nicolas’ point in defense of the study.
I disagree with Nicolas here, and this is because I agree with him (I know, that sounds confusing, but bear with me).
The reason is simple: firms react differently to the same shock. Costs are costs and productivity is productivity, but the constraints are never exactly the same. For example, if I am a small employer with two employees and the minimum wage increases 15%, why would I fire one of them to adjust? If that were my reaction, I would sacrifice 33% of my output for a 15% increase in wages, which make up most but not all of my costs. Using that margin of adjustment would be insensible given the constraint of my firm’s size. I might instead be tempted to cut hours, cut benefits, cut quality, substitute between workers, or raise prices (depending on the elasticity of demand for my services). However, if I am a large firm of 10,000 employees, sacking one worker is an easy margin of adjustment, since I am not constrained as much as the small firm is. In that situation, a large firm might adjust on that margin rather than cut quality or raise prices. Basically, firms respond to higher labor costs (not accompanied by greater productivity) in different ways.
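The lumpiness behind that intuition can be sketched with invented numbers: if firing whole workers is the only available margin, the smallest headcount cut that offsets a 15% wage hike is wildly disproportionate for a 2-person shop but almost exactly proportional for a 10,000-person firm (the wage and hike figures below are made up for illustration).

```python
import math

WAGE = 30_000   # hypothetical annual minimum-wage salary
HIKE = 0.15     # 15% increase

def headcount_cut_share(workers, wage=WAGE, hike=HIKE):
    """Smallest share of the workforce that must be fired to offset the
    higher wage bill, when firing whole workers is the only margin."""
    extra_cost = workers * wage * hike
    # Each worker fired saves the new, higher wage.
    fired = math.ceil(extra_cost / (wage * (1 + hike)))
    return fired / workers

for n in (2, 10_000):
    print(f"{n:>6} workers: fire {headcount_cut_share(n):.1%} of the workforce")
```

With these made-up numbers, the 2-person shop must fire half its workforce to offset the hike, while the 10,000-person firm fires about 13% – which is why the small firm reaches for hours, benefits, quality, or prices instead.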
By concentrating on small firms, the authors of the Seattle study were concentrating on a group with a more homogeneous set of constraints and responses. In their case, they were looking at hours worked. Had they blended in the larger firms, they would have mixed in firms that adjust less by compressing hours and more by shedding workers.
This is why the UW study is so interesting in terms of research design: it focused like a laser on one adjustment channel in the group most likely to respond in that manner. If one reads the paper attentively, it is clear that this was the authors’ aim – to better document this element of the minimum wage literature. If one seeks to measure exhaustively the costs of the policy, one would need a much wider research design to reflect the wide array of adjustments available to employers (and workers).
In short, Nicolas is right that research designs matter, but wrong about why the criticism fails: it is really an instance of pro-minimum-wage-hike pundits shooting the hockey puck into their own net!
I see my craft as an economic historian as a dual mission. The first is to answer historical questions by using economic theory (and, in the process, to enliven economic theory through the use of history). The second relates to my obsessive-compulsive nature, which can be observed in how much attention and care I give to getting the data right. My co-authors have often observed me “freaking out” over a possible improvement in data quality or being plagued by doubts over whether or not I had gone “one assumption too far” (a pun on A Bridge Too Far). Sometimes I wish more economists would follow my historian-like freakouts over data quality. Why?
Because of this!
In that paper, Michael Clemens (whom I secretly admire – not so secretly now that I have written it on a blog) criticizes the recent paper by George Borjas showing a negative effect of immigration on the wages of workers without a high school degree. Using the famous Mariel boatlift of 1980, Clemens shows that, at the same time as the boatlift, changes at the US Census Bureau added more black workers without high school degrees to the survey. This previously underrepresented group surged in importance within the survey data. Since that group had lower wages than the average of the wider group of workers without high school degrees, there was a composition effect at play that caused wages to fall (in appearance). But a composition effect is a bias: the artificial drop in wages drove the results produced by Borjas (and unduly cast doubt on the conclusion of David Card’s original paper, to which Borjas was replying).
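The composition effect is easy to see with made-up numbers: leave every individual wage unchanged and only expand the survey’s coverage of the lower-wage subgroup, and the measured average wage still falls. (The wages and coverage shares below are invented for illustration, not taken from the papers.)

```python
# Invented average wages for two subgroups of workers without
# high school degrees. No individual wage ever changes.
wage_group_a = 12.0   # previously well-covered workers
wage_group_b = 8.0    # previously under-covered workers

def observed_average(share_b):
    """Survey average wage when group B makes up share_b of the sample."""
    return (1 - share_b) * wage_group_a + share_b * wage_group_b

before = observed_average(0.10)   # group B underrepresented
after = observed_average(0.30)    # survey coverage improves

print(f"measured average before: ${before:.2f}")
print(f"measured average after:  ${after:.2f}")
# The 'drop' is pure composition: both groups' wages are unchanged.
```

A regression run on the pooled survey would read this as a wage decline, which is precisely the bias Clemens identifies.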
This is a cautionary tale about the limits of econometrics. After all, a regression is only as good as the data it uses and its suitability to the question it seeks to answer. Sometimes simple Ordinary Least Squares is an excellent tool. When the question is broad and/or the data are excellent, OLS can be enough for a credible answer. However, the narrower the question (e.g. is there an effect of immigration only on unskilled, low-education workers?), the better the method has to be. The problem is that better methods often require better data as well. To obtain the latter, one must know the details of a data source. This is why I am nuts over data accuracy. Even small things – like a shift in the representation of black workers in survey data – matter in these cases. Otherwise, you end up with your results being reversed by very minor changes (see this paper in the Journal of Economic Methodology for examples).
This is why I freak out over data. Maybe I can make two suggestions about sharing my freak-outs.
The first is to prefer a skewed ratio of data quality to advanced methods (i.e. simple methods applied to crazy-good data). This reduces the chances of being criticized for relying on weak assumptions. The second is to take a leaf out of the historians’ book. While historians are often averse to advanced data techniques (I remember a case where I had to explain panel data regressions to historians, which ended terribly for me), they are very respectful of data sources. I have seen historians nurture datasets for years before being willing to present them. When published, those datasets generally stand up to scrutiny because of the extensive wealth of detail compiled.
That’s it, folks.
I admit to being a happy man. While I am in general a smiling sort of fellow, I was delightfully giggling with joy upon hearing that another economic historian (and a fellow Canadian from the LSE to boot), Dave Donaldson, won the John Bates Clark medal. I dare say that it was about time. Nonetheless I think it is time to talk to economists about how to do economic history (and why more should do it). Basically, I argue that the necessities of the trade require a longer period of maturation and a considerable amount of hard work. Yet, once the economic historian arrives at maturity, he produces long-lasting research which (in the words of Douglass North) uses history to bring theory to life.
Economic History is the Application of all Fields of Economics
Economics is a deductive science through which axiomatic statements about human behavior are derived. For example, stating that the demand curve is downward-sloping is an axiomatic statement. No economist ever needed to measure quantities and prices to say that if the price increases, all else being equal, the quantity will drop. As such, economic theory needs to be internally consistent (i.e. not argue that higher prices mean both smaller and greater quantities of goods consumed all else being equal).
However, the application of these axiomatic statements depends largely on the question asked. For example, I am currently doing work on the 19th century Canadian institution of seigneurial tenure. In that work, I question the role that seigneurial tenure played in hindering economic development. In the existing literature, the general argument is that the seigneurs (i.e. the landlords) hindered development by taxing (as per their legal rights) a large share of net agricultural output. This prevented the accumulation of savings which – in times of imperfect capital markets – were needed to finance investments in capital-intensive agriculture. That literature invokes one corpus of axiomatic statements, those of capital theory. For my part, I argue that the system – because of a series of monopoly rights – was actually a monopsony system through which the landlords restrained their demand for labor on the non-farm labor market and depressed wages. My argument invokes the corpus of axioms related to industrial organization and monopsony theory. Both explanations are internally consistent (there are no self-contradictions). Yet one must be more relevant to the question of whether or not the institution hindered growth, and one must square better with the observed facts.
And that is economic history properly done: it tries to determine which theory is relevant to the question asked. The purpose of economic history is thus to find which theories matter the most.
Take the case, again, of asymmetric information. The seminal work of Akerlof on the market for lemons laid out a consistent theory, but subsequent waves of research (notably my favorite here by Eric Bond) have shown that the stylized predictions of this theory rarely materialize. Why? Because the theory of signaling suggests that individuals will find ways to invest in a “signal” to solve the problem. These are two competing theories (signaling versus asymmetric information), and one seems to win over the other. An economic historian tries to sort out what mattered to a particular event.
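The unraveling logic of the lemons model can be sketched in a few lines (the specific numbers here are mine, chosen for illustration; Akerlof's paper is more general):

```latex
% Quality \theta is uniform on [0,1] and observed only by sellers.
% A seller of quality \theta accepts any price p \ge \theta;
% buyers value quality at 3\theta/2. At price p, only qualities
% \theta \le p are offered, so
E[\theta \mid \text{offered at } p] = \tfrac{p}{2},
\qquad
\text{buyers will pay at most } \tfrac{3}{2}\cdot\tfrac{p}{2} = \tfrac{3p}{4} < p.
% No positive price clears the market, even though every unit is worth
% more to buyers (3\theta/2) than to sellers (\theta): the market unravels.
```

Signaling reverses this result: if high-quality sellers can acquire a credible signal more cheaply than low-quality sellers can, trade is restored, which is why the empirical question of which theory prevails matters so much.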
Now, take these last few paragraphs and drop the words “economic historian” and replace them with “economist”. I believe that no economist would disagree with the definition of the tasks of the economist that I offered. So why would an economic historian be different? Everything that has happened is history, and every question about it must be answered by sifting for the theories that are relevant to the event studied (under the constraint that the theory be consistent). Every economist is an economic historian.
As such, the economic historian/economist must use advanced tools related to econometrics: synthetic controls, instrumental variables, proper identification strategies, vector autoregressions, cointegration, variance analysis, and everything else you can think of. He needs these tools to answer the questions he poses. The only difference is that the economic historian looks further back in the past.
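As a concrete illustration of one of those tools, here is a minimal simulation of the instrumental-variables idea (all numbers are invented for illustration; with a single instrument, two-stage least squares reduces to the Wald ratio of covariances):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50_000
beta_true = 2.0

z = rng.normal(size=n)                  # instrument: shifts x, unrelated to the error
u = rng.normal(size=n)                  # unobserved confounder
x = 0.8 * z + u + rng.normal(size=n)    # regressor, endogenous because of u
y = beta_true * x + u + rng.normal(size=n)

# Naive OLS: cov(x, u) != 0, so the estimate is biased upward.
cxy = np.cov(x, y)
beta_ols = cxy[0, 1] / cxy[0, 0]

# IV with one instrument: the Wald estimator, cov(z, y) / cov(z, x).
beta_iv = np.cov(z, y)[0, 1] / np.cov(z, x)[0, 1]

print(f"OLS: {beta_ols:.3f}")   # biased away from the true 2.0
print(f"IV:  {beta_iv:.3f}")    # close to the true 2.0
```

The instrument recovers the causal coefficient because it moves the regressor without moving the confounder, which is precisely the kind of identification strategy a historical natural experiment can supply.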
The problem with this systematic approach is the effort required of practitioners. There is a need to understand – intuitively – a wide body of literature on price theory, statistical theories and tools, accounting (for understanding national accounts), and political economy. This takes many years of training, and I can take my own case as an example. I force myself to read one scientific article that is outside my main fields of interest every week in order to create a mental repository of theoretical insights I can exploit. Since I entered university in 2006, I have been forcing myself to read theoretical books that were on the margin of my comfort zone. For example, University Economics by Allen and Alchian was one of my favorite discoveries, as it introduced me to the UCLA approach to price theory. It changed my way of understanding firms and the decisions they make. Then I read some works on Keynesian theory (I will confess that I have never been able to finish the General Theory), which made me more respectful of some core insights of that body of literature. In the process, I accumulated lists of key theoretical points the way one accumulates kitchen equipment.
This takes a lot of time, patience, and modesty towards one’s accumulated stock of knowledge. But these theories never meant anything to me without application to deeper questions. After all, debating the theory of price stickiness without actually asking if it mattered is akin to debating with theologians about the gender of angels (I vote that they are angels, and since these are fictitious, I don’t give a flying hoot’nanny). This is because I really buy into the claim made by Douglass North that theory is brought to life by history (and that history is explained by theory).
On the Practice of Economic History
So, how do we practice economic history? The first thing is to find questions that matter. The second is to invest time in collecting inputs for production.
While accumulating theoretical insights, I also kept lists of historical questions that were still debated. Basically, I have been making lists of research questions since I was an undergraduate student (not kidding here), and I keep every question on the list until I am satisfied with my answer and/or the subject has been convincingly resolved.
One of my criteria for selecting a question is that it must relate to an issue that is relevant to understanding why certain societies are where they are now. For example, I have been delving into the issue of the agricultural crisis in Canada during the early decades of the 19th century. Why? Because most historians attribute (wrongly, in my opinion) a key role to this crisis in the creation of the Canadian confederation, the migration of the French-Canadians to the United States, and the politics of Canada until today. Another debate that I have been involved in relates to the Quiet Revolution in Québec (see my book here), which is argued to be a watershed moment in the history of the province. According to many, it marked a breaking point when Quebec caught up dramatically with the rest of Canada (I disagreed and proposed that it actually slowed down a rapid convergence in the decade and a half that preceded it). I picked the question because the moment is central to all political narratives presently existing in Quebec, and every politician utters the words “Quiet Revolution” when given the chance.
In both cases, they mattered to understanding what Canada was and what it has become. I used theory to sort out what mattered and what did not matter. As such, I used theory to explain history and in the process I brought theory to life in a way that was relevant to readers (I hope). The key point is to use theory and history together to bring both to life! That is the craft of the economic historian.
The other difficulty (on top of selecting questions and understanding theories that may be relevant) for the economic historian is the time-consuming nature of data collection. Economic historians are basically monks (and in my case, I have both the shape and the haircut of Friar Tuck) who patiently collect and assemble new data for research. This is a high fixed cost of entering the trade. In my case, I spent two years in a religious congregation (literally with religious officials) collecting prices, wages, piece rates, and farm data to create a wide empirical portrait of the Canadian economy. This was a long and arduous process.
However, thanks to the lists of questions I had assembled by reading theory and history, I saw the many avenues of research I could generate by assembling data. Armed with some knowledge of what I could do, the data I collected suggested still other questions I could ask. Once I had finished my data collection (18 months), I had assembled a roadmap of twenty-something papers to answer a wide array of questions on Canadian economic history: was there an agricultural crisis; were French-Canadians the inefficient farmers they were portrayed to be; why did the British tolerate Catholic and French institutions when they conquered French Canada; did seigneurial tenure explain the poverty of French Canada; did the conquest of Canada matter to future growth; what was the role of free banking in stimulating growth in Canada; etc.
It is necessary for the economic historian to collect a ton of data and assemble a large base of theoretical knowledge to guide the data towards relevant questions. For those reasons, the economic historian takes a longer time to mature. It simply takes more time. Yet, once the maturation is over (I feel that mine is far from being over to be honest), you get scholars like Joel Mokyr, Deirdre McCloskey, Robert Fogel, Douglass North, Barry Weingast, Sheilagh Ogilvie and Ronald Coase (yes, I consider Coase to be an economic historian but that is for another post) who are able to produce on a wide-ranging set of topics with great depth and understanding.
The craft of the economic historian is one that requires a long period of apprenticeship (there is an inside joke here, sorry about that). It requires heavy investment in theoretical understanding beyond the main field of interest that must be complemented with a diligent accumulation of potential research questions to guide the efforts at data collection. Yet, in the end, it generates research that is likely to resonate with the wider public and impact our understanding of theory. History brings theory to life indeed!
This question comes from co-blogger Warren Gibson’s piece in Econ Journal Watch titled “The Mathematical Romance: An Engineer’s View of Mathematical Economics”:
Mathematics can be very alluring. Professional mathematicians speak frequently of “beauty” and “elegance” in their work. Some say that the central mystery of our universe is its governance by universal mathematical laws. Practitioners of applied math likewise feel special satisfaction when a well-crafted simulation successfully predicts real-world physical behavior.
But while the mathematicians, some of them at least, are explicit about doing math for its own sake, engineers are hired to produce results and economists should be, too. It’s fine if a few specialists labor at the outer mathematical edge of these fields, but the real needs and real satisfactions are to be found in applications.
Western civilization has brought us an explosion of human welfare: prosperity, longevity, education, the arts, and so on. We very much need the wisdom that economists can offer us to help understand and sustain this remarkable record. What good are engineers’ accomplishments in crash simulations if the benefits are denied to the world by trade barriers, stifling regulation, congested highways, or bogus global warming restrictions? What can mathematical economics contribute to such vital issues?
You can read the whole thing here (it’s also in the newly-renovated ‘recommendations’ section). Highly, uh, recommended! Mathematics is an important aspect of economics, I think, but Dr. Gibson and other Austrian critics are more focused on the models that economists create than on all of the wonderful data that can be quantified for calculation purposes.
I may be wrong, and I hope that my co-bloggers or others will correct me if I am, but either way Dr. Gibson’s critique of the mathematical models employed in the economics profession needs another good look.