- Who’s who in Hamburg’s G20 protests
- “But, if Marxism is not inevitable, it is nothing. Ronald Reagan, with his abiding fear that the Evil Empire would spread without intervention, was, in this sense, a much better Marxist than David Roediger could ever hope to be.”
- It’s business as usual between Turkey and the EU
- “So far there is not much sign of the fresh dawn that IS’s downfall should bring.”
- Hell Makes the News
Yesterday, here at Notes on Liberty, Nicolas Cachanosky blogged about the minimum wage. His point was fairly simple: criticisms of certain research designs that use limited samples can be economically irrelevant.
To give you some context, he was blogging about one of the criticisms made of the Seattle minimum wage study produced by researchers at the University of Washington, namely that the sample was limited to “small” employers. This criticism, Nicolas argues, is irrelevant: the researchers were looking at those most likely to be heavily affected by the minimum wage increase, since the effects will be concentrated among the least efficient firms. In other words, what is the point of looking at Costco or Walmart, which are more likely to survive than Uncle Joe’s store? That, in essence, is Nicolas’ point in defense of the study.
I disagree with Nicolas here, and that is precisely because I agree with him (I know, it sounds confusing, but bear with me).
The reason is simple: firms react differently to the same shock. Costs are costs and productivity is productivity, but the constraints are never exactly the same. For example, if I am a small employer and the minimum wage increases by 15%, why would I fire one of my two employees to adjust? If that were my reaction, I would sacrifice a third of my output (the marginal worker contributes less than the first) for a 15% increase in wages, which make up the majority but not the totality of my costs. Adjusting on that margin would be senseless given the constraint of my firm’s size. I would be more tempted to cut hours, cut benefits, cut quality, substitute between workers, or raise prices (depending on the elasticity of demand for my services). However, if I am a large firm of 10,000 employees, sacking one worker is an easy margin to adjust on, since I am not as constrained as the small firm. In that situation, a large firm might be tempted to adjust on that margin rather than cut quality or raise prices. In short, firms respond to higher labor costs (not accompanied by greater productivity) in different ways.
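The small employer’s trade-off can be sketched numerically. All figures below are hypothetical assumptions invented for illustration (they are not taken from the Seattle study):

```python
# Illustrative sketch of the small employer's adjustment problem.
# Every number here is a made-up assumption for the sake of the example.

revenue = 120_000          # annual revenue of the small firm
wage_bill = 60_000         # wages: the majority, but not totality, of costs
other_costs = 30_000       # non-labor costs
employees = 2

hike = 0.15                # a 15% minimum wage increase

# Option 1: keep both workers and absorb the higher wage bill.
profit_keep = revenue - wage_bill * (1 + hike) - other_costs

# Option 2: fire one of the two workers. This halves the wage bill but,
# by assumption, sacrifices a third of output (diminishing marginal product:
# the remaining worker covers the most productive tasks).
revenue_after_firing = revenue * (2 / 3)
profit_fire = revenue_after_firing - (wage_bill / 2) * (1 + hike) - other_costs

print(f"absorb the hike: profit = {profit_keep:,.0f}")   # 21,000
print(f"fire one worker: profit = {profit_fire:,.0f}")   # 15,500
```

Under these assumed numbers, firing a worker is the worse margin for the small firm, which is exactly why it would reach for hours, benefits, quality, or prices instead.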
By concentrating on small firms, the authors of the Seattle study were concentrating on a group with a probably more homogeneous set of constraints and responses. In their case, they were looking at hours worked. Had they blended in the larger firms, they would have been mixing in firms less likely to adjust by compressing hours and more likely to adjust by compressing the workforce.
This is why the UW study is so interesting in terms of research design: it focused like a laser on one adjustment channel in the group most likely to respond in that manner. If one reads the paper attentively, it is clear that this is the authors’ aim: to better document this element of the minimum wage literature. If one seeks to measure the full costs of the policy, one would need a much wider research design to reflect the wide array of adjustments available to employers (and workers).
In short, Nicolas is right that research designs matter, but he misses that this criticism of the UW study is really an instance of pro-minimum-wage-hike pundits shooting the hockey puck into their own net!
A recent study on the effects of the minimum wage in the city of Seattle has produced some conflicting reactions. As most economists expected, the significant increase in the minimum wage resulted in job losses and bankruptcies. Others, however, doubt the validity of the results, given that the sample may be incomplete.
In this post I want to focus on just one empirical problem. An incomplete sample is not in itself a problem. The issue is whether or not the observations missing from the sample are relevant. This problem has been pointed out before as the Russian Roulette Effect, which consists in asking survivors of a minimum wage increase whether the increase has put them out of business. Of course, the answer is no. With regard to Seattle, a concern might be that fast food chains such as McDonald’s are not properly included in the study.
The first reaction is: so what? Why is that a problem? If the issue is to show that an increase of wages above their equilibrium level is going to produce unemployment, all that has to be shown is that this actually happens, not where it does not happen. This concern about the Seattle study misses a key point of the economic analysis of minimum wages. The prediction is that jobs will be lost first among less efficient workers and less efficient employers, not equally across all workers and employers. More efficient employers may be able to absorb a larger share of the wage increase, cut compensation, delay lay-offs, etc. This follows from the fact that demand is downward sloping and that a minimum wage above its equilibrium level “cuts” the demand curve in two. Some employers are below the minimum wage (the less efficient ones) and others are above it (the more efficient ones). Let’s call the former “Uncle’s diner” and the latter “McDonald’s.” This is how it looks in a demand and supply graph:
Surely, there is some overlap. But the point this graph makes is that looking at the effects of the minimum wage above the red line is looking in the wrong place. A study looking for the effect on employment needs to look at what happens below the red line. This sample, of course, has less information available than fast food chains such as McDonald’s; this is a reason why some studies focus on what can be seen even if the effect happens in what cannot be seen (and this is a value added of the Seattle study).
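The “cuts the demand curve in two” logic can be sketched with a toy example. The employers and the highest wage each can profitably pay are invented for illustration:

```python
# Toy sketch of a minimum wage "cutting" labor demand in two.
# Each hypothetical employer is represented by the highest wage at which
# hiring is still profitable for it (a stand-in for worker productivity there).

employers = {
    "Uncle's diner":  11.00,   # less efficient: can pay at most $11/hour
    "corner laundry": 12.50,
    "McDonald's":     16.00,   # more efficient: can absorb higher wages
    "Costco":         18.00,
}

minimum_wage = 13.00  # hypothetical statutory floor, above equilibrium for some

# Employers below the floor are where the disemployment effect concentrates;
# employers above it are where a study would find little or no effect.
below = [name for name, w in employers.items() if w < minimum_wage]
above = [name for name, w in employers.items() if w >= minimum_wage]

print("shed jobs or exit:", below)
print("likely to survive:", above)
```

A sample drawn only from the “above” group would, by construction, find almost no effect; that is the Russian Roulette Effect in miniature.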
This is why it is important to ask: what do minimum wage advocates expect to find by increasing the sample size? To question whether minimum wages increase unemployment, the critics also need to focus on the “Uncle’s diner” part of the demand curve. If the objective is to inquire about something else, then that has no bearing on the fact that minimum wage increases do produce unemployment in the minimum wage market, and first at its less efficient (and harder to gather data on) portion.
PS: I have a previous post on minimum wages that can be found here.
There are only four ways to spend money:
- You spend your own money on yourself.
- You spend your own money on others.
- You spend someone else’s money on yourself.
- You spend someone else’s money on others.
Just think about it for one second and you will agree: these are the only possible ways to spend money.
The best way to spend money is to spend your own money on yourself. The worst is to spend someone else’s money on others.
When you spend your own money on yourself, you know how much money you have and what your needs are.
When you spend your own money on others, you know how much money you have, but you don’t know others’ needs as well as they do.
When you spend someone else’s money on yourself, you know your needs but you don’t know how much money you can actually spend. In a situation like this, some people will shy away from spending much money (when they actually could) and end up not totally satisfied. Others will have no such qualms and will spend like crazy, not noticing that they are spending too much. It doesn’t matter which kind of person you are: the fact is that this is not a good way to spend money.
When you spend someone else’s money on others, you have the worst-case scenario: you don’t know others’ needs as well as they do, and you also don’t have a clear grasp of how much money you can actually spend.
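The four quadrants above can be encoded in a few lines. The summary labels are my own illustrative shorthand for the argument, not data:

```python
# Toy encoding of the four spending quadrants described above.
# The returned labels are illustrative summaries of the incentive problem.

def spending_incentives(own_money: bool, on_self: bool) -> str:
    """Describe the knowledge problem in a given spending quadrant."""
    if own_money and on_self:
        return "knows budget and needs: best case"
    if own_money and not on_self:
        return "knows budget, guesses needs"
    if not own_money and on_self:
        return "knows needs, guesses budget"
    return "guesses both: worst case"

# Government spending falls squarely in the last quadrant:
print(spending_incentives(own_money=False, on_self=False))
# guesses both: worst case
```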
In a truly liberal capitalist society, the majority of money is spent by individuals on their own needs. The more a society drifts away from this ideal, the more people spend money that is not actually theirs on other people, and the more money is misused.
The government basically spends money that is not its own on other people. That’s why government spending is usually bad. Even the most well-meaning government official, the one most in line with your personal beliefs, will probably not spend your money as well as you could yourself.
Since yesterday was Independence Day, I thought I should share a recent piece of research I made available. A few months ago, I completed a working paper, now accepted as a book chapter, on public choice theory insights for American economic history (which I have talked about before). That paper argued that the American Revolutionary War that led to independence resulted in part from a string of rent-seeking actions (disclaimer: the title of the blog post was chosen to attract attention).
The first element of that string is that the Americans were given a relatively high level of autonomy over their own affairs. However, that autonomy did not come with full financial responsibility; the American colonists were still net beneficiaries of imperial finance. As the long period of peace that lasted from 1713 to 1740 ended, the British began to spend increasingly large sums on the defense of the colonies. By subsidizing defense, the British were in effect inciting the colonists to take aggressive measures that might benefit them (i.e. raid instead of trade). Indeed, the benefits of any land seized through conflict would largely fall in the colonists’ lap while the British ended up with the bill.
The second element is the French colony of Acadia (in modern-day Nova Scotia and New Brunswick). I say “French,” but it wasn’t really under French rule. Until 1713, it was nominally under French rule, but the colony of a few thousand was in effect a “stateless” society, since the reach of the French state was non-existent (most of the colonial administration in French North America took place in the colony of Quebec). In any case, the French government cared very little for the colony. After 1713, it became a British colony, but again the rule was nominal, and the British tolerated a conditional oath of loyalty (basically an oath of neutrality, which speaks to the crown’s limited ability to enforce its desires in the colony). Yet it was probably one of the most prosperous colonies of the French crown, and one where – as historians admit – the colonists were on the friendliest of terms with the Native Indians. Complex trading networks emerged which allowed the Acadians to acquire land rights from the native tribes in exchange for agricultural goods harvested thanks to sophisticated irrigation systems. These lands were incredibly rich, and they caught the attention of American colonists who wanted to expel the French settlers, who, to top it off, were friendly with the natives. This led to a drive to actually deport them. When the deportation occurred in 1755 (half the French population was deported), the lands were largely seized by American and British settlers in Nova Scotia. They got all the benefits, while the crown paid the considerable military expenses; and it was done against the wishes of the imperial government, as an initiative of the local governments of Massachusetts and Nova Scotia. This was clearly a rent-seeking action.
The third link is that the governing coalitions in England included government creditors who had strong incentives to control government spending, especially given the constraints imposed by debt-financing the intermittent wars with the French. These creditors saw the combination of local autonomy and the lack of financial responsibility for that autonomy as a call to centralize the management of the empire and avoid such problems in the future. This drive towards centralization was a key factor, according to historians like J.P. Greene, in the initiation of the revolution. It, too, was a result of rent-seeking, on the part of actors in England protecting their own interests.
As such, the history of the American revolution must rely in part on a public choice contribution in the form of rent-seeking, which paints the revolution in a different (and less glorious) light.
In 1962, France and the Algerian nationalists came to an agreement on Algerian independence. That was after 130 years of French colonization and eight years of brutal war, including war against civilians. As a young sailor, I participated in the evacuation of a large number of French civilians from the country. The number who wanted to leave was much greater than anyone expected. It was too bad that they left in such large numbers; it was a pity for all concerned. The events were a double tragedy, or a tragedy leading to a tragedy. The Algerian independence fighters who had prevailed by shedding quantities of their blood were not, I repeat not, Islamists. In most respects, intellectual and otherwise, they were a lot like me.
The true revolutionaries were soon replaced, however, by professional soldiers whom I think of as classical but fairly moderate fascists. I went back to Algeria six years after independence. I was warmly received, and I liked the people there. People invited me to lunch; I shared with them the fish I caught, and a baby camel tried to browse my hair in a café.
I still think the nationalists were on the right side of the argument, but I miss Algeria nevertheless. It’s like a divorce that should not have happened. And I am very sorry about where French incompetence and rigidity led everyone, especially the Algerians, who keep migrating to France in huge numbers because they cannot find what they need at home.
As a tribute to the great events that occurred 241 years ago, I want to recognize the importance of unity of purpose in supporting liberty in all of its forms. While an unequivocal statement of natural rights and the virtues of liberty, the Declaration of Independence also came close to bringing another vital aspect of liberty to the forefront of public attention. As has been discussed in multiple fascinating podcasts (Joe Janes, Robert Olwell), a censure of slavery and of George III’s connection to the slave trade appeared in the first draft of the Declaration.
Thomas Jefferson, often criticized for the inherent contradiction between his high morals and his active participation in slavery, was a major contributor to the popularization of classical liberal principles. Many have pointed to his hypocrisy: he owned over 180 slaves, fathered children by them, and did not free them in his will (because of his debts). Even given his personal slaveholding, Jefferson made his moral stance on slavery quite clear through his famous efforts toward ending the transatlantic slave trade, early steps in securing the abolition of the repugnant institution of chattel slavery in America and in applying classical liberal principles to all humans. However, abolition might have come far sooner, avoiding decades of appalling misery and its long-reaching effects, had his (hypocritical but principled) position been adopted from the day of the USA’s first taste of political freedom.
This is the text of the deleted Declaration of Independence clause:
“He has waged cruel war against human nature itself, violating its most sacred rights of life and liberty in the persons of a distant people who never offended him, captivating and carrying them into slavery in another hemisphere or to incur miserable death in their transportation thither. This piratical warfare, the opprobrium of infidel powers, is the warfare of the Christian King of Great Britain. Determined to keep open a market where Men should be bought and sold, he has prostituted his negative for suppressing every legislative attempt to prohibit or restrain this execrable commerce. And that this assemblage of horrors might want no fact of distinguished die, he is now exciting those very people to rise in arms among us, and to purchase that liberty of which he has deprived them, by murdering the people on whom he has obtruded them: thus paying off former crimes committed against the Liberties of one people, with crimes which he urges them to commit against the lives of another.”
The Second Continental Congress, on the basis of hardline votes from South Carolina and the desire to avoid alienating potential sympathizers in England, slaveholding patriots, and the Northern harbor cities complicit in the slave trade, dropped this vital statement of principle.
The removal of the anti-slavery clause of the Declaration was not the only time Jefferson’s efforts might have led to an early end of the “peculiar institution.” Economist and cultural historian Thomas Sowell notes that Jefferson’s 1784 anti-slavery bill, which had the votes to pass but failed because of a single ill legislator’s absence from the floor, would have ended the expansion of slavery to any newly admitted states of the Union, years before the Constitution’s infamous three-fifths compromise. One wonders whether America would have seen a secessionist movement or a Civil War, and how the economies of states from Alabama and Florida to Texas would have developed without slave labor, which in some states and counties constituted the majority of the population.
These ideas form a core moral principle for most Americans today, but they are not hypothetical or irrelevant to modern debates about liberty. Though America and the broader Western world have brought the slavery debate to an end, the larger world has not; though every country has officially made enslavement a crime (true only since 2007), many within the highest levels of government aid and abet the practice. Some 30 million individuals around the world suffer under the same types of chattel slavery seen millennia ago, including in nominal US allies in the Middle East. The debate between the pursuit of non-intervention as a form of freedom and the defense of the liberty of others as a form of freedom has been consistently important since the 1800s (or arguably earlier), and I think it is vital that these discussions continue in the public forum. I hope this 4th of July reminds us that liberty is not just a distant concept, but a set of values that requires constant support, intellectual nurturing, and pursuit.