New Paper: Historical Narratives and the Defense of Stigmatized Industries

15 06 2021

I’m proud to announce the publication of our paper in the Journal of Management Inquiry, which explains how entrepreneurs in stigmatized industries can try to fight stigmatization.

Abstract: This study examines how managers and entrepreneurs in stigmatized industries use historical narratives to combat stigma. We examine two industries: the private military contractors (PMC) industry in the United States and the cannabis industry in Canada. In recent decades, the representatives of these industries have worked to reduce the level of stigmatization faced by the industries. We show that historical narratives were used rhetorically by the representatives of both industries. In both cases, these historical narratives were targeted at just one subset of the population. Our research contributes to debates about stigmatization in ideologically diverse societies, an important issue that has been overlooked by the existing literature on stigmatized industries, which tends to assume the existence of homogeneous audiences when researching the efforts of industry representatives to destigmatize their industries.

You can read the full paper here.





Why Individual Organizations, Not The State, Should Cover The Costs of Historic Crimes

10 06 2021

Vancouver Art Gallery community memorial to the 215 buried children discovered at Kamloops Residential School. The main memorial consists of 215 pairs of children’s shoes, along with various accessories including teddy bears, books, images, and flowers. Image Source.

In recent weeks, a sensational discovery on the grounds of a Canadian residential school has attracted international attention and has raised important questions about where responsibility should be placed. From the late nineteenth century onwards, the Canadian government had a policy of requiring all Indigenous children to attend school. In Indigenous communities in which there wasn’t a European-style school, that meant that children were forcibly taken from their families and sent to boarding schools designed to Christianize them and to assimilate them into Euro-Canadian culture. In many cases, these state-subsidized schools were run by churches, often the Roman Catholic Church. The Roman Catholic Church, which still has a privileged constitutional status in parts of Canada, enjoyed tremendous political power in the period in question due to the nature of the political settlement that produced the modern Canadian constitution in 1867.

Regardless of whether the residential schools were managed by celibate Catholic clerics or by Protestants who were married family men, the entire system was deplorable. Even when the residential schools were well managed by the standards of the time (e.g., well ventilated, adequate food, no sexual abuse, etc.), the residential school policy was a deeply illiberal one that is today universally condemned across the Canadian political spectrum. The policy can be condemned on multiple grounds: it violated a number of important principles, including Indigenous rights, parental autonomy, and the principle that the state should be religiously neutral. Moreover, the child mortality rate in these schools appears to have been far higher than the baseline rate that would have prevailed in Indigenous communities at that time. Simply put, by taking children away from their families and putting them in institutions, the policy reduced the proportion of Indigenous children who reached adulthood. Regardless of whether someone today views the residential school policy through a left-wing perspective or a politically conservative lens, the policy is indefensible. A number of years ago, the Canadian government formally apologized for the residential school policy in a highly public ceremony and offered compensation to those who were victimized by it. In connection with this apology, the Canadian government launched a Truth and Reconciliation Commission modelled on the one once led by Desmond Tutu in South Africa. This commission heard testimony from many survivors. In an effort to reconstruct the facts of the case and to establish exactly what happened in the residential schools, the government commissioned extensive historical research, some of which was executed by people with PhDs and by respected historical consulting firms such as the excellent Public History Inc of Ottawa.

Apparently, the historical research undertaken using oral-historical and archival methods missed some of the horrors of the residential school programme. A few weeks ago, the First Nations government near Kamloops, British Columbia announced that archaeologists they had hired had discovered a large number of human remains in unmarked graves on the site of a residential school. It is estimated that over 200 children were buried on the site, which strongly suggests that the managers of the school, who recorded far fewer than 200 child deaths in official documents, knew that the death rate was unacceptably high, even by the standards of the time, and tried to cover it up. The discovery has generated outpourings of grief in Canada from both Indigenous and non-Indigenous Canadians, has renewed discussions of abuses by other church-run organizations, such as the infamous Mount Cashel Orphanage in Newfoundland, and has raised profound questions about the relationship between church and state in that country. Although Canada’s monarch must, by law, be a Protestant, the Canadian Crown and the Catholic Church have had an extremely cosy relationship since at least 1774, when a new constitutional order was granted to Canada.

Overseas, the revelations out of Kamloops have renewed ongoing discussions about some of the equally terrible things the Catholic Church did in other countries at roughly the same time. The residential school in Kamloops was run by the Oblates, who recruited teachers from Irish convents. The Kamloops mass grave story has prompted many comparisons with the infamous Bon Secours home for unwed mothers in Ireland. A few years ago, radar imaging there revealed a mass grave of malnourished children whom the nuns had mistreated because they had been born out of wedlock and were thus, in their eyes, morally tainted. The remains of almost a thousand children were found buried under a septic tank, which further eroded the legitimacy of the Catholic Church in Ireland. These births and deaths were never registered with the civil authorities in Ireland, which is precisely the same modus operandi used in Kamloops. So the Kamloops revelations aren’t just a story about the residential schools or race relations. For obvious political reasons, successive Canadian governments have tried to steer the conversation about residential schools away from comparisons between the Catholic, Anglican, United Church, and secular schools, or between the residential schools run by the different orders of the Roman Catholic Church. (For roughly 80% of the last half century, Canada’s Prime Ministers have been Roman Catholic.) The long-overdue discussion of that issue is now taking place.

Another discussion concerns whether additional compensation for the victims of the worst residential schools should come from the state (i.e., all taxpayers regardless of their religious commitments), from the incorporated organizations that managed the schools in question, or from both in a weighted fashion. This issue should interest anyone anywhere in the world who is interested in corporate responsibility or business ethics. [Full disclosure: I’ve published in the Journal of Business Ethics twice. One of those articles was about the Hudson’s Bay Company’s present-day responsibility for its historic relationships with Indigenous peoples.]

Pretty basic law and economics theory strongly suggests that, as a general rule, the onus to provide compensation for corporate misdeeds should be placed on the responsible organization rather than diffused widely across all of society. If our goal is to incentivize organizations to behave better in the future, we need to use this moment to make it clear that the burden of compensation must fall on the organization in question. When I drive too fast, I don’t share the costs of the speeding ticket with everyone on my street. If we shifted to a system whereby speeding ticket costs were spread from individual drivers across entire streets, average speeds and the highway death toll would go up. Allowing the incorporated organization that ran the Kamloops school, which happens to be the Canadian subsidiary of the Missionary Oblates of Mary Immaculate, or indeed any other organization, to say “society or the government made me do it” would set a dangerous precedent that would be observed by other non-profit and commercial organizations right now.

Ideally, we would also like to hold the individual natural persons who had decision-making power responsible as well, but since they are almost certainly deceased by this point, responsibility needs to go up one level to the corporation rather than being shared between the corporation and the individuals.  

In thinking about corporate responsibility for criminal actions that took place long ago, some people might be inclined to apply “statute of limitations” thinking and say that after so much time has elapsed, individuals and corporations should no longer be held accountable for misdeeds. When I say “held accountable” I am referring both to legal liability and to the informal norms that govern when we are expected to compensate others for our misdeeds, since both are potentially relevant here. I think that it would be a mistake to apply statute of limitations or “time limits” thinking in this case. While there are good reasons for thinking that the existence of the statute of limitations rule produces positive social outcomes, there are strong justifications for the modification of that rule which says that the statute of limitations doesn’t apply when the misdeed in question was very serious and has long-term implications. In the laws of both England and Canada, the statute of limitations applies to small crimes, not big ones. It basically applies to shoplifting, not murder, and that two-tier approach is probably the most efficient one, with the right social outcomes.

Similarly, the social norms that require people to compensate others for their misdeeds also distinguish between minor misdeeds, where claims for compensation must be presented immediately, and really serious misdeeds. I recall the case of a restaurant where it was revealed that the child of the owners had engaged in paedophilia on that property many decades earlier. (The individual malefactor was dead by the time of the revelations, but the restaurant was still owned by relatives of the founder.) Although the owning family and the restaurant corporation escaped any legal liability for what the family member had done, social sanctioning kicked in and the business soon folded, despite desperate efforts to retain customers by slashing prices. Because it was witnessed by so many organizations in the community, the bankruptcy of that restaurant firm was probably a positive development, since it reinforced the salutary norm that an organization will be punished for serious misdeeds no matter how long ago they took place. I would imagine that this example increased the incentives for families to police the behaviour of their members. If we assume that managers care about both short-term and long-term reputational costs to firms, we want to make it very clear that the time limits/statute of limitations principle does not apply to the most serious corporate crimes.

Bottom line: the organization that managed the school should be viewed by both policymakers and ordinary people as primarily responsible for what happened in Kamloops. This organization should be subject to legal liability and informal social sanctioning similar to that applied to the aforementioned restaurant. Perhaps the state could contribute to the costs of providing the additional compensation that is now being requested, but only once the resources of the organization have been exhausted. Any other arrangement would be both unfair to Canadian taxpayers and likely to reduce the incentives of corporations to behave ethically. How the government acts now will change the incentive structure for many organizations. I think that asking Canadian taxpayers to cover these costs would be particularly unfair to those taxpayers who have abandoned the Catholic faith of their ancestors, whose ancestors arrived in Canada after the mandatory residential school programme ended, or whose ancestors at the time went on record challenging the idea that the production of Roman Catholicism is a legitimate public good that the state should be in the business of subsidizing. During the period in which this residential school was in operation, there were many people who objected to any sort of state support for the Catholic Church. From the 1850s onwards, most Canadian Protestants subscribed to the belief that the state should provide a level playing field for religions and that different denominations should compete on a free-market basis, a doctrine called Free Trade in Religion or Voluntaryism. During the middle of the nineteenth century, there were eloquent proponents of this view in the Free Presbyterian Church and other denominations. Had that principle been applied consistently after the creation of the new Canadian constitution in 1867, social outcomes would have been superior, in my view.





Business History Special Issue on Varieties of Capitalism

26 05 2021

The UK journal Business History recently published a special issue on Varieties of Capitalism. I’ve finally had a chance to take a close look at this really important special issue, which provides some desperately needed historical contextualisation for the Varieties of Capitalism literature. In reading it, I was reminded of two exchanges at conferences. First, I recall a seminar discussion at the BHC in Denver in which we got to discussing the Varieties of Capitalism theory with Veronique Pouillard, Richard R. John and others. We all agreed that the main variants of capitalism discussed by Hall and Soskice and their many followers are much newer than most researchers think. I also recall being at a conference mostly attended by political scientists, sociologists, and the like, where I astonished other attendees by repeating what Japanese business historians have told me, namely, that in the 1920s Japan was a Liberal Market Economy. That statement blew their minds, as it challenged their assumption that national varieties of capitalism are deeply rooted in national political and cultural histories. Personally, I had never been much convinced by the theory that the variant of capitalism in place in a given country had much to do with what had happened many centuries ago, as opposed to developments since the Second Industrial Revolution.

Anyway, congratulations to the team that produced this Special Issue. The editors are Niall G. MacKenzie (Glasgow), Andrew Perchard (Newcastle Business School, Northumbria University), Christopher Miller (Glasgow) and Neil Forbes (Coventry). Article contributors were Martin Shanahan and Susanna Fellman, María Fernández-Moya and Núria Puig, and Zoi Pittaki, Pasi Nevalainen and Ville Yliasko, Martin Eriksson, Lena Andersson-Skog and Josefin Sabo, and Beatriz Rodriguez-Satizabal, Julie Bower, and Grietjie Verhoef.





Some Big Unanswered Questions in Business History

3 05 2021

The business historian Peter Scott published the following on social media

“I am interested in what the BH [business history] community considers the most important business/economic history questions that we haven’t yet answered. I am not talking about emerging areas/new perspectives, but important questions for which there are major data or methodological problems to providing answers. Any examples would be much appreciated.”

This is an excellent post, as it deals with empirical questions in BH for which we don’t yet have answers. It seems, to me at least, that the following business-historical questions are the ones that most desperately need answering:

  1. To what extent were decisions by British managers responsible for the slowdown in UK productivity growth relative to Germany and the United States that began in the late 1890s and which is visible in the data produced by Nick Crafts, Stephen Broadberry, etc.? The Chandlerian explanation is that British firms were wedded to family capitalism longer than their German and American rivals. Alternative explanations include the hypothesis that the main culprit was educational malpractice: British business families spent too much money educating their sons in Latin and not enough in chemistry and the other subjects associated with the Second Industrial Revolution. This question calls for qualitative business historians to use firm-level data in corporate archives to help answer an empirical question that our colleagues in the field of economic history cannot answer using their preferred research methods.
  2. To what extent was the achievement by 1950 of leadership positions in most industries by US firms a function of factors other than the favourable geographical position of the United States, which insulated American firms from the most devastating effects of the two world wars? In his rejoinder to the claims of Chandler, Les Hannah argued that the real reason US firms overtook British firms in technological prowess had nothing to do with the trends perceived by Chandler and everything to do with the fact that the US, unlike the UK, wasn’t bombed and otherwise devastated during the world wars.
  3. How did decisions taken within British firms cause and respond to the putative Engels’s Pause of the nineteenth century? The Engels’s Pause was a period of about a generation in which working-class living standards stagnated even though GDP per capita and TFP continued to increase. There is intense interest today in this Pause because it appears to be similar to the experience of the United States since about 1980.
  4. Which firms took the lead in developing the global factories associated with the Global Value Chain Revolution of the 1980s and 1990s? Answering this question would involve using largely qualitative methods and firm-level data to deepen our understanding of the unbundling pattern that the economist Richard Baldwin identifies as the Global Value Chain Revolution.
  5. To what extent did managerial decisions contribute to the post-1973 growth slowdown in the United States? Explaining the growth slowdown is literally a trillion dollar question and any business historian who can make a credible contribution to the debate about what caused it and what can be done to solve it would, in my view, be well rewarded in professional terms.

Please note that I am aware that these questions have a very narrow North Atlantic economies comparative focus. I’m certain that business historians who formulated research questions about the so-called Great Divergence would also be doing very important work and would be justly rewarded by the academic labour market. Similarly, I’m aware that these questions all relate to the last 150 or so years and that research questions on older topics, such as the long-term roots of the British Industrial Revolution or the role of the Scientific Revolution in the Great Divergence would also be important.





Some Thoughts Inspired by Listening to Imran Ahmad Khan MP

28 04 2021

Imran Ahmad Khan, Member of Parliament for Wakefield, is a rising force in UK politics. In part because he achieved a historic victory over a Labour incumbent in a very working-class, traditionally Labour town, Khan is seen by many as the future of the modern Conservative Party. He is one of the most visible gay Muslim politicians in the UK. A graduate of the war studies postgraduate programme at King’s College London, he has particularly interesting things to say about geopolitics.

He recently sat down with retired economic history professor and all-round interesting smart guy Steve Davies to talk about the future of free trade and British foreign policy (listen here). Khan spoke about the UK’s long history of promoting free trade and the long-standing belief that international free trade can help to promote world peace. Historically, this theory informed the robust advocacy of free trade by a range of British and American intellectuals and political leaders from Adam Smith and Richard Cobden to Cordell Hull and other architects of the GATT in the 1940s. In the 1990s, many leaders in the Western democracies were too confident that the process of integrating into the world economy would make authoritarian regimes peaceful and, eventually, democratic. That optimistic theory pervaded the academic literature in that era, informed the thinking of the Clinton administration, and was popularized by the journalist Tom Friedman with his Golden Arches theory of world peace.

Today, nobody is quite so sanguine about the ability of globalization and open markets to make the world peaceful and to promote human rights. Rather than being on the cusp of a “South Korea in the 1980s”-style transition to democracy, some of the world’s authoritarian regimes have become more belligerent and more domestically repressive as they have grown wealthy via globalization. We see that pattern in a couple of the large, nuclear-armed countries in what was formerly known as the Communist bloc. This pattern is the precise opposite of what capitalist peace theory would lead one to predict.

Liberal democracies now need to come up with sensible ways of managing their trade with countries that are both economically important and politically authoritarian. There are three basic approaches open to us.

One approach is to systematically cut off all of the commercial relationships that currently connect firms and households in the liberal democracies with firms and households in authoritarian regimes, which is the approach advocated by some of the people who were around Trump, such as Steve Bannon. This approach led to Trump announcing a ban on Americans using TikTok, a harmless app that had been developed by a firm that happened to be located in an authoritarian country. Trump’s TikTok ban, which was ultimately blocked by the courts, would have deprived American consumers of a fun little app and was unfair to the app’s creators, none of whom have been shown to be responsible for any human rights abuses.

As Steve Davies and Professor (and now Lord) Syed Kamall have argued in a recent book, adopting the approach typified by the TikTok ban would be an act of national self-harm with serious economic consequences. Moreover, it would be a deeply illiberal exercise in collective punishment, as it would punish entirely blameless individuals and firms who happen to live in authoritarian states. It would also be counterproductive, as such an approach would cut off the flow of liberal ideas to authoritarian regimes. Davies and Kamall rightly stress that last point.

At the other extreme, there are those who say that the degree to which people in a given capitalist country interact with firms in an authoritarian country should be entirely determined by the free market. This approach would mean that if some consumers in the UK want to, say, buy goods produced by unfree labour in an authoritarian regime, or if a company wishes to sell high-tech dual-use technology to that country, the government shouldn’t interfere with their freedom to do so. Let’s call that the hardcore or maximalist libertarian approach.

A middle-of-the-road approach says that democratic governments need to set ground rules about what type of trade their citizens are allowed to do with firms and individuals in authoritarian countries. Most Western countries seem to be gravitating towards that approach. The trillion-dollar question, however, is how these rules should be set and which principles should guide their development. Who gets to decide whether a given transaction between citizens of a democracy and citizens of an authoritarian regime gets to go ahead?

Should decisions about which transactions we are going to allow to proceed be made by a centralized process involving a small number of individuals working in secrecy? Or should such decisions be made through a more inclusive, distributed process? In other words, should the decisions be made by government officials who issue diktats saying “We aren’t going to allow this transaction to go ahead because of national security and stuff”, or do we want decisions to be more transparent and to involve multiple veto points?

There are good reasons for thinking that the following principles should apply here: consistency with least-common-denominator shared values, transparency, and decentralized enforcement.

There are many problems with the centralized process approach that the US adopted under Trump. Most obviously, the problem with banning specific commercial transactions on vague “national security” grounds is that the justifications for such bans are usually unclear and based on classified information. “National security”, an American term that has now spread to many other countries, can be invoked in almost any case, as when Trump limited Canadian steel exports to the US on specious national security grounds.

A more fundamental problem with the centralized approach is that not all citizens may agree with the conception of national security that informs the thinking of the person or persons who are charged with deciding if a given transaction undermines national security. We wouldn’t entrust a single policymaker with the power to block all proposed transactions that are “unethical” in that policymaker’s opinion, because people’s definitions of what is “unethical” are so diverse. If we are going to interfere with freedom of contract by allowing democratic governments to block some transactions involving our citizens and the citizens of authoritarian states, we need very clear standards that accord with the values of virtually all citizens. (I say virtually all because I know that 100% unanimity is impossible.)

Another problem with the approach typified by the Trump administration’s banning of TikTok on national security grounds is that decision-making about which transactions to permit is highly centralized and concentrated in the hands of just a few bureaucrats, which is almost never a good idea. Such top-down, centralized approaches are usually counterproductive because of what Hayek called the Knowledge Problem and also because of the enhanced probability of bureaucratic self-interest that emerges whenever you entrust decisions to a small group of unelected officials.

So how should democratic states make and enforce rules about the sorts of transactions their citizens are not allowed to enter into with individuals and firms in authoritarian countries? There are strong theoretical reasons for believing that the best way of filtering trade between liberal democracies and authoritarian regimes is to create a distributed process for deciding which transactions are going to be banned. Ideally, this process should be grounded in moral principles that command near-universal assent rather than vague judgement calls about “national security”. For instance, while many Americans clearly disagree with the principle that defending Latvia from Russia is something that US taxpayers ought to be doing, virtually everyone in that country believes that murdering individuals because of their political views is wrong. That’s why there is now widespread support for Magnitsky legislation in many Western countries. (Such laws, which are named after a Russian lawyer who was murdered by Russian officials, prevent the individuals who were involved in such crimes from doing business in the country that passes them.)

Similarly, there is now nearly universal support for the idea that de facto enslaving people so they can produce a commodity for export is deeply immoral. I’m certain that if you look on the internet hard enough you can find a couple of racists who think that the abolition of slavery by Abraham Lincoln was a bad thing but I think we can all accept that 99% of the citizens of Western democracies now believe that slavery and social systems closely akin to slavery are wrong. Similarly, 99% of citizens in Western democracies believe that genocide is everywhere and always wrong.

The lowest-common-denominator norms against murder and slavery, not vague ideas about national security or the national interest, should, in my view, be the basis of laws in Western countries for limiting transactions between their citizens and persons in authoritarian regimes.

We should also make the enforcement of these laws a more distributed process, so that instead of relying on government agencies to impose fines on firms and individuals that break the rules, Western countries create tort systems to incentivize firms and individuals not to enter into commercial transactions that facilitate murder and slavery in authoritarian regimes. Such tort systems could be modelled on the Alien Tort Statute in the US, which is currently being used in an effort to discourage multinational firms from using commodities produced by unfree labour. (We should all be following the case of Nestlé USA Inc vs Doe, which is now before SCOTUS.) The virtue of this approach is that it would force people who believe that a given transaction would tend to result in more murder and slavery in such countries to publish evidence to support their claims. Such evidence might take the form of satellite photos and eyewitness accounts of escapees. A tort-based approach involves more people in the process than a bureaucratic approach, which should result in better decisions. Moreover, a tort-based approach harnesses the profit motive of lawyers and private investigators to help gather and assemble evidence of wrongdoing overseas.





Some Thoughts About Laurentian

19 04 2021

My first tenure-track job was at Laurentian University in Ontario, Canada. I worked there for three years before coming back to the UK, where I have pursued my academic career ever since. I grew considerably during my time at Laurentian: my capabilities as a classroom teacher improved immensely and my French benefitted from the fact I was in a bilingual organization. I also benefitted from being exposed to a part of Canada very different from the parts of the country I previously knew, which were basically affluent cities and suburbs.

Northern Ontario School of Medicine, Laurentian University Campus

By Matt Strickland at English Wikipedia, CC BY-SA 3.0, https://commons.wikimedia.org/w/index.php?curid=16728682

I was saddened to hear that this university had declared itself insolvent and was undergoing reorganization under Canada’s CCAA, which is the equivalent of Chapter 11 in the United States or Administration in the insolvency law of England and Wales. I was also saddened by the huge cuts to a wide range of departments and degrees (ranging from civil engineering to midwifery to political science) that have had a negative impact on students, faculty, and other stakeholders.

I was distressed to see the blog post by higher education consultant Alex Usher in which he documented that the university’s president and board of directors had presided over a degree of mismanagement that, in Usher’s opinion, constitutes extremely bad business judgement. When a company goes bankrupt or goes into the local equivalent of Chapter 11, the shareholders usually aren’t entitled to sue the directors and top executives. However, they are allowed to sue if they can demonstrate to a court that the managerial judgement calls that caused the organization to fail were so incredibly bad that nobody familiar with the industry in question would regard them as a legitimate way to run an organization of that type. We are talking about the difference between circumstances in which people review the managerial decisions that contributed to insolvency and say “Yeah, I can see why they made that bet and it’s too bad it didn’t work out because of unforeseen factor X” versus “Holy crap! I can’t believe a manager in that type of organization would ever do that!” For a theoretically informed overview of what the law says about business judgement in England, Australia, and Delaware, see this recent paper in the Journal of Legal Studies here.

Now, all universities have suffered because of the pandemic, and universities can also suffer due to long-term trends such as a declining population in their region. Sometimes universities fail despite the directors and managers diligently discharging their duty of care to the best of their abilities. However, the language that Alex Usher uses in writing about the degree of bad decision-making at Laurentian University leads me to expect that there will be various lawsuits, including class action suits, against the directors and executives of this organization. I express absolutely no opinion here about whether these lawsuits will be successful or meritorious—for informed opinion about that, please talk to an expert on Canadian insolvency law and the duties of directors, such as Thomas Telfer. However, I’m surprised that no such lawsuits have already been announced. I would tend to attribute the absence of such lawsuits so far to the fact that Laurentian’s alumni and other stakeholders tend to live at some distance from the large law firms that have expertise in this specialised area of law. Another factor is that academics, the stakeholder group affected most directly by this event, generally do not use agency theory and the rational actor model to understand how senior university administrators operate. These are the approaches that I have used throughout my academic career to understand what is going on around me. In fact, I often encourage PhD students who want to have academic careers to read chapter 2 of Cracks in the Ivory Tower, a controversial book that applies agency theory to higher education. While I think that it is possible to go too far in applying agency theory to understand how universities operate, as the Cracks in the Ivory Tower book does, I think that these approaches have enough predictive power to be useful. I think, therefore, that people who want senior managers at other universities to make better decisions should think about what agency theory says can be done in this case. The academic literature on director liability in UK higher education and US higher education may be useful in thinking about these issues.

I also think that we need to think carefully about the circumstances in which universities and similar organizations should be allowed to file for Chapter 11 and its local equivalents. I think it would be a mistake to say that CCAA should never be available to such organizations. The CCAA process was created in the Great Depression to allow insolvent companies to restructure while remaining in business. It’s less dramatic than traditional bankruptcy. In a company with shareholders to whom the executives must answer and who know their investments will lose much of their value if the firm goes into CCAA, the CCAA process is only going to be used as a last resort. A non-profit university doesn’t have residual claimants with the same sort of control rights, so the ex ante personal costs to a university president and board members of triggering CCAA are lower than in a regular company. The CCAA route, which has massive social costs for creditors and other stakeholders, should only be used as a last resort. I think that it would be reasonable to say that university administrators should be allowed to use it, but only if they are willing to sacrifice some of their personal wealth.

During and after the 2008 financial crisis, which saw governments bailing out Wall Street firms and managers who had made bad choices, people from across the political spectrum complained, quite rightly, that we have a system in which top executives were rewarded by the market when their risky bets paid off and were rescued by the taxpayer when things went bad. “Capitalism for the profits, socialism for the losses” isn’t how a capitalist system is supposed to work, as the risk-reward matrix requires personal accountability by top decision-makers. Personal accountability should apply to well-paid managers in the private sector and to those in the public sector as well. I’m astonished that the president of Laurentian remains in his job.

I see that a Canadian MP has introduced a private member’s bill that would clarify that universities should not be allowed to use the CCAA process. I understand there is widespread support for this proposal, but I think that the bill should not be supported. First, to say that universities should NEVER be able to use CCAA is an extreme position. This bill, if passed, could force a university to declare bankruptcy at some point in the future. Moreover, it is a diversion from the more immediate action that is needed. Right now, we need something immediate to disincentivize other university managers from using CCAA unless it is absolutely necessary. A private member’s bill that would take forever to get Royal Assent isn’t going to work. A personal lawsuit against university administrators for mismanagement, launched by donors, faculty, alumni, and other stakeholders, would raise the ex ante costs for other university presidents who are considering whether to press the CCAA button. I understand that when Mountain Equipment Co-op’s board and CEO used the CCAA option, they were sued by the customers of that beloved brand. I’m not saying the CCAA option should never be available to the managers of universities, cooperatives, and other organizations that aren’t profit-seeking companies, merely that it should be a last resort that is only available to directors at the cost of some of their personal wealth.

A contact in Canada has brought a very interesting piece in the Globe and Mail to my attention. The article reports that the university’s creditors, which include TD Bank, had extended unsecured loans to the university on the assumption that in lending money to an Ontario university, they were effectively lending money to the Ontario government, which has a very high credit rating. The decision by the Ontario government to allow LU to go into CCAA suggests that, in future, lenders will regard each university as a separate borrower and will attach interest rates and conditions to loans that reflect each institution’s own credit rating. Perhaps Ontario universities will have to start issuing bonds that are rated on the issuer-pays model. US and UK universities already have bond ratings issued by Moody’s (for example, see here).

The article also discusses the political dimensions of this debacle and suggests that the increasing unpopularity of universities with people on the cultural right (think of fans of Jordan Peterson) may have something to do with the decision of the Ontario government not to bail out this organisation at a time when so many other organisations (airlines, restaurant chains, etc) are getting assistance from the taxpayer.  Konrad Yakabuski writes:

The Ford government was roundly despised within the ranks of the academy well before the Laurentian debacle landed on its plate. On the campaign trail, Mr. Ford railed against cancel culture on university campuses in his province. Only weeks after taking office in 2018, his Progressive Conservative government implemented new rules requiring post-secondary institutions to “protect free speech” and “not attempt to shield students from ideas or opinions that they disagree with or find offensive.” That put a lot of noses out of joint on campus, where woke culture rules with an iron fist.

Next up, the Ford government cut tuition fees by 10 per cent and froze them for two years, without topping up university operating grants. For smaller institutions, which already had a harder time than their bigger peers drawing premium-paying international students, the revenue crunch made already tight budgets unworkable. For Laurentian, which appears to have overextended itself on capital projects it had little business undertaking, the COVID-19 pandemic and subsequent loss of foreign students left it with no choice but to call in the bankruptcy experts.

I have long been concerned that universities in the English-speaking countries have been losing their social licence to operate because they have become too closely identified in the public mind with the political left, particularly with the new variant of the left that is closely associated with intersectional theory and identity politics. It has long been common knowledge that university professors tend to be a bit more left-wing than the average person their age: registered Democrats have outnumbered registered Republicans among US professors for many decades. However, until recently there were always at least a few prominent right-wing professors at the top universities. Hard data from the US (see here and here) and soft data from other culturally proximate countries suggest that universities have, since 2000, become less viewpoint diverse and downright hostile to conservatives. If the average centre-right voter, or even the median voter, forms the impression that universities nowadays only teach postcolonial transgender settler-colonial sociology and have become intolerant of conservatives, centrists, and even Old School Social Democrats, such a voter will tend to believe that universities, unlike obviously socially useful organizations such as airlines, just don’t deserve bailouts in crises. In early May, university leaders here in the UK were very disappointed when the government turned down their request for a comprehensive bailout package and instead offered much more limited support to universities.

The right wing of the Conservative Party was positively jubilant when they heard that the universities were being denied their request for a cash bailout. They were also pleased that the government said it would allow UK universities to go bankrupt. Now, if you take a close look at the fine print of the UK government’s rescue measures announced in May 2020, it is certainly true that they denied the request for a straight bailout, but they also announced a host of below-the-radar measures to help universities, such as giving them the right to collect a full year’s tuition fees in September and, crucially, urging university creditors to show “restraint”. My reading of the situation is that the UK’s Conservative government wanted to show its electoral base that it is going to be tough on the bearded Marxists in the higher education sector while quietly supporting a sector that has played such a crucial role in the UK response to the pandemic (think of the vaccine developed by Oxford scientists). Something similar may be going on in Canada.

In the case of Laurentian, however, we are talking about a university that is far more ideologically diverse and far more hospitable to small-c conservatives than the highly selective universities that serve more affluent students. Ontario policymakers should be aware that Laurentian is a relatively conservative, or at least viewpoint-diverse, university. It’s not the type of Woke University that Jordan Peterson warned you guys about. At Laurentian, I had a number of senior colleagues who were open about the fact that they belonged to socially conservative Protestant denominations. I also had colleagues who were very supportive of the controversial decision to send Canadian soldiers to Afghanistan. In fact, when I was there the president of the university, who is a relative of a famous Canadian social democrat, drove a car that was emblazoned with a yellow “Support Our Troops” ribbon, which in the Canadian context is a signal of relatively conservative political sentiments. I had a colleague who produced a large number of scholarly works that were informed by her Pentecostal faith and who was extremely popular with students, especially those who shared her religious beliefs.

I noticed that while many of the highly selective universities that teach affluent students have pretty much discontinued the teaching of military history as too macho and vaguely right-wing, military history is alive and well there. Laurentian even hired me—I’m neither a conservative nor a Conservative, but I’m not a standard woke leftist either. At dinner with colleagues at Laurentian, I once let slip that I supported capital punishment for murderers. I was feeling a bit provocative that day, so I added that I was particularly in favour of capital punishment in cases in which the murderer has harvestable organs that could be given to law-abiding citizens. (I was dining with some professor friends who I knew had pretty strong left-wing views about the justice system.) There was a brief moment of silence as my colleagues were shocked to hear a fellow academic support the death penalty, but then someone broke the ice by joking, “Hey, this isn’t &*%ing Queen’s University here—the president has a yellow ribbon on her car so you should fit in just fine.” Queen’s is a university that teaches disproportionately wealthy students and which is famous for virtue signalling and left-wing academics.

I would conclude this blog post by saying that I am saddened by the crude, callous, and inefficient way in which Laurentian appears to be making its transition from a bilingual French-English institution to one that is basically English. Academics whose first language is French have been fired without consideration of their ability to teach in English. Given that the correlation between being a native English-speaker and getting high student satisfaction scores is relatively weak, that seems like a dumb and cruel move.

I understand the underlying demographic reality: the proportion of Canadians whose first language is French has shrunk dramatically in my lifetime, and in the region in which Laurentian is located there are large numbers of students with French Canadian surnames who can’t speak a word of French and, in some cases, can’t communicate with their grandmothers. From the 1960s to the 1990s, Canadian politics was dominated by questions related to French-Canadian nationalism, and this university’s bilingual heritage needs to be understood with respect to that political context. Obviously, the Ontario of 2021 is a radically different society and the French-English tensions of the twentieth century seem somewhat quaint, as do the closely related tensions between Catholics and Protestants. So I suppose it was inevitable that Laurentian would eventually eliminate the courses that are taught in French, which have small numbers of students and which were being cross-subsidized by the rest of the university. It was also probably necessary for the university to eliminate the costly and anachronistic system of having separate federated universities for several Christian denominations. The importance of intra-Christian identities in Canadian society has faded in recent decades, so paying to have separate mini-universities for the Anglicans and the United Church etc. seems really anachronistic in 2021. So modernizing the university doubtless involves changing its culture so that it is better equipped to attract students from the very successful multicultural city of Toronto. However, it should have been done in a more efficient and compassionate fashion.





My Thoughts About Miller on Teaching History in Business Schools

6 04 2021

Is historical knowledge useful to business people? If so, what types of historical knowledge are most useful to decision-makers in finance? How certain can we be that history is really useful to practitioners?

Financial history has long been a major focus of research and teaching at the International Center for Finance (ICF) of the Yale School of Management. The ICF director, William Goetzmann, a finance prof who is the son of a historian, has published extensively on financial history. Scott C. Miller, a Yale ICF postdoctoral fellow in economic and business history, has published a very interesting and important blog post about the importance of teaching history in business schools. His post, which was shared today in a social media group for business historians, discusses the current state of business history teaching and research in leading US business schools. Miller’s focus is, quite rightly, on identifying the benefits to students of including more historical content in the curriculum of business schools. Miller argues, very plausibly, that this issue is very important not just to the individual students and their future job performance, but also to society as a whole. He observes that decisions about the MBA curriculum are highly consequential because they may change how people with really important jobs will make decisions: “it is largely business school graduates who will make the economic, financial, and business decisions that prepare the ground for massive societal change. As business schools train students to make these decisions, they have the duty to remind them of the implications of these decisions as well.”

Miller makes four separate yet related claims about the benefits of learning about history. I think that the cause of promoting business history in management schools will be served by subjecting each of his claims to a steel-man critique, which is the spirit of my reply to his blog post. I’ve tried to respond to each of his points below.

  1. It prevents bad, ad-hoc uses of history in decision making. Whether we acknowledge it or not, humans are historical animals. We base most of our decisions not on incoming data, but rather on the historical and cultural frameworks through which we filter that data. Since historical and cultural forces shape not only how we analyze information, but also what and how data is even collected, historically-minded analysts are much better positioned to not simply process the data they have, but understand what data they are missing and why they are missing it.

    On a practical level, teaching economic, financial, and business history well in business schools will help prevent the ultimate analytical error: fighting the last battle. Since humans intrinsically look to the past for guidance, they tend to find solutions in the past as well.

My thoughts: We are all familiar with the aphorism that generals are constantly fighting the last war. I suppose the desire to prevent this error helps to explain why staff colleges have invested so heavily in military history and why so many schools of policy analysis require their students to read the famous book by Neustadt and May called Thinking In Time: The Uses Of History For Decision Makers. I’m very confident that proper empirical research, using such methods as a randomized controlled trial, would confirm the hypothesis that learning history and then thinking historically about important decisions can indeed be useful in promoting the correct use of history. However, right now this claim is just an untested hypothesis, and we need to acknowledge that limitation in keeping with the principle of intellectual humility (the third learning outcome identified by Miller). Experimental research on the effects of learning history or thinking about history is still very under-developed, notwithstanding some important early papers by Gilovich (1981) and others. I have long argued that the case for teaching business history in North American management schools would be boosted by the publication of robustly scientific research results that support this hypothesis. Simply presenting this hypothesis as already proved beyond any shadow of a doubt may actually be counterproductive, as it may cause sceptical audience members (e.g., your standard finance prof) to ask “How do you know that?” We need iron-clad proof that teaching history is functional to overcome the opposition documented by Miller.

  • It promotes long-term thinking. While we are unlikely to break the tyranny of the quarterly report any time soon, teaching history forces business school students to think in the longue durée. Analysts who think in the long term are less susceptible to mistaking volatility spikes for the greater trend, and thus better structure investments and firms that are successful over 20, 50, and even 100 or more years. (My SOM colleague Paul Schmelzing’s work on the “suprasecular” decline of interest rates over the last 700 years is a perfect example of this.)

Once again, the argument that learning business history promotes long-term thinking in businesspeople remains an untested hypothesis, albeit one that is very plausible to me. Miller’s claim that teaching history promotes long-term thinking is certainly more plausible to me than the view that the most cost-efficient way to promote long-term thinking is to change how we write dates. Some readers will be aware that Stewart Brand and the Long Now Foundation have argued that changing the way we write out the date by adding a zero at the front (e.g., “02021”) would encourage long-term thinking and thus better decisions. The Long Now Foundation’s claim has never been tested, however.

Neither has anyone been able to test the theory that learning about financial history, the cycle of boom and bust, Tulipmania, 1929, and all that, makes a financial decision-maker behave in a less risky fashion.  A couple of years ago, I sought out funding to run some experiments with some collaborators to test some of the claims that have been made over the years about the impact of teaching history in business. Unfortunately, the funders didn’t give us the money to do the research and nobody has, to my knowledge, done similar research.

As I wrote in 2018, one of the common justifications for teaching history to future managers and decision-makers is that awareness of the past makes them better judges of risk. After the 2008 financial crisis, which revealed that many managers had incorrectly judged financial risks (via misplaced optimism about the prices of particular securities), there were calls for the sharing of more historical information with businesspeople. For instance, the Bloomberg business news service established its Echoes column, which showcased economic-historical research related to the 1929 stock market crash and other episodes in financial history. To date, however, nobody has rigorously tested the conjecture that promoting greater awareness of history would actually improve the behaviour of businesspeople on one or more measurable indicators.

Luckily for our purposes, there is an extensive body of experimental research in psychology, finance, and other fields on the determinants of individuals’ levels of financial risk aversion. Many of these experiments involve requiring subjects to participate in the Iowa Gambling Task. Some of this research looks at how fixed traits (e.g., gender) influence risk preferences, while other work examines how the administration of a hormone (e.g., an injection of testosterone) can change a given individual’s level of financial risk aversion (see Nave et al., 2017; Kusev et al., 2017). In such experiments, risk aversion is usually measured through behaviour in games played for small sums of money. For our immediate purposes, the most relevant area of research on financial risk aversion relates to cognitive priming. In psychological research, “cognitive priming” involves presenting subjects with images or words that trigger the recall of prior knowledge, which in turn has a measurable effect on many types of behaviour.

The question for us is: how does cognitive priming that surfaces an individual’s historical knowledge change their aversion to financial risk? Since we know from Gilovich (1981) that cognitive priming evoking memories of different historical wars (World War Two, Vietnam) changes how American individuals think about proposed military action by the United States,[1] there are strong a priori reasons to expect that reminding people about different episodes in financial history would change individuals’ appetites for financial risk. In the experiment I proposed in 2018, subjects would be randomly divided into two groups. One group would be presented with a short historical text about the financial history of the United States in the 1920s that would end with the 1929 stock market crash. The other group would be presented with a short historical text about a successful entrepreneurial firm (say Intel) that does not include references to ANY financial setbacks and crises. We would then compare the behaviour of the two groups on the Iowa Gambling Task, a computer game that measures appetite for financial risk. If there is a significant difference between the revealed risk aversion of the two groups, we would be able to confirm our hypothesis that historical knowledge changes how people perceive risk.
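For readers who like to see the logic of the proposed comparison spelled out, here is a minimal sketch of how the two groups’ Iowa Gambling Task results might be analysed. Everything in it is hypothetical: the group labels, sample sizes, and simulated scores are illustrative placeholders rather than data from any actual study, and the analysis shown is just one reasonable choice (a two-sample comparison of means).

```python
# Hypothetical illustration of the proposed between-groups comparison.
# The simulated numbers below are placeholders, not real experimental data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated Iowa Gambling Task "net scores" (advantageous minus
# disadvantageous deck selections) for the two randomly assigned groups.
crash_primed = rng.normal(loc=12.0, scale=8.0, size=60)    # group that read the 1929-crash text
success_primed = rng.normal(loc=6.0, scale=8.0, size=60)   # group that read the setback-free firm history

# Welch's two-sample t-test: is the difference in mean risk-taking significant?
t_stat, p_value = stats.ttest_ind(crash_primed, success_primed, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```

If the p-value in a real study fell below a pre-registered threshold, that would count as evidence for the hypothesis that historical priming shifts financial risk appetite; a null result would be equally informative for the debate described above.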

  • It fosters humility. At the beginning of my economic and financial history courses, students routinely begin questions with some variation of, “We know that these people were less sophisticated than us, so…” By this they tend to mean, “we have better data, more developed analytic theory, and better computational tools, so I know that we would not have made these mistakes.” Interestingly enough, however, this prelude always disappears by the end of the semester.

Would repeated cognitive priming with references to history, in either the classroom or the workplace, remind decision-makers of the past in a fashion that would promote intellectual humility? Anecdotally, we know that Warren Buffett has explained his decision to hang framed newspapers from the 1929 market crash in his office by saying that he wanted to surround himself with a reminder of the bad decisions taken by other investors.

  • It reminds students of the raw power to shape society that they will soon wield. I firmly believe that economic crises, not political or social trends, cause profound societal shifts. The Depression of the 1780’s, not independence from Great Britain, resulted in the U.S. Constitution.

Ok. Good point.


[1] This experiment (Gilovich, 1981) involved a population of US university students majoring in International Relations who were randomly divided into two groups. One group was exposed to a text that reminded them of Neville Chamberlain, the well-known appeaser of Nazi Germany. The other half of the population was exposed to texts designed to trigger the recall of knowledge about Lyndon Johnson, the President whose decision to escalate the Vietnam War is now generally perceived to have been a mistake. The subjects were then asked about a hypothetical situation in which the United States had the option to use military force against a non-democratic regime in a distant country. Subjects who were cognitively primed via the references to Neville Chamberlain were measurably more likely to support military intervention than those who had been subtly reminded of the Vietnam War via the references to Lyndon Johnson.





Some Thoughts on the Recent Symposium on Racial Justice, History, and Business Ethics

29 03 2021

On Friday, 26 March, I was honoured to be part of an online panel on Racial Justice, History, and Business Ethics organized by the Social Issues in Management Division of the Academy of Management. The panel, which is connected to a special issue of the Journal of Business Ethics (an FT50 journal) on racial injustice and business ethics, included me as well as Jennifer Johns (Bristol), Leon Prieto (Clayton State), and Simone Phipps (Middle Georgia State). My hosts included Paul T. Harper of the University of Pittsburgh's Katz School of Business and David Wasieleski, who is the Albert P. Viragh Professor of Business Ethics in the Palumbo-Donahue School of Business at Duquesne University. Anyway, here is a description of the session written by the organizers:

Panel will provide examples of the ways ahistorical methods and temporal frames expose and occlude the role of race in knowledge creation processes. The Atlantic Slave Trade as a context for understanding current management practices will be discussed as well as the unrecognized history of Black entrepreneurship in the U.S.

You can watch a video of the session here.

I found the session on Friday extremely stimulating and useful. I learnt a great deal from listening to the presentations of my co-panelists. I also got valuable feedback on my paper that will allow us to do a better job of preparing it for submission to the journal. A major theme of the conversation on Friday was slavery, both its historical legacies and the existence of slavery and slavery-like forms of exploitation in the present. In my presentation, I suggested that corporate involvement in crimes against humanity usually, although not in all cases, involves a company profiting from the mistreatment of individuals who have been Otherized. By Otherized, I mean depicted by a regime or a culture as inferior, sub-human, and less deserving of the rights enjoyed by members of the locally dominant ethno-racial group.

I prefaced my discussion of the involvement of companies in the historic crime of Black slavery by observing that while historic crimes by firms are, in theory, separate from discrimination against Otherized populations, the most prominent examples of firms having profited by participating in crimes against humanity involve crimes directed against Otherized populations (e.g., Jews in Nazi Germany and people of African descent in the British Empire and its offshoots). I would theorize that the worst types of criminal behaviour involve actions at the expense of marginalized groups because managers who are embedded in cultures that have already Otherized the group in question find it easier to justify their decisions to exploit that group.

It seems to me that this historical pattern is consistent with what we see in the present, particularly in current debates about global supply chains for commodities such as cotton.

Overall, the session was superb. I was pleased that it was diverse along so many dimensions: presenter demographic background, but also academic discipline, meta-theoretical orientation, and geography.





Racial Justice, History, and Business Ethics

25 03 2021

The summer of 2020 was a difficult one for the companies that once profited from African slave labour, as these firms, which include such prominent corporations as Citibank and Barclays, faced calls from Black Lives Matter and others to apologize and pay reparations for their involvement in the historic crime of slavery. The managers of these firms responded in strikingly different ways. About thirty extant US and UK firms with documented ties to pre-1865 Black slavery have faced criticism for their roles in what is now universally regarded as a terrible crime against humanity. Tomorrow, I'll present research that aims to explain why the companies accused of historic involvement in slavery have responded in such different ways. I'll be speaking at a webinar organized by the Academy of Management's Social Issues in Management Division. To attend the webinar, please register here.

I’m really looking forward to presenting my co-authored research and to hearing about the interesting research of my co-panelists.

AOM Social Issues in Management Division
Hosts: Paul T. Harper (Pittsburgh) & David Wasieleski (Duquesne)


Featuring:
Andrew Smith (Liverpool)
Jennifer Johns (Bristol)
Leon Prieto (Clayton State)
Simone Phipps (Middle Georgia State)

Panel will provide examples of the ways ahistorical methods and temporal frames expose and occlude the role of race in knowledge creation processes. The Atlantic Slave Trade as a context for understanding current management practices will be discussed as well as the unrecognized history of Black entrepreneurship in the U.S.
Lead Sponsor: Katz Graduate School of Business, University of Pittsburgh. Special thanks to the David Berg Centre for Ethics and Leadership for sponsoring the event.





Global0013. Benefits offered by historical explanation to statistical studies in strategic management

14 03 2021

Interesting webinar.

Business History Collective / Colectivo de Historia Empresarial

23/03/2021 16.00 UK

Register here

Presenters: Sandeep Pillai (Bocconi University), Brent Goldfarb, and David Kirsch (University of Maryland)
Chair: Adam Nix (De Montfort University)

Abstract:

We contribute to the literature on research methodologies in strategy research (CITE) and argue that historical explanation is essential to improving the internal validity, external validity, and objectivity of statistical reasoning. To enhance internal validity, the tools used by historians offer statistical reasoning explanatory virtues, visibility across time and levels of analysis, the ability to identify mechanisms, and the ability to test whether a proposed hypothesis is invariant. The explanatory practices followed by historians improve external validity because they provide readers with embedded generalizations from logically rigorous analytic narratives and contextualized thick descriptions, which readers can then use to determine whether the explanations are generalizable to contexts that interest them. Further, to improve objectivity, historical explanation complements statistical reasoning through source…
