Wednesday, December 07, 2016

Human Progress: (4) Life Shows More Equality of Opportunity

Five years ago, the "Occupy Wall Street" movement began when protesters settled into Zuccotti Park in New York City's Wall Street financial district.  Their slogan was "We Are the 99%"--to indicate their protest against the economic inequality that allows too much wealth to be concentrated in the top 1% of society. 

One of the protesters in the park held a poster that said: "Equality First! Liberty Later."

This suggests that equality and liberty are in conflict, and that fairness requires that equality take priority over liberty, that we sacrifice liberty to achieve equality.

Against this position, classical liberals argue that equality and liberty are compatible, as long as equality is rightly understood as equality of opportunity rather than equality of outcome.  According to this liberal view, inequality of outcome is good inequality if it arises from equality of opportunity.   By this standard, the liberal social order has achieved human progress if life shows more equality of opportunity than ever before.

Despite the general agreement in American political debate on the idea that government should secure equal liberty for all individuals, as expressed in the Declaration of Independence, Americans disagree about the best way to pursue that goal. 

Thomas Jefferson hoped that although previous regimes had promoted "an artificial aristocracy, founded on wealth and birth," American democracy would be ruled by "a natural aristocracy" grounded on "virtue and talents."  He thought that an educational system was needed that would allow the most talented few to rise to the top even if they were born poor and of low social status.  Democratic equality, therefore, would be an equality of opportunity that would give all people the liberty to develop their talents, so that the naturally best could rise to the top.

Abraham Lincoln conveyed this thought in his metaphor of life as a race.  The primary aim of popular government, he believed, was "to elevate the condition of men--to lift artificial weights from all shoulders--to clear the paths of laudable pursuit for all--to afford all, an unfettered start, and a fair chance, in the race of life."  This is the noble vision that elevates American political rhetoric.  It's the American Dream--a fair chance for all to get ahead in life.  And, of course, Lincoln himself--born in a log cabin--was the exemplification of this American story.

And yet Lincoln's image of the race of life suggests a possible conflict between equality and liberty.  The fairness of the race demands equality at the starting line but liberty in the running of the race.  The faster runners must be free to take the lead and leave the slower runners behind.  But how can we be sure that the slower runners are not hindered by "artificial weights"?  How many of the slower runners were raised in families that didn't train them to run fast?  Were they perhaps born to parents who had already fallen behind in the race?  Is it unfair for the fast runners to be free to give their children a head start?  Was it unfair, for example, that Fred Trump gave his son Donald a head start in becoming a real estate developer in New York, and that Donald could later give his children the same head start?

Does fairness require that we occasionally stop the race, bring everyone back to the starting line, and then start again?  If so, what does this mean?  Would we have to abolish the family, because children born into different families will not be equal at the starting line? 

What should we do for those who from birth suffer physical or mental disabilities that prevent them from running well?  And what should we do for those who accidentally injure themselves during the race?  Should the faster runners be forced to help those who are unfairly disadvantaged? 

President Lyndon Johnson answered yes to this question in 1965, when he announced the policy of "affirmative action" to ensure for black Americans not just equality of opportunity but equality of result in the race of life.  Does this mean that slower runners must have a head-start in the race?

On the one hand, those of us who stress the fairness of equality would want to protect the unfortunate from unfair competition.  On the other hand, those of us who stress the fairness of liberty would want to protect the freedom of the fastest runners to win the trophies.

If we were serious about achieving equality of outcome, even at the expense of liberty, would we have to embrace pure socialism, in which the private family and private property would be abolished, and the means of production would be collectively owned by government?  This is what Karl Marx proposed.  And this is what has been attempted in some utopian socialist communities, like the original kibbutzim in Israel.  (As I have indicated in some previous posts here and here, the kibbutzim have given up their attempts to abolish private property and private families.)

Max Roser's article on "Income Inequality" is a good survey of the empirical data on inequality in economic income.  But he does not distinguish inequality of outcome and inequality of opportunity.  And so he does not consider the possibility that some of the inequality of income that we see arises from equality of opportunity.

He employs the Gini index as a measure of the income distribution of a population.  The Gini index is a number from zero to 1, where zero represents a distribution that is perfectly equal--so that 10% of the population earn 10% of the total income, 20% earn 20%, and so on--and 1 represents the maximum of inequality--so that one person has all the income of a population.  Another measure of income distribution is to calculate the share of total income or wealth received by various strata of the population--the top 1%, the top 20%, and so on.
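To make the Gini calculation concrete, here is a minimal sketch in Python, using the standard mean-absolute-difference formula; the incomes are made up purely for illustration.

```python
# A minimal sketch of the Gini index, using the standard
# mean-absolute-difference formula.  The incomes are hypothetical.

def gini(incomes):
    """Return the Gini index: 0 is perfect equality; values approach 1
    as one person's share of total income approaches everything."""
    n = len(incomes)
    mean = sum(incomes) / n
    # Sum of absolute differences over all ordered pairs of incomes,
    # normalized by twice the squared population size times the mean.
    total_diff = sum(abs(x - y) for x in incomes for y in incomes)
    return total_diff / (2 * n * n * mean)

print(gini([25, 25, 25, 25]))   # 0.0  -- everyone earns the same
print(gini([0, 0, 0, 100]))     # 0.75 -- one person has all the income
```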

Despite the debates over these measurements and what they mean, some patterns do emerge rather clearly. In the richest countries, there was great economic inequality at the beginning of the 20th century.  In the USA, for example, before the Second World War, 18% of all the yearly income went to the richest 1%.  But then the share of the top 1% began to drop.  In 1980, in the USA, the richest 1% received 8% of the total income, much lower than the 18% before the war.  Then, however, inequality started to rise again.  By 2010, in the USA, the richest 1% received 17% of the total income, which was back up to the pre-war levels.  Other English-speaking countries (Canada, the UK, Ireland, and Australia) show the same U-shaped pattern: high inequality, then a drop down to the 1980s, and then rising again to the high levels of inequality seen at the beginning of the 20th century.

The pattern for much of continental Europe and Japan is a little different.  It's an L-shaped pattern: first high inequality, then a drop down to the 1980s, and then leveling off or only slightly rising over the past 25 years.  For example, in Sweden, the richest 1% in 1938 received 12% of the total yearly income, then in 1980, this was down to 4%, and in 2010, it was up slightly to 7%.  Sweden and the other Nordic countries (Iceland, Norway, Finland, and Denmark) have the lowest levels of economic inequality today, apparently because their policies of Social Democracy (or Democratic Socialism) favor a redistribution of wealth from the rich to the poor.

Those people who worry most about economic inequality--people like Thomas Piketty and Bernie Sanders--generally argue that the English-speaking countries should try to be more like the Nordic Social Democracies--using confiscatory progressive tax rates and social welfare programs to redistribute wealth from the rich to the poor and the middle class.  To escape from an unfair inequality, they argue, we need to move from liberalism to socialism. (I have written a series of posts on Piketty that begins here.)

And yet, there are at least three objections to this argument for socialist equality.  The first is that the empirical data for economic inequality does not necessarily show a lack of equal opportunity, because there can be a lot of mobility into and out of the top economic ranks of society.

In fact, there is a lot of evidence for such mobility.  Economists who study this have shown that over 50 percent of Americans will be in the top 10 percent of income-earners for at least one year in their lives.  Over 11 percent of Americans will be among the top 1 percent of income-earners (people making a minimum of $332,000 per year) for at least one year in their lives.  Ninety-four percent of the Americans who join the top 1 percent group will keep that status for only one year.

Moreover, the factors that explain higher household incomes among Americans are not fixed over a lifetime, and they are to some degree a matter of personal decisions, which means that people are not forced to remain in one income bracket for their whole lives.  American households with higher than average incomes tend to be households where the members are well-educated, in their prime earning years (between the ages of 35 and 64), working full-time, and are in stable marriages.  Households with lower than average incomes tend to be households where the members are less-educated, outside their prime earning years, unemployed or working only part-time, and they are likely to be unmarried.

A large part of the growth in economic inequality among Americans over the past 40 years has been a result of assortative mating:  college students marry people they have met in college, and then form two-income households with the higher income levels correlated with higher education.  These "power couples" are then in a position to help their children become successful, because their children will inherit the good genes of their parents as well as the good rearing environments their parents provide.  Since high educational achievement is correlated with high IQ, and since the higher paying jobs in a highly technological and mentally challenging economy require higher intelligence, what we see here is the emergence of what Charles Murray has called a "cognitive elite."  So if we really wanted to reduce economic inequality, we would have to prohibit intelligent and well-educated people from marrying other intelligent and well-educated people.

Consequently, people can raise their chances of becoming wealthy by getting a good education, by getting married to other well-educated people, by getting lots of professional work experience, and by forming two-income households.  When people do this, they create economic inequality.  But isn't this good inequality?

That supports the second objection to socialist equality--inequality of outcome is not necessarily unfair if it arises from an inequality of skills, education, and intelligence.  As Roser shows in his article on "The Skill Premium," much of the growing inequality of income over the past 30 years can be explained as the growing gap between skilled and highly educated people and those who are unskilled and less educated.  The economic effects of technology and globalization have increased this gap.

The third objection to the argument for socialist equality as achieved by the Nordic Social Democracies is that these are not really socialist regimes but capitalist welfare states, and consequently they fail to achieve the absolute equality of outcome that would require a pure socialism that most human beings find undesirable.  As I have indicated in a previous post, the Nordic Social Democracies are highly ranked on the indexes for "economic freedom" compiled by the Heritage Foundation and the Fraser Institute.  This confirms Rosa Luxemburg's complaint that Eduard Bernstein's proposal for social democracy was actually a "variety of liberalism" and not true socialism. 

While it is true that the Nordic countries have a lower level of economic inequality than does the United States, it is also true that those countries have not achieved absolute equality.  So, for example, in 2007, the richest 1% in the United States had 33.8% of the total wealth, while the richest 1% in Sweden had 18.8%.

Only once have democratic socialists succeeded in creating real socialism with real socialist equality--in the kibbutz.  (Bernie Sanders volunteered on an Israeli kibbutz in the 1960s.) But once the kibbutzniks had created socialist equality, they chose democratically to abolish it, because most of them found life in a perfectly egalitarian community unbearable.  (I have written about the kibbutzim and other socialist communes in Darwinian Natural Right, pp. 92-101.)

The kibbutzniks were the pioneers of Jewish resettlement in Palestine.  The first kibbutz, Deganya, was founded in 1910.  By 1980, there were over 130,000 people in over 270 kibbutzim in Israel.

The kibbutzim practiced pure socialism that came as close to absolute equality as any human community has ever achieved.  The members rotated jobs.  They took all of their meals in a common dining hall.  They had no private property.  They did not even own the clothing they wore, which was provided for them by the community.  When children were born, they were sent to a children's house to be cared for and educated by the community.  Children were allowed to visit their parents only a few hours in the afternoon.  This was understood as necessary for the sexual equality of men and women, because women were free from the burden of caring for their children. All decisions about the organization of the community were made by consensus in a general assembly, usually held weekly, where every member participated equally.  The kibbutzniks saw themselves as putting into practice the Marxist principle of "from each according to his ability, to each according to his need."  They also seemed to be following Plato's recommendation in The Republic that the Guardians in the just city should not have private property or private families, because they should care for the common good of the whole community rather than their selfish private interests.

But then, beginning in the 1950s and 1960s, young mothers began to complain that they did not have enough time with their children.  They wanted at least to be able to put their children to sleep at night.  As the children matured to adulthood, many of them left the kibbutz because they didn't like the communal childrearing.  Beginning in the 1970s, many of the kibbutzim decided to allow family sleeping rather than collective sleeping.  Socialists complained that allowing children to live with their parents would lead to the privatization of many things and inequality.

The kibbutzniks wanted not only private families but also private property.  Some of them returning from serving in the British army in World War Two came back with teakettles.  Allowing some people to own private teakettles, and to drink tea privately in their homes rather than in the communal dining room, violated socialist equality.  Then some people wanted to own their own clothes and to select their clothing.  They also wanted to own their homes.

The kibbutzim had to abandon job rotation to keep skilled people in their specialized jobs.  The most skilled workers wanted extra pay for their work, and they complained about those people who didn't work hard but received equal pay.  By the end of the 1990s, many kibbutzim were assigning wages according to skill level.

Henry Near has written the most comprehensive history of the kibbutzim--The Kibbutz Movement: A History (2 volumes).  He concluded:
"During most of the history of the kibbutz movement social change was justified (or resisted) on grounds which stemmed from, or were compatible with, a socialist world-view. From about 1980 onwards, however, the ideological background changed. . . . The improvisations were still ideologized, but the ideology was no longer that of socialism, but of late twentieth-century capitalism" (vol. 2, pp. 357-58).
The most fervently socialist of the kibbutzniks complained bitterly about this, and some of them lamented that they had tried to change human nature, but they had failed.

In a liberal social order, people are free to form egalitarian socialist communes like the kibbutz, as long as they are voluntary.  Indeed, in the history of the United States, there have been hundreds of such socialist communes, beginning with Robert Owen's New Harmony in Indiana (1825-1827).  Most of them lasted no longer than a few years.  Some of those that were animated by some religious faith lasted a few decades.   This shows that people devoted to socialist equality can form small communities that succeed for some time, and in the case of the kibbutzim, they succeeded for at least seventy years.  But eventually they must fail, because most human beings will find that the conditions for socialist equality--abolishing private families and private property--frustrate their deepest natural desires.

By contrast, liberal equality--equality of opportunity rather than equality of outcome--is more achievable because it does not require changing human nature.

There is one major objection to my argument here.  Even if equality of opportunity and equality of outcome are different, aren't they connected, in that countries with a higher degree of equality of outcome tend to be countries with a higher degree of equality of opportunity?  That's the point of what some economists have called "The Great Gatsby Curve."  (See Miles Corak, "Income Inequality, Equality of Opportunity, and Intergenerational Mobility," Journal of Economic Perspectives, 27 [2013]: 79-102.)

We might not regard income inequality as unfair as long as there is social mobility, so that children born into poor families have the opportunity to enter the high income levels as adults, and the children born into rich families do not always inherit the high income levels of their parents. 

This is how Abraham Lincoln interpreted equality in the race of life.  Speaking to Union soldiers in the Civil War who had come to meet him at the White House, he explained to them what they were fighting for:
"I happen temporarily to occupy this big White House. I am a living witness that any one of your children may look to come here as my father's child has.  It is in order that each of you here may have through this free government which we have enjoyed, an open field and a fair chance for your industry, enterprise, and intelligence, that you may all have equal privileges in the race of life, with all its desirable human aspirations. It is for this the struggle should be maintained" (August 22, 1864).
But the Great Gatsby Curve suggests that this equal opportunity in the race of life is no longer true in the United States and other rich countries (particularly the English-speaking countries), because income inequality lowers social mobility by shaping opportunity, so that the children of poor parents remain poor as adults, and the children of rich parents remain rich as adults.  The Great Gatsby Curve plots countries with an x-axis that is the Gini index of inequality and a y-axis that measures intergenerational immobility as the probability that people have the same income level as their parents.  The curve slopes upward, indicating that countries like the United States and the United Kingdom with high income inequality also have high intergenerational immobility, while countries like Norway and Finland with low income inequality also have low intergenerational immobility.

There are various factors to explain how parents pass on their economic status to their children.  Children might inherit genetic propensities from their parents (intelligence, talents, and personality traits) that make it more or less likely that they will enter high-paying occupations.  Rich parents might instill in their children the habits of self-discipline, ambitious striving, and education that are required for economic success.  Rich parents might help their children enter the expensive and highly selective schools that train their graduates for economic success.  Rich parents might also use their social connections to help their children find the best jobs for economic success.

Some critics of the Great Gatsby Curve say that it is being misinterpreted.  The differences between the United States and Norway, for example, might be due to the fact that the United States is a large and culturally diverse country, while Norway is a small and culturally homogeneous country, so that the inequality in the United States is actually mostly inequality between different cultural groups.

But this would not explain the differences between the United States and Canada, both of which are large and culturally diverse countries.  Canada has both a lower level of inequality than the United States, and a lower level of intergenerational immobility. 

Miles Corak has compared the income rankings of fathers and sons in the United States and Canada, looking at sons born to top decile fathers (the top 10% in earnings) and sons born to bottom decile fathers (the bottom 10% in earnings).  In the United States, there is a 25% probability that the son of the top decile father will remain in the top decile.  In Canada, the probability is about 17%.  In the United States, there is a 22% probability that the son of the bottom decile father will remain in the bottom decile.  In Canada, the probability is 16%.  In both the United States and Canada, the son of the bottom decile father has about a 7% probability of rising to the top decile.  More than half of sons raised by top decile American fathers fall no further than the eighth decile, and about half of those raised by bottom decile fathers rise no further than the third decile.
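These decile figures are easier to compare side by side.  Here is a minimal sketch that simply tabulates the probabilities quoted in the paragraph above, rounded as given:

```python
# A minimal sketch tabulating Corak's father-to-son decile transition
# probabilities quoted above, for the United States and Canada.

mobility = [
    # (transition,                          USA,  Canada)
    ("top-decile son stays in top decile",  0.25, 0.17),
    ("bottom-decile son stays in bottom",   0.22, 0.16),
    ("bottom-decile son rises to top",      0.07, 0.07),
]

print(f"{'transition':38} {'USA':>5} {'Canada':>7}")
for transition, usa, canada in mobility:
    print(f"{transition:38} {usa:>5.0%} {canada:>7.0%}")
```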

But notice what this means.  There is some social mobility in all of these rich liberal countries.  But they differ in the degree of social mobility.  The opportunities at the starting line of the race of life are not perfectly equal, because parents and the environments that they create influence the opportunities for their children.  It is harder for the children of poor parents to reach the highest level of economic success than it is for the children of rich parents.  But notice that about 50% of the sons of fathers at the bottom of the economic ladder can enter the lower middle class level.

Remember also that while at least 95% of all human beings before 1800 lived in grinding poverty--close to mere subsistence--almost all Americans, Canadians, and others in the richest countries today live a richer life than most human beings have ever lived before, and that greater wealth brings greater opportunity for living a flourishing life.

Thursday, December 01, 2016

Human Progress: (3) Life is Richer and Less Impoverished


In considering the empirical data for human progress through the Liberal Enlightenment, we tend to look for statistical data.  But statistical data often does not convey any clear image of how people live--of how, for example, the lives of poor people differ from the lives of rich people around the world.  As is indicated in this video of a lecture by Anna Rosling Ronnlund, it is possible to use photos as data, as she has done at Dollar Street, to understand how other people live their daily lives, and how their poverty or wealth makes a difference.  Photos as data illuminate the statistical data to help us understand what it really means for human life to progress from an impoverished world to a rich world. Max Roser includes many different kinds of data in his article on "World Poverty."

Economic historians have surveyed the data for "efflorescences" in premodern history--spurts of growth in wealth and population--that include the High Middle Ages in northwestern Europe (1150-1250), Golden Age Holland (1570-1670), High Qing China (1680-1780), and classical Greece (5th and 4th centuries BCE).  But in none of these efflorescences does one see the self-sustaining and accelerating explosion in economic growth that began in Great Britain around 1850, which Deirdre McCloskey calls The Great Enrichment.  Although Northian institutionalism can explain these efflorescences of economic growth in per capita income of up to 1% per year, it cannot explain the unprecedented growth in the past two centuries, in which liberal societies have seen increases in average income from 1800 to the present of over 1,000 to 3,000 percent.  To explain that, McCloskey argues, we need to see the crucial rhetorical change that led to the great Bourgeois Revaluation that recognized the moral and intellectual virtues of the bourgeois commercial society.

Some of the best work in generating economic and demographic data for the world economy from Roman times (1 AD) to the present has been done by Angus Maddison and (since Maddison's death in 2010) his colleagues at the Maddison Project.  Maddison assembled detailed estimates of GDP, population, and GDP per capita for the world economy over the past two thousand years.  Some of these estimates were drawn from national statistical offices or international agencies like the United Nations.  Some were drawn from historians studying various kinds of data (such as, for example, the proceeds of certain taxes).  Some of Maddison's estimates, particularly for ancient history, were--he admitted--"conjectures" or "guestimates."  His justification was that even highly conjectural statistics were desirable, because they would invite scholarly criticism from people offering better estimates.  In fact, the scholars at the Maddison Project have been revising his estimates based on new research.

Maddison expressed his estimates in 1990 "international dollars"--adjusted for price changes over time and for price differences between countries.  He then charted the history of GDP per capita and the rate of growth in GDP per capita for countries around the world from 1 AD to 2003.  Roser has some of these charts in his article on "World Growth."

Here, for example, are the numbers for the average GDP per capita for 12 European countries from 1 AD to 2003: 599 (1 AD), 425 (1000), 798 (1500), 907 (1600), 1,032 (1700), 1,243 (1820), 2,087 (1870), 3,688 (1913), 5,018 (1950), 12,157 (1973), and 20,597 (2003). 

Here are the numbers for the Netherlands: 425 (1 AD), 425 (1000), 761 (1500), 1,381 (1600), 2,130 (1700), 1,838 (1820), 2,757 (1870), 4,049 (1913), 5,996 (1950), 13,082 (1973), 21,480 (2003). 

Here are the numbers for the UK: 400 (1 AD), 400 (1000), 714 (1500), 974 (1600), 1,250 (1700), 1,706 (1820), 3,190 (1870), 4,921 (1913), 6,939 (1950), 12,025 (1973), 21,310 (2003). 

Here are the numbers for the USA: 400 (1 AD), 400 (1000), 400 (1500), 400 (1600), 527 (1700), 1,257 (1820), 2,445 (1870), 5,301 (1913), 9,561 (1950), 16,689 (1973), and 29,037 (2003).

Here are the numbers for the average rate of growth in GDP per capita for 12 European countries: -0.03 per cent (1-1000 AD), 0.13 (1000-1500), 0.14 (1500-1820), 1.04 (1820-1870), 1.33 (1870-1913), 0.84 (1913-1950), 3.92 (1950-1973), and 1.77 (1973-2003). 

Here are the numbers for the Netherlands: 0.00 (1-1000 AD), 0.12 (1000-1500), 0.28 (1500-1820), 0.81 (1820-1870), 0.90 (1870-1913), 1.07 (1913-50), 3.45 (1950-1973), 1.67 (1973-2003).

Here are the numbers for the UK: 0.00 (1-1000 AD), 0.12 (1000-1500), 0.27 (1500-1820), 1.26 (1820-1870), 1.01 (1870-1913), 0.93 (1913-1950), 2.42 (1950-1973), 1.93 (1973-2003).

Here are the numbers for the USA: 0.00 (1-1000 AD), 0.00 (1000-1500), 0.36 (1500-1820), 1.34 (1820-1870), 1.82 (1870-1913), 1.61 (1913-1950), 2.45 (1950-1973), 1.86 (1973-2003).
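These growth rates follow from the per-capita levels by the compound-annual-growth formula.  A minimal sketch in Python checks two of Maddison's figures quoted above:

```python
# A minimal sketch checking Maddison's reported growth rates against
# his GDP-per-capita levels, using the compound annual growth rate.

def cagr(start, end, years):
    """Average annual growth rate, in percent per year."""
    return ((end / start) ** (1.0 / years) - 1.0) * 100

# UK, 1820-1870: 1,706 -> 3,190 international dollars over 50 years.
print(f"UK  1820-1870: {cagr(1706, 3190, 50):.2f}% per year")   # ~1.26
# USA, 1820-1870: 1,257 -> 2,445 international dollars over 50 years.
print(f"USA 1820-1870: {cagr(1257, 2445, 50):.2f}% per year")   # ~1.34
```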

These and other numbers support Maddison's conclusions about seven phases in the development of the world economy over the past two thousand years.

First, there was a decline in Europe from ancient Rome in 1 AD to 1000 AD.  During this period, the Islamic world and China had higher economic development than Europe. 

Second, there was a gradual growth in the European economy from 1000 to 1820, which accelerated in the merchant capitalist epoch, 1500-1820.  In the 15th century, Europe surpassed China in GDP per capita for the first time.  In the 17th century, the Dutch Republic emerged as the most prosperous country in human history up to that time.  Following the lead of the Dutch, Great Britain in the 18th century became the most prosperous country.

Third, the initial phase of the modern capitalist epoch in 1820-1870 brought higher growth rates in Europe and in the "western offshoots" of Europe (USA, Canada, Australia, and New Zealand).

Fourth, the "liberal international order" based on global free trade from 1870 to 1913 brought even higher growth rates for Europe and the western offshoots.

Fifth, from 1913 to 1950, the two world wars and the global depression brought a slowing of the growth rate associated with high trade barriers and illiberal regimes (like the Soviet Union and Nazi Germany).  Notice, however, that growth continued during this period, although the rate of growth slowed.

Sixth, the "golden age" of liberal free trade in 1950-1973 brought the highest global growth rates in all of human history.

Finally, in the "neo-liberal order" of 1973-2003, the rate of global growth slowed from the golden age, but even so, the rate of economic growth during this most recent period has been the second highest in human history.

Relying on such statistics for GDP per capita as a measurement for economic growth is open to objections.  One objection is that many of these numbers are pure guesswork--particularly for the distant past.  So, for example, Maddison stipulates that the GDP per capita for Great Britain in 1000 was 400 dollars, because he assumes that most people were living at the subsistence level, and that 400 in 1990 international dollars supports a subsistence life.  But it's not clear that this is anything more than an arbitrary assumption.  Maddison's response to this objection was that he always specified the reasoning and sources for his estimates, and that other scholars could challenge and revise his numbers.  Indeed, those contributing to the Maddison Project have done this.  So, for example, while Maddison estimated an average income per capita of 771 dollars for western Europe in 1500, people with the Maddison Project now think that current research supports a higher estimate of 1,200 dollars or more.

Another objection is that GDP statistics underestimate economic growth because they do not measure the improvements in the goods and services available to consumers, so that what counts as wealth or poverty changes over time.  Matt Ridley writes: "Today, of Americans officially designated as 'poor,' 99 per cent have electricity, running water, flush toilets, and a refrigerator; 95 per cent have a television, 88 per cent a telephone; 71 per cent a car, and 70 per cent air conditioning. Cornelius Vanderbilt (the richest man in the world in the mid 1800s) had none of these" (Rational Optimist, 17).

In his article on "Technological Progress," Max Roser makes this point by noting the exponential growth in technological progress, particularly in computational technology.  Extending Moore's law--that the number of transistors on integrated circuits doubles approximately every two years--Ray Kurzweil has shown the exponential growth of computing (calculations per second per $1,000) for 110 years, so that there is an exponentially decreasing price for a given product quality over 110 years.  Consequently, Maddison's statistics for growing GDP per capita greatly underestimate the true growth in wealth for the average person.  Today, many poor people can afford to buy computing power that was previously not available to even rich people.
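The scale of that exponential growth is easy to check with a little arithmetic.  A minimal sketch, assuming an idealized constant two-year doubling over Kurzweil's 110-year span:

```python
# A minimal sketch of the Moore's-law arithmetic: if the computing power
# available per dollar doubles every two years, how much does it improve
# over 110 years?  (An idealized constant doubling rate is assumed.)

doubling_period_years = 2
horizon_years = 110

doublings = horizon_years / doubling_period_years  # 55 doublings
improvement = 2 ** doublings

print(f"{doublings:.0f} doublings -> a factor of about {improvement:.1e}")
# 55 doublings -> a factor of about 3.6e+16
```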

Disputes over the statistical measurement of wealth have contributed to a split between Malthusian and Smithian interpretations of what happened in the merchant capitalist epoch (1500-1820).  Adam Smith in 1776, in The Wealth of Nations, saw some economic progress since 1500.  He argued that the discovery of the Americas and the southern route to Asia had created international trade across the entire globe for the first time in human history, which opened up opportunities for international exchange and specialization in the division of labor.  He thought that these opportunities had not yet been fully developed because of the mercantilist restrictions on trade.  But he foresaw the possibility that these restrictions could be dropped, bringing accelerated economic growth, if he was persuasive in his argument for liberalism--for "the liberal system of free exportation and free importation" (Wealth of Nations, Liberty Fund, 538) and "allowing every man to pursue his own interest his own way, upon the liberal plan of equality, liberty, and justice" (664).  (This is the "liberal system" that Donald Trump and his Alt-Right supporters say they want to overturn in favor of neo-mercantilism and ethnic nationalism.)

Although Smith did not try to precisely quantify economic growth, he did rank countries in descending order of economic development: the Netherlands, England, France, the North American colonies, Spanish America, China, Bengal, and Africa. Those countries ranking at the top were those that had more nearly approached the institutions and norms of free trade and individual freedom.  Maddison's historical statistics support this Smithian account of growth performance from 1500 to 1800.

In contrast to Smith, Thomas Malthus in 1798, in An Essay on the Principle of Population, argued that the scarcity of natural resources limits economic growth for all animals, including human beings.  As the material conditions of life improve, the fertility rate increases, while the death rate decreases, which produces an increase in population.  But with increasing population, the competitive struggle for scarce resources produces a decline in material conditions of life, which eventually forces a collapse in population.  So, Malthus insisted, in the long run births must equal deaths.

Malthusian economists--like Gregory Clark in A Farewell to Alms--have argued that human beings have been so trapped in this logic of the Malthusian economy, like all other animals, that there was essentially no sustained economic growth prior to 1800.  Clark writes: "Thus the average person in the world of 1800 was no better off than the average person of 100,000 BC.  Indeed in 1800 the bulk of the world's population was poorer than their remote ancestors.  The lucky denizens of wealthy societies such as eighteenth-century England or the Netherlands managed a material lifestyle equivalent to that of the Stone Age.  But the vast swath of humanity in East and South Asia, particularly in China and Japan, eked out a living under conditions probably significantly poorer than those of cavemen" (1).  So in 1800, the economic life of the Dutch and the English was no better than that of the Stone Age cavemen!

But despite this disagreement between the Malthusians and the Smithians over economic development before 1800, they are agreed that after 1800 there was a stunning increase in wealth and decrease in poverty, beginning in northwestern Europe, that was unprecedented in human history.  They agree that the statistical data for economic growth over the last two hundred years shows that life has become richer and less impoverished.

Some of my previous posts on the Great Enrichment and the debate over how best to explain it can be found here, here, here, and here.

Tuesday, November 22, 2016

Human Progress: (2) Life is Healthier

Good health is a precondition for a good life.  If we are crushed by disease, injury, or undernourishment, we cannot live a happy and flourishing life.  Throughout most of human history, most human beings were disabled in their lives or died prematurely from poor health.  Although this is still true today for many people around the world, over the past two hundred years, human life has become healthier on average than it ever was before 1800.  This has been caused by the increasing freedom, knowledge, and technology coming from the Liberal Enlightenment. (See Max Roser's survey of the data for "Global Health.")

One reminder of the catastrophic effects of epidemic diseases in human history is Thucydides' account of the plague in Athens during the Peloponnesian War.

The Peloponnesian War between Athens and Sparta and their allied cities began in the summer of 431 BCE. Athens was led politically and militarily by Pericles. In the winter of 431, he delivered the Funeral Oration for those Athenians who died in the first year of the war. As reported by Thucydides, in his history of the war, Pericles' speech was a celebration of the power and the virtues of the Athenians as shaped by the freedom they enjoyed in their democracy. 

As I have indicated in a previous post, ancient Athens manifested some of the economic, social, and political success of the Liberal Enlightenment, although it did not achieve the uniquely self-sustained and accelerating growth of the Great Enrichment in northwestern Europe and North America at the end of the 19th century.

In the first days of summer in 430 BCE, the Spartans and their allies invaded Attica once again, initiating the second year of the war.  But Thucydides passes over this quickly in one sentence, so that he can turn immediately to a careful and dramatic description of the plague that began to appear in Athens that summer (2.47-58). The effect is jolting for readers: Pericles' beautiful funeral speech is followed by an ugly account of a plague in which bodies were left unburied.  Readers might see that the implied question here is the human meaning of death: death in war can be glorious, but death in a plague is not.

The plague killed Pericles, and for that reason alone it might have turned the course of the war against Athens.  Thucydides himself had the disease, but he was one of the lucky ones who recovered, which gave him immunity and allowed him to study the disease in others. 

His scientific attitude in his careful observation and recording of the disease might show the influence of Hippocrates, who stressed the importance of recording clinical histories of diseases and of looking for natural causes rather than superstitiously assuming that there are divine causes.  Hippocratic medical science was limited, however, by the ancient Greek taboo against the dissection and autopsy of human bodies, so that their inferences about human anatomy came from the dissection of other animals.  It was also limited by the absence of microscopes for seeing microorganisms, so that the bacterial and viral causes of disease were invisible.

Thucydides says that physicians were completely ignorant of the causes of the disease, and they knew no way to treat it. In fact, the physicians commonly died from the disease because they had contact with the sick.  Not only did all human arts fail to stop or slow the disease, even supplications in the temples of the gods and divinations failed, and eventually people stopped appealing to the gods.

Thucydides traces the path of the plague from sub-Saharan Africa to Egypt and Libya and then through Persia to Athens.  Leaving to others any speculations about the causes of the disease, he proposes only to lay out the symptoms of the disease so that it can be recognized if it ever breaks out again.

The mortality rate was high, and victims generally died on the seventh or eighth day after first contracting the disease.  But those who survived were protected from reinfection.

People in good health suddenly felt headaches, and they had inflammation and redness of the eyes, bleeding from the mouth, small pustules and ulcers over the body, vomiting, diarrhea, stomach pain, violent spasms, and unquenchable thirst.  They were never able to rest or sleep. Many lost their fingers, their toes, and their genitals from the violent swelling and ulceration.  Some lost their memory, so that even if they survived, they did not know themselves or their friends.

People died alone, because their family and friends were afraid to care for them, for fear of contracting the disease.  Those few who were good enough to care for the victims often died as a result.  The most caring and compassionate people were those who had recovered from the disease, and so they knew that they could care for the sick without fear of being attacked by the disease.

Bodies were thrown into piles to be burned without ceremony.  Many bodies lay unburied.

When the sick thought they were dying, they became utterly lawless: since they saw themselves as already under a sentence of death, they had no fear of either human law or divine law.  They ceased worshipping the gods, because they saw no benefit in this.

Some people thought this plague was the fulfillment of ancient prophecies and oracles foreseeing that such a pestilence would be inflicted on the Athenians to give victory to the Spartans.  But Thucydides was skeptical about this.

Thucydides says that the plague was "too much for human nature to endure" and "stronger than reason" or "beyond rational explanation" (kreisson logou).

The dark pessimistic mood of this description of the plague in Athens was recreated by Lucretius at the end of his De Rerum Natura, where he recounted the story of the Athenian plague as told by Thucydides.  Many readers have found it strange that Lucretius chose to end his book this way, with a sad rather than a happy ending, because the argument of the book is that the Epicurean philosophic teaching allows us to avoid any fear of death that would ruin our happiness in life.  It is odd, then, that Lucretius does not suggest that Epicurus or an Epicurean would have withstood the horrible circumstances of the plague any better than anyone else. 

Lucretius presents the plague at the end of his book as if it were the end of the world caused by natural causes.  As I have indicated in previous posts (with links here), Lucretius taught that since the cosmos was not designed by providential gods who care for human beings, the cosmic conditions necessary for human life are not eternal, and thus the human world must someday come to an end, and human life on Earth will be extinguished. For Leo Strauss and the Straussians, this is "the most terrible truth."

While Thucydides had no rational explanation for the plague, modern historians and scientists have offered a wide variety of diagnoses--epidemic typhus, anthrax, typhoid fever, bubonic plague, smallpox, measles, or toxic shock syndrome.  Most recently, some researchers have argued that the clinical and epidemiologic features of the disease as described by Thucydides conform best to Ebola, which was first recognized in humans in 1976, and which appeared recently in an outbreak in sub-Saharan Africa in 2014-2016. (See Powel Kazanjian, "Ebola in Antiquity?", Clinical Infectious Diseases 61 [2015]: 963-68.)  Ebola is a deadly viral disease that kills about 50% of those who suffer from it.  The ancient Greeks could not have understood such a disease since the virus cannot be seen with the naked eye. But while modern science can explain the disease, there is so far no known cure.  Understanding the disease and how it spreads through contact with bodily fluids does at least allow for containing it, and the recent epidemic was brought under control by the spring of this year.

Other epidemic diseases that have ravaged human life throughout history have been brought under control or even largely eliminated through modern scientific technology. For example, the bubonic plague is now understood as caused by the bacterium Yersinia pestis, which is transmitted by fleas carried by rodents.  Originating in China, the bubonic plague was responsible for three long-lasting epidemics in Europe: the Justinian Plague (the 6th through the 8th centuries), the Black Plague (from the mid-14th century to the Great Plague of London in 1665), and the third pandemic at the end of the 19th century.  The Black Plague killed 30%-60% of Europe's total population.  Although it could still become a major health threat, the bubonic plague has been controlled by insecticides, antibiotics, and a plague vaccine.  Other epidemic diseases that have claimed hundreds of millions of lives over human history have been completely eradicated, such as smallpox, or eliminated from large parts of the world, such as malaria.

The Liberal Enlightenment has promoted the knowledge and the technology that has made this human progress in health possible.  It has also promoted the statistical knowledge that makes it possible to precisely measure this progress.

As Max Roser indicates in his article on the "Burden of Disease," the Global Burden of Disease Project (GBD) of the Institute for Health Metrics and Evaluation measures the Disability Adjusted Life Years (DALYs) lost per 100,000 people.  This is the sum of years of potential human life and flourishing lost due to premature mortality and the years of productive life lost due to disability. 
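In the GBD's terms, DALYs are the sum of years of life lost to premature death and years lived with disability, weighted by severity.  A minimal sketch of that arithmetic, with numbers that are entirely hypothetical:

```python
# A minimal sketch of the DALY arithmetic: DALYs = YLL + YLD.
# All numbers below are hypothetical, for illustration only.

def dalys(deaths, years_lost_per_death, cases, disability_weight, years_disabled):
    yll = deaths * years_lost_per_death               # years of life lost
    yld = cases * disability_weight * years_disabled  # years lived with disability
    return yll + yld

# Hypothetical disease: 100 deaths, each cutting life short by 30 years,
# plus 1,000 survivors living 5 years at a disability weight of 0.2.
print(dalys(deaths=100, years_lost_per_death=30,
            cases=1000, disability_weight=0.2, years_disabled=5))  # 4000.0
```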

So, for example, the DALYs for Ebola in 2015 were zero for almost all nations, but they were 2,734.33 for Sierra Leone, 1,424.39 for Liberia, and 432.91 for Guinea.  The reduction in the damage from typhoid fever over the past 25 years can be measured by the DALYs for this disease.  In 1990, the DALYs for India were 928.45 and for Burkina Faso 1,232.25.  In 2015, the DALYs were 436.68 for India and 587.97 for Burkina Faso.  The DALYs for the United States, Great Britain, and many other countries were almost zero.

Friday, November 18, 2016

Human Progress: (1) More Lives and Longer Lives

If evolutionary success is measured by high rates of survival and reproduction, leading to a growing population, then the human species has been amazingly successful over the last two hundred years.  (See Max Roser's article on "World Population Growth.")

Some historians have estimated that in 10,000 BCE, the world population was 2.4 million.  By 1000 CE, it was 295 million.  By 1800, it was 900 million.  So, for thousands of years, the human population grew, but very slowly.  The annual growth rate was probably never more than 0.5%. 

But after 1800, the annual growth rate increased to 0.8% in 1900 and to 2.2% in 1962, which was the highest rate of growth in human history.  After 1962, population has continued to grow, but at a declining rate.  Population has grown from 1.5 billion in 1900 to 6.1 billion in 2000, and then to 7.5 billion at the end of 2016.
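The average rate implied by those population figures can be checked with the same compound-growth arithmetic used for incomes.  A minimal sketch:

```python
# A minimal sketch of the implied average annual growth rate of world
# population over the 20th century: 1.5 billion (1900) to 6.1 billion (2000).

rate = (6.1 / 1.5) ** (1.0 / 100) - 1.0
print(f"about {rate:.2%} per year on average")  # ~1.41%
# Below the 2.2% peak of 1962: growth was slower early in the century
# and has been declining again since the 1960s.
```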

Growth in population depends on the combination of two factors--the rate of fertility and the rate of mortality.  Prior to 1800, the rate of fertility was often high, but the rate of mortality was also high.  Women gave birth to many children with the expectation that only a few would survive to adulthood.  In the 19th century, beginning in northwestern Europe and North America, fertility remained high, but mortality declined, because improved standards of health and sanitation based on improved knowledge in medical science and public health lowered the rate of infant mortality and lengthened life expectancy.  Consequently, population began to grow faster than ever before in human history. No country in the world today has a lower life expectancy than the countries with the highest life expectancies in 1800.

Beginning in the first half of the 20th century, the most developed nations began to show a drop in the rate of fertility.  Social scientists have called this the "demographic transition."  In the more developed countries, women delay the age of their first pregnancy, and they choose to have fewer children.  As parents invest more in the education of their children, and as women have more opportunities for investing in their careers outside the home, parents choose to have fewer children.  By the 1960s, some countries saw fertility rates fall below replacement levels (less than 2.1 children per woman), which brought a decline in population along with an ageing of the population. (See my previous post on the demographic transition.)

In recent years, however, there has been some evidence that as societies move into the very highest levels of human development--as measured by long life expectancy, great wealth, and high levels of education--the declining trend in fertility is being reversed. By 2005, Sweden and some other highly developed societies were showing this, although the increase in fertility was still not yet up to replacement levels. (See Mikko Myrskyla et al., "Advances in Development Reverse Fertility Declines," Nature 460 [6 August 2009]: 741-43.)  For me, this shows that the natural human desire for children will always assert itself, although parents in the socioeconomic circumstances of modern liberal societies will often prefer to invest heavily in fewer children.

Even if the demographic transition has slowed the rate of growth in world population, the stupendous growth in population has continued.  Is this a sign of human progress or not?  Many thinkers of the Liberal Enlightenment have said yes.  David Hume, for example, in his long essay on "Of the Populousness of Ancient Nations," criticized ancient nations for having a lower growth in population than modern nations, and he argued: "every wise, just, and mild government, by rendering the condition of its subjects easy and secure, will always abound most in people, as well as in commodities and riches. . . . if every thing else be equal, it seems natural to expect, that, wherever there are most happiness and virtue, and the wisest institutions, there will also be most people" (Essays, Liberty Fund, p. 382).  Hume believed that population was growing faster in modern nations than in ancient nations because there was more liberty in modern nations: "human nature, in general, really enjoys more liberty at present, in the most arbitrary government of Europe, than it ever did during the most flourishing period of ancient times" (383).  After all, the primary difference between the economic life of the ancients and that of the moderns was the practice of slavery among the ancients.

Like Hume, Etienne Damilaville, in his article on "Population" in the French Encyclopedia, edited by Diderot and d'Alembert, claimed that liberty fosters a growing population, because "it is under mild, limited governments, where the rights of humanity are respected, that men will become numerous" (Encyclopedic Liberty, Liberty Fund, p. 502).

This belief that growing population was a sign of human progress in a free society was challenged by Thomas Malthus, who warned in his Essay on the Principle of Population (1798) that since population tends to increase faster than the production of food, restricting the number of births was the only way to avoid famine and starvation.

This Malthusian pessimism has been adopted by many modern environmentalists, who insist that the modern growth in human population is unsustainable and must soon lead to a catastrophic collapse of human civilization.  In 1968, Paul Ehrlich began his best-selling book The Population Bomb by declaring:
"The battle to feed all of humanity is over. In the 1970's the world will undergo famines--hundreds of millions of people are going to starve to death in spite of any crash programs embarked upon now. At this late date nothing can prevent a substantial increase in the world death rate. . . . We must have population control at home, hopefully through a system of incentives and penalties, but by compulsion if voluntary methods fail. . . . The birth rate must be brought into balance with the death rate or mankind will breed itself into oblivion" (11).
Ehrlich wrote that he first knew "the feel of overpopulation" during "one stinking hot night in Delhi":
"My wife and daughter and I were returning to our hotel in an ancient taxi.  The seats were hopping with fleas.  The only functional gear was third. As we crawled through the city, we entered a crowded slum area.  The temperature was well over f100, and the air was a haze of dusty and smoke. The streets seemed alive with people. People eating, people washing, people sleeping. People visiting, arguing, and screaming. People thrusting their hands through the taxi window, begging. People defecating and urinating. People clinging to buses. People herding animals. People, people, people, people. As we moved slowing through the mob, hand horn squawking, the dust, noise, heat, and cooking fires gave the scene a hellish aspect" (15).
For an environmentalist like Ehrlich, Hell is "people, people, people, people"--too many people!

Ehrlich's prediction of massive famines in the 1970s in overpopulated countries like India proved false because of people like Norman Borlaug.  Borlaug, an agronomist from Iowa, spent his life developing high-yield hybrid crops that would solve the problem of global hunger.  His success in doing this was called the Green Revolution.  After working for many years helping farmers in Mexico, Borlaug moved in 1963 to India and Pakistan, where he showed farmers that they could have better crops with bigger yields.  He also advised governments that farmers should be paid market prices for their crops instead of imposing price controls to subsidize food for urban people, because such price controls would reduce the supply of food. Today, India and Pakistan produce seven times more wheat than they did before Borlaug arrived.  In 1970, Borlaug won the Nobel Peace Prize for his work increasing the global food supply and thus averting the famines predicted by Ehrlich in 1968.

Major famines have largely disappeared from the world.  The great famines of the 20th century were mostly man-made in illiberal regimes like the Soviet Union, China, Cambodia, Ethiopia, and North Korea.  In the 21st century, socialist regimes like that in Venezuela continue to produce food shortages.  Mao's "Great Leap Forward" famine in China (1958-1962) killed 30 million people, making it perhaps the greatest single catastrophe in human history.  Once the collectivized farms in China were abolished, and farming was privatized, food production increased, and now China produces a surplus of food for world markets. The freedom for people to choose their own work, and to reap the rewards, has made this most populous nation on Earth prosperous.

For Julian Simon, an economist at the University of Maryland, people like the Chinese farmers and Norman Borlaug illustrate the point that a growing human population is good, because people are so productive and inventive in solving problems that human beings are the "ultimate resource," and so having lots of them is good for us.  Simon's classical liberal approach to population made him Ehrlich's greatest adversary. 

Simon argued that there are no resources for human life without the human effort to discover and use them.  So, for example, petroleum is not inherently a resource. The Native Americans had no use for it.  It became useful only after human beings discovered how it could be used to satisfy human desires and then found efficient ways to extract it and sell it.

Of course, human beings are consumers of resources as well as producers, and people like Ehrlich assume that in general people consume more than they produce, so that population growth is bad. But Simon argued that growing human populations in free societies, where people are free to be inventively productive, will produce net increases in resources.

Tuesday, November 15, 2016

The Empirical Data for Human Progress through the Liberal Enlightenment

[Chart: survey results on whether people think the world is getting better, by country]

Human life today is better than it has ever been in human history, because we enjoy the benefits of two centuries of human progress through the Liberal Enlightenment.  Our time is the best of all times that human beings have ever known.

And yet most human beings around the world deny this.  As the above chart indicates, in surveys asking people whether the world is getting better, most people (94% in the United States, 96% in Great Britain and Germany) say no.  Many of those Americans who believe everything is getting worse voted for Donald Trump, because he appealed to their fear that America and the whole world are in decline, and because he persuaded them that only the leadership of a strongman can save them.

This popular pessimism is contradicted by empirical data that shows more human progress in the past two hundred years than at any time in previous human history.  For example, in public opinion surveys, most people around the world say that world poverty has been increasing.  In the United States, only 8% of the people believe that over the last 30 years the proportion of the world population living in extreme poverty has decreased.  But these 8% are correct.

[Chart: share of the world population living in extreme poverty, 1820-2015]

In this chart, the top line shows the proportion of the world's population living in extreme poverty, defined as living on less than $2 a day (measured in international dollars at 2011 prices), from 1820 to 1990.  In 1820, 94% of the world's population lived in extreme poverty.  In 1990, it was down to 52%.  For the second line, poverty is defined as living on less than $1 a day.  In 1820, 84% of the world's population lived in such poverty.  In 1990, it was down to 24%.  The shorter line to the right is based on World Bank data for 1980 to 2015 showing the level of poverty defined as living on less than $1.90 a day.  In 1980, 44% of the world population was living in such severe poverty.  By 2015, this was down to 9.6%.  This shows a steady decrease in extreme poverty over the last two hundred years, with the decrease accelerating over the past 30 years.  Moreover, economic historians have found evidence that for all of human history prior to about 1800, over 95% of human beings lived in such extreme poverty.
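To make the arithmetic of this trend concrete, here is a minimal sketch in Python that tabulates only the figures cited in the paragraph above and computes the decline in each series.  The series labels are my own informal shorthand, not the chart's official titles.

# Extreme-poverty shares cited above: (start_year, start_pct, end_year, end_pct).
series = {
    "under $2/day (1820-1990)": (1820, 94.0, 1990, 52.0),
    "under $1/day (1820-1990)": (1820, 84.0, 1990, 24.0),
    "under $1.90/day, World Bank (1980-2015)": (1980, 44.0, 2015, 9.6),
}

for name, (y0, p0, y1, p1) in series.items():
    drop = p0 - p1               # decline in percentage points
    rate = drop / (y1 - y0)      # average decline per year
    print(f"{name}: {p0:.1f}% -> {p1:.1f}%, "
          f"down {drop:.1f} points ({rate:.2f} points per year)")

Run on these numbers, the two long-run series decline by roughly a quarter to a third of a percentage point per year, while the recent World Bank series declines by nearly a full point per year, which is the acceleration over the past 30 years noted above.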

Perhaps the best place to find this kind of data is "Our World in Data" -- an online publication by Max Roser (an economist at Oxford University) that surveys the data on the development of human living conditions at a global scale.  Most of the data is presented in visual charts and global maps that show historical trends across time.

This data shows that human life in general is better today in at least ten ways:

(1) There are more lives and longer lives.
(2) Life is healthier.
(3) Life is richer and less impoverished.
(4) Life shows more equality of opportunity.
(5) Life is more peaceful.
(6) Life is freer.
(7) Family life is better.
(8) Life is more environmentally sustainable.
(9) Life is more enlightened.
(10) Life is more virtuous.

Relying largely on the data collected by Roser, I will be writing a series of blog posts presenting some of the evidence for these ten progressive trends as caused by the norms and institutions of the Liberal Enlightenment that promote increasing freedom, knowledge, and technology.

I can foresee, however, that most readers will not be convinced by this evidence.  The reason for this is that the human mind has evolved cognitive biases that make it hard for us to believe that human life is improving and easy for us to feel worried and dissatisfied. 

One such bias is what psychologists Daniel Kahneman and Amos Tversky call the "availability heuristic": the more easily instances of an event come to mind, often because they are shockingly memorable, the more probable we judge that event to be.
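To see how this bias works, consider a toy simulation in Python (my own construction for illustration, not a model from Kahneman and Tversky; the share of shocking events and the recall rates are arbitrary assumptions).  A judge who estimates an event's frequency from the instances he happens to remember will overestimate rare but shocking events, because those are the instances most available to memory.

import random

random.seed(0)

N = 100_000               # events observed
P_SHOCKING = 0.01         # true share of events that are shocking
RECALL_SHOCKING = 0.90    # shocking events are almost always remembered
RECALL_ORDINARY = 0.05    # ordinary events are mostly forgotten

recalled_shocking = 0
recalled_total = 0
for _ in range(N):
    shocking = random.random() < P_SHOCKING
    recall_p = RECALL_SHOCKING if shocking else RECALL_ORDINARY
    if random.random() < recall_p:
        recalled_total += 1
        recalled_shocking += shocking   # True counts as 1

# Estimating frequency from what comes to mind inflates the rare, vivid event.
estimate = recalled_shocking / recalled_total
print(f"true share of shocking events:        {P_SHOCKING:.1%}")
print(f"share estimated from recalled events: {estimate:.1%}")

Under these assumptions, the judge estimates that about 15% of events are shocking when the true share is only 1%: the vivid events crowd out the ordinary ones in memory, and probability judgments follow memory.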

So, for example, when we see shocking reports about terrorist attacks, we assume that such attacks are highly probable, and so we become terrified.  I remember a friend of mine in Manhattan, Kansas, telling me that when she first saw the television coverage of the 9/11 terrorist attacks in New York City and Washington, DC, she left her office, picked up her son at his elementary school, and returned home where she hugged him for hours.  Of course, she knew intellectually that her son in Kansas was in no danger of being attacked by terrorists.  But emotionally she felt as if her family was under attack.  Many Americans and many people around the world felt the same way that day.

In fact, that's the whole aim of terrorists--to throw an entire community of people into a state of panic.  After the killing of 14 people in a terrorist attack by radical Muslims in San Bernardino, California, in December 2015, Trump proposed that all Muslims should be banned from entering the United States.  But very few Muslims become terrorists.  And the statistical probability of being killed by a terrorist in the United States is extremely low; death by ordinary homicide is far more likely.  But even homicidal violence in the United States has been declining since its peak in the early 1990s.  Nevertheless, reports of terrorist attacks have such a shocking impact on us that we are inclined to be thrown into a panic that distorts our judgment about what should be done to protect ourselves.

The great danger here is that if we do not recognize the human progress brought to us by liberal norms and institutions, we might become so fearful and angry that we will turn away from our liberal principles and embrace the illiberal rhetoric of a demagogue who promises to save us.

Although the human progress achieved by the Liberal Enlightenment is real, it is not inevitable, as should be clear from the catastrophic suffering inflicted in the middle of the 20th century by the illiberal regimes of Stalin, Hitler, and Mao.  The recent resurgence of ethnic nationalism in its attack on liberal globalism suggests that this could happen again.

Tuesday, November 08, 2016

The Paleolithic Origin of Science and Philosophy in the Art of Tracking

This is a video from 2001 of Karoha Langwane, a /Gwi tracker from Lone Tree in the central Kalahari, Botswana, running an eight-hour persistence hunt of a kudu bull (a species of antelope), a hunt that requires the cognitive skills of animal tracking.  Karoha might be one of the last traditional hunters still practicing the persistence hunt--chasing the hunted animal in the mid-day sun for hours until it collapses from overheating and exhaustion--which was probably the earliest form of human hunting, going back two million years among our hominid ancestors.  Louis Liebenberg has argued that the speculative tracking required for such hunting involves the hypothetico-deductive reasoning that underlies the cognitive abilities for science: our hunter-gatherer ancestors evolved an innate ability to use scientific reasoning when they interpreted tracks and signs and made testable predictions about animal behavior.


This is a video of Daniel Lieberman lecturing on "Brains, Brawn, and the Evolution of the Human Body."  Lieberman is a proponent of the "endurance running hypothesis"--the idea that our human bodies and brains are evolved for running long distances, because this was required for persistence hunting millions of years ago before the invention of hunting technology like bows and arrows.

The arguments here by Liebenberg and Lieberman help to resolve a paradox about human evolution first identified by Alfred Russel Wallace.  Although Wallace was, along with Darwin, a co-discoverer of the theory of evolution by natural selection, he disagreed with Darwin in denying that natural selection could fully explain the evolution of the high moral and mental capacities of human beings.  In his essay on "The Limits of Natural Selection as Applied to Man," Wallace argued that the human brain was larger than it needed to be for survival as a primitive hunter-gatherer.  Natural selection, he observed, "has no power to produce absolute perfection but only relative perfection, no power to advance any being much beyond his fellow beings, but only so much beyond them as to enable it to survive them in the struggle for existence."  And yet human beings have mental powers for abstract thought, as expressed in art, science, mathematics, philosophy, and religion, that would have been useless for the survival and reproduction of our Paleolithic ancestors--and so, Wallace concluded, those powers could not have evolved by natural selection.

Wallace declared:
"The mental requirements of savages, and the faculties actually exercised by them, are very little above those of animals.  The higher feelings of pure morality and refined emotion, and the power of abstract reasoning and ideal conception, are useless to them, are rarely if ever manifested, and have no important relations to their habits, wants, desires, or well-being. They possess a mental organ beyond their needs.  Natural Selection could only have endowed savage man with a brain a little superior to that of an ape, whereas he actually possesses one very little inferior to that of a philosopher."
Since the evolution of such a brain could not be the work of natural selection, Wallace inferred, it must be the work of artificial selection by "some higher intelligence."  Just as human beings have artificially selected plants and animals to be bred for special traits, so this "higher intelligence" must have guided human evolution to achieve a high mental capacity.  Many readers assumed that this "higher intelligence" must be God.  But Wallace said this was a misconception, because this higher intelligence could be some kind of spiritual mind other than God.

Creationists and intelligent design theorists have seen Wallace as agreeing with their claim that natural science can see evidence of creative intelligence in the natural world, and particularly in the cognitive and moral capacities of the human mind that show evidence of supernatural design.

Similar to Wallace's argument is the argument of theistic evolutionists like C. S. Lewis and Alvin Plantinga that evolutionary naturalism becomes self-refuting if it denies the supernatural origin of the human mind.  The reasoning is that the theistic doctrine of the human mind as created by God in His image provides the necessary support for the validity of human thought, including the validity of modern science.  If we embrace metaphysical naturalism--the view that nothing exists except nature, and so there is no God nor anything like God--we are caught in self-contradiction: if human thought originated not from a divine Mind but from the irrational causes of nature, then we cannot trust our minds as reliable, and thus we cannot trust our belief in naturalism, or anything else.  Insofar as science--including evolutionary science--depends on the validity of human thought, and insofar as theism is the indispensable support for trusting in the validity of human thought, science is not only compatible with theism; science depends upon theism.

In my posts on Plantinga's argument (here and here), I have pointed out that the weak link in Plantinga's reasoning for metaphysical naturalism as self-defeating is his assumption that adaptive behavior is completely unrelated to true belief.  Plantinga asks us to imagine that we could have been naturally evolved for a state of complete and perpetual delusion.  Having taken this step of radical Cartesian skepticism, he then tells us--as Descartes did--that the only escape from such skepticism is to assume that God would never allow this to happen.  But as is always the case for the Cartesian skeptic, this all depends on imagining scenarios that are utterly implausible and unsupported by even a shred of evidence.  The evidence of evolutionary history suggests that evolution produces cognitive faculties that are reliable but fallible.  The mental abilities of animals, including human beings, are fallible because evolution produces adaptations that are good enough but not perfect, and this results in the mental fallibility that is familiar to us.

But despite this fallibility, the mental faculties cannot be absolutely unreliable or delusional.  Even Plantinga concedes that in the evolution of animals, "adaptive behavior requires accurate indicators."  So, for example, a frog must have sensory equipment that allows it to detect flies accurately so that it can catch them with its tongue.  And the honeybee waggle dance is a dramatic example of how evolution by natural selection favors adaptive behavior that tracks the truth about the world.

Similarly, evolution by natural selection has given human beings mental capacities that are reliable, even if fallible, in tracking the world.  If Liebenberg is right, the distinctively human mental capacities arose originally among prehistoric hunter-gatherers for the literal tracking of wild game, which created the capacity for the abstract hypothetical reasoning of modern science.

Archaeological evidence indicates that our hominid ancestors were hunting about two million years ago.  Without weapons such as bows and arrows, the only effective form of hunting was probably persistence hunting--chasing an animal during the hottest time of the day until it overheated and dropped from exhaustion.  Anatomical evidence indicates that human beings are the only primates that are designed for the endurance running required for persistence hunting. 

In easy tracking terrain, hunters could follow an animal's trail by looking for one sign after another. But in difficult terrain, the hunters had to imagine the likely route the animal might take so that they might reconstruct the animal's behavior and decide in advance where they might find signs.  This would require what Liebenberg calls "speculative tracking" that uses "hypothetico-deductive reasoning."  Based on their knowledge of animal behavior and of the physical environment, hunters had to interpret the visible signs of an animal's path in terms of some hypothesis as to how and where the animal was moving.

In modern science, the visible world is explained by a postulated invisible world.  Thus, for example, physicists use particle colliders to create visible particle tracks that are explained by hypotheses about invisible structures (atoms and subatomic particles) and invisible forces (such as gravity).  Similarly, ancient hunters tracking an antelope had to interpret visible tracks as signs to be explained by hypotheses about the invisible movements of the antelope.  This abstract mental capacity for hypothetical reasoning could evolve by natural selection, because those hunters who were good at it were more likely to have antelope for dinner.

It has been observed, however, that only a few people in hunter-gatherer societies are successful at this, because only the most intelligent members of the society will have the capacity for such scientific reasoning.  Similarly, we know that only a few people--an Aristotle, an Isaac Newton, or an Albert Einstein--will have the intellectual capacity for the deepest scientific or philosophic inquiries.  And thus the philosophic or scientific life will be most fully expressed by only a few people, even though the capacity for philosophic and scientific reasoning is latent in evolved human nature.

This same capacity for imaginative, hypothetical reasoning that generates scientific and philosophic knowledge can also generate mythic fiction and superstition.  Knowledge is valuable, because if we can follow the tracks of the antelope, we find the antelope, and we can eat.  We can also derive some satisfaction in telling stories about the antelope deity, although we will never find it.


REFERENCES

Bramble, Dennis M., and Daniel Lieberman. 2004. "Endurance Running and the Evolution of Homo." Nature 432 (18 November): 345-52.

Conniff, Richard. 2008. "Yes, You Were Born to Run." Men's Health 23 (May): 132-39. Available online.

Liebenberg, Louis. 2013a. The Origin of Science: The Evolutionary Roots of Scientific Reasoning and its Implications for Citizen Science. Cape Town, South Africa: Cybertracker.org. Available online.

Liebenberg, Louis. 2013b. "Tracking Science: The Origin of Scientific Thinking in Our Paleolithic Ancestors." Skeptic Magazine 18, no. 3: 18-23. Available online.

Wallace, Alfred Russel. 1870. "The Limits of Natural Selection as Applied to Man." In Contributions to the Theory of Natural Selection: A Series of Essays. London: Macmillan.

Tuesday, November 01, 2016

Budziszewski's Critique of Darwinian Natural Law

J. Budziszewski (pronounced "Boojee-shef-skee") is a professor in the Departments of Government and Philosophy at the University of Texas.  He is a prolific author best known for his Christian interpretation of natural moral law, which he finds in the work of Thomas Aquinas.  His most recent book is a commentary on Aquinas's "Treatise on Law."

In a series of papers, Budziszewski has criticized my defense of natural law or natural right as grounded in a Darwinian account of human nature.  He criticizes me for my "determined attempt to make natural law safe for atheists."  For "natural law," he argues, one must "regard nature as the design of a supernatural intelligence."  By contrast, for "naturalism," one must "regard nature (in a physical or material sense) as all there is."  What I defend, he says, is not "natural law" but "naturalism," and thus atheism.

I have responded to his criticisms in previous posts (here and here).

I was reminded of this debate while attending the recent meeting (October 28-29) of the Society of Catholic Social Scientists at Aquinas College in Grand Rapids, Michigan.  There was a special panel on Budziszewski's interpretation of natural law. 

As some of the panelists indicated, one of Budziszewski's main ideas is to oppose what he calls "the Second Table Project."  It is said that Moses brought down from Mount Sinai Ten Commandments on two tablets of stone.  Traditionally, the first four commandments are identified as the first tablet or table, and they concern the worship of God; the last six commandments (beginning with honoring father and mother) are identified as the second table, and they concern moral laws.  Some Christians (Roger Williams, for example) have seen here a separation of Church and State: the Church enforces the first table of theological law, while the State enforces only the second table of moral law.  The first table corresponds to divine law, which can be known only by those who accept the Bible as divine revelation.  The second table corresponds to natural law, which can be known by all human beings through natural reason and natural experience, even by those who are not biblical believers.  The second table can thus stand on its own natural ground without any necessary dependence on the supernatural.  But this is exactly what Budziszewski denies, because, he insists, there cannot be a natural law if there is no divine lawgiver.

I have argued that if we see what Aquinas calls natural law as corresponding to what Darwin calls the natural moral sense rooted in evolved human nature, then those who have been infused with religious faith can understand this evolved human nature as the product of God's creative design working through the natural history of evolution, while those who lack such faith can understand it as the product of an unguided natural history of evolution.

Darwin leaves open the possibility of theistic evolution by employing Aquinas's idea of "dual causality"--the religious believer can see natural causes as secondary causes, as distinguished from divine causes as primary causes (the subject of a previous post).  (I have also written posts on the Catholic Church's acceptance of Darwinian evolution.) 

Whether we have faith or not, whether we are on the side of revelation or on the side of reason, we can all recognize the common morality of natural law or natural right.  Religious belief can reinforce that natural morality for those who are religious believers.  But those who lack any religious belief can still recognize that natural morality as dictated by our natural experience and natural reason.

Some of my critics--not only Budziszewski, but also Craig Boyd, C. Stephen Evans, John Hare, Carson Holloway, Matthew Levering, Stephen Pope, Richard Sherlock, John West, and Benjamin Wiker--have complained that this distorts Aquinas's teaching by ignoring the ways in which Aquinas makes natural law dependent on God as the creator of that law.  After all, Aquinas indicates that natural law belongs to God's eternal law, because a human being as a rational creature "has a natural inclination to his proper act and end, and this participation of the eternal law in the rational creature is called the natural law" (ST, I-II, q. 91, a. 2).  Moreover, Aquinas indicates that human beings are directed to eternal happiness in Heaven as their final end, and for this they need divine law--the divinely revealed Biblical law of the Old and New Testaments--to instruct them how to achieve that eternal happiness (ST, I-II, q. 91, aa. 4-5).

But doesn't this confirm my claim about the autonomy of natural law as separated from divine law?  By natural law, Aquinas says, human beings are directed to their natural end of earthly happiness, which is "in proportion to the capacity of human nature."  Human beings cannot recognize their supernatural end--eternal happiness in Heaven--unless they believe in the divine law of the Bible (ST, I-II, q. 91, a. 4, ad 1).  The need for divine law to reveal supernatural ends shows that natural law by itself is directed to purely natural ends that can be known by natural experience without any belief in God or His commands.

Moreover, it is only by recognizing the autonomy of natural law that we can use the natural law to correct the divine law of the Bible.  Budziszewski's denial of natural law's autonomy makes this impossible.  Consider three examples of how natural law can correct the moral mistakes in the Bible.

First, Budziszewski says that recognizing "the wrong of deliberately taking innocent human life" is part of the natural law.  And yet, according to the Bible (Genesis 22), Abraham showed his faith in God by being willing to obey God's commandment to murder his innocent son Isaac.  Some Christians, like Soren Kierkegaard, have seen this Biblical story as teaching "the suspension of the ethical" in our faith in God.  We must obey God's commands even when they are unethical.  But most people see this Biblical teaching as wrong, because we recognize the wrongness of killing innocent children, and thus our natural moral sense corrects the Bible.

Aquinas explains: "that which is done miraculously by the Divine power is not contrary to nature, though it be contrary to the usual course of nature. Therefore, . . . Abraham did not sin in being willing to slay his innocent son, because he obeyed God, although considered in itself, it was contrary to right human reason" (ST, II-II, q. 154, a. 2, ad 2).  Here Aquinas shows us a direct contradiction between reason and revelation, natural law and divine law, and if we take the side of revelation and divine law, we must allow--even honor--the killing of innocent people whenever we think God has commanded it, even though this is "contrary to the usual course of nature" and "contrary to right human reason."

For Aquinas, there was no way of escaping this shocking contradiction between natural moral law and arbitrary divine command, because if he had appealed to natural law to correct the Biblical story, he would have been exposed to persecution from church authorities.  In fact, the Bishop of Paris had condemned faculty at the University of Paris who were accused of teaching pagan natural philosophy contrary to the Christian faith, and there were suspicions that Aquinas was one of that group.  We might consider the possibility that this forced Aquinas to engage in esoteric writing.

A second example of how natural law might correct the Bible is in correcting the religious violence of the Old Testament.  The last three popes--John Paul II, Benedict XVI, and Francis--have all acknowledged that the Church needs to ask forgiveness for the religious violence practiced by the Church and endorsed by the Bible, including violence against heretics and apostates.  As Cardinal Ratzinger, then Prefect of the Congregation for the Doctrine of the Faith, the future Pope Benedict XVI endorsed a remarkable statement of the International Theological Commission in 1999 on "The Church and the Faults of the Past."  This statement indicates that we must recognize that the Bible is mistaken when it reports God as commanding unjust violence.  This is said to require a "paradigm change"--"a transition from a sacral society to a pluralist society, or, as occurred in a few cases, to a secular society."

So now, it seems that the Catholic Church has embraced liberalism in accepting the move from a premodern "sacral society," in which violence could be used to enforce religion, to a "pluralist society" or "secular society," based on religious toleration and nonviolence. (This has been the subject of a previous post.) 

At the conference at Aquinas College, conservative Catholics argued that the only escape from the morally corrupting relativism of America's liberal culture was to restore faithfulness to the moral teaching of the Catholic Church's Magisterium.  But they were largely silent about how the Church (beginning with Vatican II) has accepted the liberalism of toleration and pluralism as a correction of the illiberal religious violence endorsed by the Bible and by the premodern Church.  Until recently, the Church saw Protestant Christians as heretics who could be properly persecuted and even executed (see Robert Bellarmine, On Temporal and Spiritual Authority [Liberty Fund, 2012], 79-120).  Aquinas endorsed the Inquisition (ST, II-II, q. 10, aa. 8, 11; q. 11, a. 3). Now, even conservative Catholics like Budziszewski recognize that this was wrong in violating natural law.

A third example of natural law correcting the Bible is recognizing the wrongness of the Bible's endorsement of slavery.  While the Bible sanctions slavery (see my post here, which includes links to other posts), Budziszewski knows by natural law that this is wrong, and therefore he looks for some way to correct the Bible to conform to his natural moral knowledge that slavery is wrong.  He writes: "Consider how many centuries it took natural law thinkers even in the Christian tradition to work out the implications of the brotherhood of master and slave.  At least they did eventually.  Outside of the biblical orbit, no one ever did--not spontaneously" (The Line Through the Heart, 36).  The explicit teaching of the Bible is that the "brotherhood of master and slave" is consistent with preserving slavery as a moral good, and this was the understanding of many Christians in the American South before the Civil War.  But Budziszewski rightly judges that Christians had to correct the Bible by seeing that human brotherhood demands the abolition of slavery as a great moral wrong.

As I have argued in Darwinian Natural Right, Darwin and others were able to see the wrongness of slavery as a violation of evolved human nature, and particularly of the natural desire for justice as reciprocity.