Friday, March 25, 2011

Skin color and the discrimination paradigm

Lighter skin color correlates with higher earnings among new immigrants to the U.S. This correlation holds up even after controlling for English language proficiency, education, occupation before migrating, and family background. It even holds up among immigrants of the same ethnicity, race, and country of origin. For some social scientists, only one possible explanation remains: discrimination.

The modern American credo attributes minority underachievement to the majority, specifically to discrimination. The causal relationship may be direct, i.e., some people are consciously assigned to lower-paying jobs simply because of their skin color. Or it may be indirect, as Gunnar Myrdal argued almost seventy years ago in An American Dilemma:

The mechanism that operates here is the “principle of cumulation,” also commonly called the “vicious circle.”

[…] White prejudice and discrimination keep the Negro low in standards of living, health, education, manners and morals. This, in its turn, gives support to white prejudice. White prejudice and Negro standards thus mutually “cause” each other.

[…] If, for example, we assume that for some reason white prejudice could be decreased and discrimination mitigated, this is likely to cause a rise in Negro standards, which may decrease white prejudice still a little more, which would again allow Negro standards to rise, and so on through mutual interaction. If, instead, discrimination should become intensified, we should see the vicious circle spiraling downward.
(Myrdal, 1962, pp. 75-76)

Originally, this ‘American dilemma’ was supposed to explain underachievement by African Americans. Today, it is increasingly extended to immigrants, even newly arrived ones. This is the premise of a recent study by Joni Hersch (2008):

[…] most new legal immigrants to the United States have darker skin color than white U.S. natives and are on average shorter. This article considers whether skin color and height affect economic outcomes among new legal immigrants to the United States. (Hersch, 2008, p. 346)

To this end, Hersch consulted the New Immigrant Survey 2003, which interviewed a nationally representative sample of immigrants as soon as possible after they received permanent resident status. Among other things, each immigrant’s skin color was measured against a color scale: a series of increasingly darker hands, rated from zero (lightest) to ten (darkest). Interviewers were given the following instructions:

As you know, human beings display a wide variety of physical attributes. One of these is skin color. Unfortunately discrimination on the basis of skin color continues to be a reality in American life. Substantial evidence suggests that lighter skinned people fare better in a variety of social and economic settings than those with darker skins. In order to detect such discrimination, it is important that the NIS include a measure of skin color. We therefore ask interviewers to use the Scale of Skin Color Darkness as a guide to rate the skin color of each respondent on a scale of 0 to 10, where 0 is the lightest possible skin color (such as that of an albino) and 10 is the darkest possible skin color. (Hersch, 2008, p. 361)

After analyzing the data, Hersch found the following:

[…] I find strong evidence that darker skin color is associated with lower wages, taking into account a wide array of demographic and productivity-related characteristics such as English language proficiency, education, occupation before migrating to the United States, and family background, as well as ethnicity, race, and country of origin, which are themselves highly correlated with skin color. Immigrants with the lightest skin color earn on average 17% more than comparable immigrants with the darkest skin color. On average, moving from the 10th percentile to the 90th percentile of the distribution of skin color within ethnic or racial groups would reduce wages by about 7%-9%. These magnitudes are roughly similar to the black-white disparity and Hispanic-non-Hispanic disparity reported in Altonji and Blank. (Hersch, 2008, p. 346)

She concluded: “The results indicate that any such discrimination is not merely ethnic or racially based nor due to country of birth. […] Skin color is not merely capturing the effects of ethnicity, race, or country of birth but also has an independent effect on wages.”
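For readers unfamiliar with this kind of analysis, here is a minimal sketch, in Python, of the sort of log-wage regression Hersch describes. The data file and column names below are hypothetical placeholders, not the actual New Immigrant Survey variables:

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data: one row per newly admitted immigrant. The file name
# and column names are illustrative only, not the real NIS 2003 variables.
df = pd.read_csv("nis2003.csv")

# Ordinary least squares on log wages. The coefficient on skin_color
# (0 = lightest, 10 = darkest) estimates the wage change per unit of
# darkness, net of the controls. C() marks categorical variables.
model = smf.ols(
    "log_wage ~ skin_color + english_proficiency + years_education"
    " + C(occupation_before_migration) + C(race) + C(ethnicity)"
    " + C(country_of_birth)",
    data=df,
).fit()

print(model.summary())

A negative, statistically significant coefficient on skin_color that survives this full set of controls is precisely the pattern Hersch reports.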

Why does skin color have greater explanatory power than race, ethnicity, or country of origin? One reason is that these other variables are often problematic. ‘Blacks’ include anyone with some ancestry from sub-Saharan Africa. ‘Hispanics’ encompass a wide range of populations. At one end, there are Amerindian groups in Guatemala and southern Mexico who use Spanish as a second language. At the other, there are Italian, German, and Jewish communities in Argentina who still maintain strong links with Europe.

Similar objections could be raised for country of origin. Hersch’s data show that 18.2% of immigrants from the United Kingdom self-identify as ‘non-white’ (Hersch, 2008, p. 355). There are similarly large non-European communities in France, Belgium, the Netherlands, Norway, and Sweden. In an increasingly globalized world, what does it mean to be ‘French’, ‘Dutch’, or ‘Norwegian’?

Interestingly, 1.7% of ‘Polish’ immigrants in the U.S. are actually Hispanic (Hersch, 2008, p. 355). This should be no surprise. Eastern European universities, with their low tuition, attract large numbers of Third World students. Many remain after graduation before eventually moving on to a third country, such as the United States.

So to explain economic success among new immigrants, we should look at their skin color rather than their ‘race,’ ‘ethnicity,’ or ‘country of origin’. Does it follow, then, that color prejudice is the ultimate cause? Or do some immigrants fare poorly because they are less able? Hersch (2008, p. 378) reports some evidence for the second explanation:

The possible connection between skin color and ability has been examined using the 1982 GSS, which includes a 10-item vocabulary test as well as a measure of skin color for a sample of about 500 African Americans. Using these data, Lynn (2002) reports a positive correlation between lighter skin color and higher test scores.

But this kind of explanation strikes her as faulty:

However, using the same data, Hill (2002) demonstrates that controlling for education and family background eliminates the relation between skin color and test scores. […] Available evidence in the scientific literature does not support a link between skin color and intelligence. In addition, the correlation between skin color and ancestry varies considerably, with low correlations in many populations of mixed ancestry (Parra, Kittles, and Shriver 2004). In the absence of genetic evidence or a high correlation between skin color and ancestry, it seems unlikely that inclusion of test scores as a measure of ability would greatly alter the skin color effects found in this article.

This is more than a bit disingenuous. First, saying that some immigrants are less able than others doesn’t necessarily imply a genetic explanation, even if they share the same ethnic background and supposedly differ ‘only’ in skin color. A Mayan immigrant from Guatemala may be as Hispanic as a Jewish immigrant from Buenos Aires, but the two individuals bring very different attitudinal and behavioral tools for survival in a modern market economy.

Second, a genetic explanation doesn’t imply a direct causal link between skin color and intelligence. Skin color is just an indicator of one’s ancestral gene pool, which may statistically differ from other gene pools for any number of heritable traits, including those that influence intelligence. It’s also disingenuous to claim that skin color is weakly correlated with ancestry by referencing a study done on a population that has been intermixed over many generations. If a British immigrant has brown skin, the chances are very good that he or she is not of European origin.

Finally, it’s disingenuous to treat discrimination as a default explanation, to be accepted unless there is strong evidence to the contrary. Evidence for discrimination, by contrast, is usually inferred rather than directly observed. For instance, Hersch (2008, p. 368) notes:

An analysis shows that those with darker skin color are less likely to be self-employed, controlling for the same predetermined characteristics used in the regression equations reported below. While it might be tempting to interpret this finding as suggestive that immigrants with darker skin color avoid self-employment out of concern about customer discrimination, there is no information in the data regarding customer contact, and there is limited empirical evidence of customer discrimination in the literature generally, so it seems wisest to avoid making this leap.

The above interpretation doesn’t strike me as “tempting.” Self-employment is typically a refuge for those who have been excluded from the job market. I remember many young people who turned to self-employment in the 1980s and early 1990s because they had no alternative. All of the stable jobs were reserved for older workers with seniority. But for Hersch, self-employment shows you’ve had more freedom to choose your livelihood.

Another point deserves mention here. When I think of self-employed immigrants, I readily think of Sikhs, Lebanese, Chinese, Armenians, and other ‘middleman’ minorities. Yes, they’re lighter-skinned than most immigrants. But do they choose self-employment over wage labor because they face little or no discrimination? That’s not my impression.

Conclusion

The discrimination paradigm was first applied to a minority that had suffered centuries of unequal, prejudicial treatment. Bit by bit, and somewhat unthinkingly, it has been extended to other groups: Amerindians and Mexican Americans and, now, newly arrived immigrants. Discrimination is no longer supposed to exercise its crippling effects through generations of enslavement and Jim Crow. After all, we see these effects in people who’ve just arrived.

Well, why not look just a bit further ‘upstream’? These different patterns of economic behavior also exist in the immigrants’ home countries. They exist worldwide. They have nothing to do with whatever prejudices White Americans might have.

But why do these patterns exist? Why should lighter skin improve one’s ability to perform in a modern market economy? Surely that proposition is just as absurd.

Actually, no. Please hear me out before you start shouting. Success in a modern market economy depends on the ability to plan ahead, and that ability has been favored outside the tropics—in areas where people tend to be lighter-skinned. The non-tropical zone has a yearly cycle that makes planning necessary. This cycle began to shape human survival as early hunter-gatherers spread into the temperate and arctic zones. It became even more important when their descendants took up agriculture. The fall harvest had to last until early summer; otherwise you and your family would starve (Frost, 2010; Frost, 2011).

The same yearly cycle also made men necessary for family survival, particularly in winter when hunting used to be the only other means of sustenance. This is in contrast to the tropics, where year-round agriculture enabled women to provide for themselves and their children with minimal male assistance. In such circumstances, men spent more time and energy seeking additional mates (Frost, 2008).

Are these patterns of behavior genetic? Or are they learned from one’s cultural environment? From the standpoint of natural selection, the question is unimportant. What works works, and what doesn’t doesn’t. But the question does matter to curious humans. To find the answer, the first step is to admit that the question is legitimate. Are we ready to take that step?

References

Frost, P. (2011). Religiosity and the origins of civilization, Evo and Proud, January 21
http://evoandproud.blogspot.com/2011/01/religiosity-and-origins-of-civilization.html

Frost, P. (2010). Out of North Eurasia, Evo and Proud, May 27
http://evoandproud.blogspot.com/2010/05/out-of-north-eurasia.html

Frost, P. (2008). Sexual selection and human geographic variation, Special Issue: Proceedings of the 2nd Annual Meeting of the NorthEastern Evolutionary Psychology Society. Journal of Social, Evolutionary, and Cultural Psychology, 2(4), 169-191.
http://www.jsecjournal.com/articles/volume2/issue4/NEEPSfrost.pdf

Hersch, J. (2008). Profiling the new immigrant worker: the effects of skin color and height, Journal of Labor Economics, 26, 345-386.

Myrdal, G. (1962). An American Dilemma, New York: Harper & Row.

Friday, March 18, 2011

The fall of blood lust and the rise of empathy

St. Bernard of Clairvaux and a nun embrace a bloody crucifix (early 14th century). While co-opting blood lust as a means to strengthen its emotional appeal, Christianity also created a social environment that gradually removed such desires from the population.

In the past millennium, some European societies underwent a profound behavioral change. People no longer limited their trust to close kin and longtime friends. It became normal to trust even non-kin and strangers, and this increased trust allowed the market economy to take off. When people are generally “good-natured,” you no longer have to check and double-check every transaction for evidence of cheating. You no longer have to be on your guard when dealing with strangers. You no longer have to scrutinize facial expressions and body language for signs of deceit—or an oncoming sucker punch. You no longer have to overcharge to cover the costs of theft, vandalism, and “inventory shrinkage.”

This is what we call a high-trust society. Its advent was a milestone of cultural evolution, yet we tend to focus on more material signs of progress. Even more surprisingly, we tend to take a high-trust environment for granted. Aren’t people good-natured by nature? Isn’t anything else abnormal?

Well, no. That kind of abnormality used to be the norm. And it still is, in many societies.

What did this transition actually mean in behavioral terms? For the historical economist Gregory Clark, it meant a shift from improvidence, violence, impulsiveness, and love of leisure to thrift, prudence, negotiation, and hard work—what some now deride as “middle-class values.” Indeed, these values were those of the nascent middle class, and they spread through society as that class expanded. By the early 19th century, in England, even the working class largely had middle-class ancestry (Clark, 2009).

This is not to say that the older behavioral traits were completely removed from the English population. Rather, they were reduced to a level that allowed the growing middle class to impose its norms on the entire population. The first attempts at imposition came with the rise of Puritanism in the 16th and 17th centuries, but the high tide of this behavioral hegemony would be the Victorian era of the mid to late 19th century.

The same period saw another behavioral shift: an increase in empathy. People began to show more concern for strangers, non-kin, and even non-human animals. Furthermore, this greater outward concern was paralleled by deeper inward feelings of grief, love, and worry for the Other.

It is difficult to grasp how people once felt toward others, especially those beyond the charmed circle of close kin and friends. Such feelings went beyond mere indifference. Clark (2007) describes the former popularity in England of blood sports and other forms of exhibitionist violence (cock fighting, bear and bull baiting, public executions). Yes, most normal people used to crave the sight of blood and suffering.

Bynum (2002) describes this infatuation with the sight of blood in the late Middle Ages. There was above all “the violent quality of the religiosity itself—what we might call its visual violence, especially the prominence of the motifs of body parts and of blood.” This blood cult often spilled over into real violence:

In an even more troubling and graphic sense, the blood that spilled across European piety also accused, calling for vindication of, as well as empathy with, Christ. The majority of the blood cults of fourteenth- and fifteenth-century Europe were places of supposed host desecration, and these wonderhosts—sites of pilgrimage and pogrom—targeted lower-class women, thieves, and most of all Jews as violators of God. (Bynum, 2002, p. 31)

Blood lust is as old as humanity. Christianity merely used this desire to bind people to the faith. In time, however, Christianity became a means to remove blood lust from the repertory of normal feelings and, hence, to stigmatize people who openly voiced their love for the sight of blood. Such people found themselves increasingly marginalized—as suspected criminals, undesirable marriage partners, or simply “perverts.” With each passing generation, this predisposition was steadily culled from the population. We can see the beginnings of the displacement of blood lust by empathy in the 15th century, in the writings of the religious mystic Margery Kempe:

Margery tells us she thought of Christ beaten or wounded not just when she saw the crucifix but whenever she “saw a man or a beast . . . [with] a wound or if a man beat a child before her or smote a horse or another beast with a whip . . . as well in the field as in the town . . . .” To Margery, the violence of everyday life only reduplicated her sorrow at the violence inflicted on Christ. But the displacement could work the other way; the horror and filth of living could seem to pollute God. (Bynum, 2002, pp. 25-26)

Was this decline in blood lust driven by changes to the gene pool or by changes to learned behaviors? The answer is unclear, if only because natural selection acts indifferently on both. In their review of the literature, Jones and Gagnon (2007) state:

Some investigators have demonstrated a potential genetic basis for empathy (Hoffman, 2000). For example, Zahn-Waxler et al. (1992) found modest evidence for heritability of empathy and prosocial behaviours in 14- to 20-month-old monozygotic and dizygotic twins.

Further, empathic reasoning was associated with fewer behavioural problems in twin studies, suggesting a possible genetic basis for risk and resilience for psychopathology (Zahn-Waxler et al., 1996). Ultimately, these findings have been used to suggest that empathy has genetic influences as well as environmental ones (due to the modest heritability factor) during normal and problematic development.
(Jones & Gagnon, 2007, p. 227)
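The logic behind such twin estimates deserves a word of explanation (this gloss is mine, not Jones and Gagnon’s). A standard first approximation, Falconer’s formula, doubles the gap between the trait correlations of identical (monozygotic) and fraternal (dizygotic) twin pairs:

\[ h^2 = 2\,(r_{MZ} - r_{DZ}) \]

Since identical twins share virtually all of their genes and fraternal twins about half, any excess similarity among the former is attributed to that extra genetic overlap. A “modest” heritability simply means the gap between the two correlations is small but reliably nonzero.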

On the other hand, Kochanska (1993) argues for a more complex mix of heritable and environmental factors:

However, impulsivity may have a more complex relation with conscience development. It may indeed interfere with observing prohibitions and suppressing antisocial impulses, but it need not interfere with positive aspects of morality, such as empathy, sympathy, and response to others' distress. Bryant (1987) found that girls who at age 7 were characterized on Thomas and Chess's dimensions of temperament as intense and difficult to soothe (concepts that have some affinity to impulsivity) were more empathic at age 10. (Kochanska, 1993, p. 338)

Whatever the exact mix of genes and memes, Christianity eventually created a new man and a new woman. This cultural and biological evolution has been so successful that we are scarcely aware that human nature used to be very different. Hasn’t universal empathy always been normal? And blood lust abnormal?

References

Bynum, C.W. (2002). Violent imagery in late medieval piety, GHI Bulletin no. 30 (Spring), 3-36.

Clark, G. (2009). The Domestication of Man: The Social Implications of Darwin, ArtefaCToS, 2, 64-80
http://campus.usal.es/~revistas_trabajo/index.php/artefactos/article/view/5427

Clark, G. (2007). A Farewell to Alms: A Brief Economic History of the World, Princeton University Press, Princeton and Oxford.

Jones, N. A., & Gagnon, C. M. (2007). The neurophysiology of empathy. In T. F. D. Farrow & P. W. R. Woodruff (Eds.), Empathy in mental illness (pp. 217-238), Cambridge, England: Cambridge University Press.

Kochanska, G. (1993). Toward a Synthesis of Parental Socialization and Child Temperament in Early Development of Conscience, Child Development, 64 (2), 325-347

Friday, March 11, 2011

From low trust to high trust

Code of Hammurabi. Today, law is based on universality, impartiality, and non-discrimination. Yet, originally, its core principles were the very opposite.

A key stage in cultural evolution has been the transition from low-trust to high-trust societies. Originally, the “horizon of trust” encompassed only close kin and long-time friends. Then, in some societies, this horizon progressively broadened to take in more distant kin and, eventually, millions of strangers. How was this done?

In part, through “fictitious kinship.” Patricia Williams describes how this approach has been used to integrate in-laws into kinship networks:

The way to accomplish this deep acceptance of one’s spouse’s relatives, surely, is to take the emotions regarding one’s own genetic kin that are already present, attach symbols to the emotions, and redirect them to nonrelatives by considering those nonrelatives to be symbolic kin. A strong way to do this is to transform them by exactly parallel terms—call his parents “Mom” and “Dad”; call his siblings “sister” and “brother.” (Williams, 1988, p. 564)

We have also seen this approach with religion, nationalism, and later internationalism: “brothers and sisters in Christ,” “I am my brother’s keeper,” “we are all brothers,” and so on.

A related approach has been to expand the system of law that once existed only among kith and kin. The notion of “law” may seem out of place here, accustomed as we are to its principles of universality, impartiality, and non-discrimination. Yet, originally, its core principles were the very opposite. Law was not universal. It arose within the web of long-standing reciprocal obligations that bound together man and woman, parents and children, immediate kin and more distant kin within a relatively small clan. Beyond one’s kith and kin, there was no law—other than the law of war.

Things changed with the rise of larger entities. Agriculture spurred population growth, with the result that small clans ballooned into much larger groups. Later came State societies, and empires that encompassed a founding ethnic group and its conquered peoples. Typically, this situation was managed by letting conquered peoples keep their own laws and some internal autonomy. This was the case with the Persian Empire, the Ottoman Empire, and many others. Pre-revolutionary France was a patchwork of local legal systems—a relic of earlier regional entities that had been absorbed into the French state over the centuries.

To varying degrees, State societies eventually reversed the original intent of law—by creating universal rules that apply equally to everyone. This process began with the earliest law codes. The very act of putting a social norm into writing entailed some simplification. Ancient legal systems accepted that laws should vary on a “who whom” basis, but the written categories of “who” and “whom” were necessarily limited. The Code of Hammurabi, for instance, set different penalties for the same offense according to whether the victim was a noble, a commoner, or a slave.

This process advanced further with certain empires, notably the Roman Empire, which sought not merely to conquer other peoples but also to assimilate them, eventually giving them citizenship. But it was the rise of Christianity, and its establishment as a State religion, that fundamentally changed things. The law was no longer the prerogative of a founding group whose claim to power ultimately rested on “might is right.” It became a moral principle that applied equally to everyone, even the Emperor. When a mob in Thessalonica killed a Roman general in 390 AD and thousands were slain in retaliation, Ambrose, the bishop of Milan, denounced the massacre and forced the emperor Theodosius to do public penance (Frost, 2010; Lenox-Conyngham, 2005).

Francis Fukuyama makes this point in his forthcoming book The Origins of Political Order, although he places the influence of Christianity later in time:

[…] the concept of the rule of law emerged very early, largely because of the church’s development of canon law in the 11th century. So when strong rulers started to build states, they had to take account of the emerging codes of civil law.

Europeans then developed the unusual idea that it was the law that should be absolute, not the ruler. In pursuit of this principle, the English Parliament executed one king, Charles I, and deposed another, James II. This proved a durable solution to the problem of building a strong state, yet one in which the ruler was held accountable.
(Wade, 2011)

The last phase of this process began with the end of the Dark Ages, and the re-establishment of more orderly societies. Success no longer went to the “bad boys”—the plunderers and ruthless self-aggrandizers—unless they happened to be the ruling elite. And even ruling elites had to become less rapacious, especially after converting to Christianity and founding dynasties—if only to avoid fouling their nest and leaving nothing to their heirs.

People thus entered a new environment of natural selection. A process of self-domestication began, as described by the historical economist Gregory Clark:

[…] societies becoming increasingly middle class in their orientation. Thrift, prudence, negotiation and hard work were imbuing themselves into communities that had been spendthrift, violent, impulsive and leisure loving. (Clark, 2009)

The above list leaves out another personality change. People began to show more empathy toward non-kin. Keep in mind that material success, and ultimately reproductive success, now depended on obedience to the law. And as the law became decontextualized and universalized, success went to those people who could understand universal rules, comply with them, and enforce compliance on others.

This predisposition to follow universal rules and live a rules-based existence might have been passed on culturally or genetically. Natural selection doesn’t “know” which is which, and in a traditional environment the outcome is quite similar. Whatever its cause, this predisposition would have gradually become more widespread with each passing generation. Clark (2009) does, however, make the case for at least partial genetic inheritance:

The chance a Danish adoptee would end up with a criminal record when neither set of parents had one was 13.5 per cent. When only the adoptive parent had a criminal record this chance rose very slightly to 14.7 per cent. However if only the biological parent had a criminal record the chance of the adoptee having a criminal record rose much more, to 20.0 per cent. If both sets of parents had a criminal record the chance of the adoptee having such a record was 24.5 per cent. Genetic influences on criminal propensities are much greater than environmental influences.

In these data, a criminal biological parent raised an adoptee’s chance of a criminal record by 6.5 percentage points (from 13.5% to 20.0%), whereas a criminal adoptive parent raised it by only 1.2 points (from 13.5% to 14.7%). Such propensities might reflect weaker impulse control and a more present-oriented outlook. But they could also indicate indifference toward others, especially non-kin, and a weaker ability to internalize and apply universal rules of conduct. Deceitful behavior in particular seems to have a significant genetic component (Barker et al., 2009). The shift to a high-trust society certainly involved learning new behaviors, but learning wasn’t the whole story.

References

Barker, E.D., H. Larson, E. Viding, B. Maughan, F. Rijsdijk, N. Fontaine, and R. Plomin. (2009). Common genetic but specific environmental influences for aggressive and deceitful behaviors in preadolescent males, Journal of Psychopathology and Behavioral Assessment, 31, 299-308.

Clark, G. (2009). The Domestication of Man: The Social Implications of Darwin, ArtefaCToS, 2, 64-80
http://campus.usal.es/~revistas_trabajo/index.php/artefactos/article/view/5427

Frost, P. (2010). The Roman State and genetic pacification, Evolutionary Psychology, 8(3), 376-389.
http://www.epjournal.net/filestore/EP08376389.pdf

Lenox-Conyngham, A. (2005). The Church in St. Ambrose of Milan, International Journal for the Study of the Christian Church, 5, 211-225.

Wade, N. (2011). From ‘End of History’ Author, a Look at the Beginning and Middle, The New York Times, March 7.
http://www.nytimes.com/2011/03/08/science/08fukuyama.html?_r=2&hpw=&pagewanted=all

Williams, P. (1988). Kin selection, symbolization, and culture, Perspectives in Biology and Medicine, 31, 558-566.

Friday, March 4, 2011

From markets to market economy

Market gate, old city of Baku. Until a few centuries ago, markets were highly localized in time and space. Most economic activity took place outside them, either within each household or in long-term reciprocal relations within the community.

The political Right typically believes that a market economy will self-generate as long as government gets out of the way and as long as property rights are legally protected.

But why, then, are market economies relatively recent? When Adam Smith wrote The Wealth of Nations (1776), he was describing an economic system that had scarcely existed a few centuries earlier in England and that was still unknown in most of the world. It wasn’t as if people had been ignorant of the market principle and its potential for creating wealth. In fact, markets had been common since the dawn of history and probably long before. But they had been highly localized in time and space, with most economic activity taking place outside them, either within each household or in long-term reciprocal relations within the community. Something kept all of these isolated marketplaces from expanding into a market economy.

That something was “trust.” Market economies have developed in societies where lying, cheating, and stealing are highly stigmatized, even when dealing with strangers and even when the chances of getting caught are minimal. In such societies, some people may deviate from this behavioral norm, but they are just that—deviants. They are the exceptions, not the rule. In the early 1970s, my hometown had newspaper boxes that operated on the honor system. You took a paper and placed enough money in the coin slot. Nothing kept you from taking a paper without paying, but nobody ever did.

Yet this is not how most of the world works. When I raise this point with other people, they usually agree. Like me, they have traveled and seen how other societies really operate. But then they will argue that such societies are deviant. Something—lack of opportunity, poor education, or improper role models—is preventing them from being normal.

Yet the truth is the opposite. It is the high-trust societies that are abnormal, both historically and geographically. They are the ones that have abandoned the fundamental rule of trusting kith and kin above everyone else (because everyone else is out to screw you). That norm no longer applies to them. They have deviated from it; therefore, they are deviant.

Gregory Clark (2009) has described the trajectory that has led some societies to deviate from the low-trust norm:

The Darwinian struggle that shaped human nature did not end with the Neolithic Revolution but continued right up until the Industrial Revolution. But the arrival of settled agriculture and stable property rights set natural selection on a very different course. It created an accelerated period of evolution, rewarding with reproductive success a new repertoire of human behaviors – patience, self-control, passivity, and hard work – which consequently spread widely.

Was this behavioral shift due to changes in genes or in memes (learned ideas)? The question is difficult to answer (and not simply because many find it offensive). Natural selection operates on phenotypes, and only indirectly on genotypes. There was probably a mix of changes to learned behaviors and innate predispositions. There may also have been changes to gene/culture interaction, i.e., some predispositions may have come to require less “exercise” to become fully expressed.

Whatever its exact nature, the overall change seems to have been a shift toward the sort of behavioral regime that exists in certain small communities, such as Hutterite colonies and Israeli kibbutzim, except on a much larger scale:

Still, a common good is always a fragile commodity, vulnerable to the parasitism of self-interest and nepotism. How were these perils averted? A large part of the answer is simply small size. Communities are small enough for everyone’s behaviour to be continuously monitored and controlled through informal, and if need be, formal sanctions. At the same time, the group is small enough for the contribution of every adult to be crucial to the collective welfare, and, therefore, for everyone to have a large stake in controlling shirkers and free-riders. In short, the costs of rewarding good behaviour and stigmatizing parasitism are much lower than the costs of ignoring other people’s behaviour. Both types of communities are notorious for lack of privacy: everyone’s behaviour is everybody’s concern. (van den Berghe & Peter, 1988)

But how was it possible to expand this high-trust environment to one that would encompass millions of people? How did societies like England pull it off?

(to be cont’d)

References

Clark, G. (2009). The Domestication of Man: The Social Implications of Darwin, ArtefaCToS, 2, 64-80
http://campus.usal.es/~revistas_trabajo/index.php/artefactos/article/view/5427

Van den Berghe, P.L. and K. Peter. (1988). Hutterites and kibbutzniks: A tale of nepotistic communism, Man (N.S.) 23, 522-539.