The Bell Curve Explained
American Enterprise Institute
From the book "The Bell Curve" by Charles Murray and Richard J. Herrnstein
In April, I recorded an interview of almost two and a half hours with Sam Harris for his Waking Up Podcast which, I learned only after I had done it, regularly attracts a few million listeners. We spent more than half of the interview discussing what is actually in The Bell Curve as opposed to what people think is in it. Both of us expected our Twitter feeds to light up with nasty reactions after the interview was posted. But the opposite happened. The nasty reactions were far outnumbered by people who said they had always assumed that The Bell Curve was the hateful pseudoscientific mess that the critics had claimed, but had now decided they wanted to give the book a chance. It has been a heartening experience.
However, there is a difficulty with giving The Bell Curve a chance. The paperback edition has 26 pages of front material, 552 pages of main text, a 23-page response to the critics, 111 pages of appendixes, another 111 pages of endnotes, and a 58-page bibliography. It’s a lot to get through. But there’s a shorter way to get a good idea of what’s in the book: Dick Herrnstein and I began each chapter with a summary that was usually about a page long. With the publisher’s permission, I have stitched all of those summaries together, along with selections from the Introduction and the openings to each of the four parts of the book. If these tidbits arouse enough interest that you buy the book, I will be delighted. But at this point in my life, my main objective is that a labor of love, written with a friend whom I still miss twenty-three years after his death, be seen for what it is.
No alterations have been made in the published text. I have interspersed a few bits of explanatory text that are italicized and enclosed by brackets.
That the word intelligence describes something real and that it varies from person to person is as universal and ancient as any understanding about the state of being human. Literate cultures everywhere and throughout history have had words for saying that some people are smarter than others. Given the survival value of intelligence, the concept must be still older than that. Gossip about who in the tribe is cleverest has probably been a topic of conversation around the fire since fires, and conversation, were invented.
Yet for the last thirty years, the concept of intelligence has been a pariah in the world of ideas. The attempt to measure it with tests has been variously dismissed as an artifact of racism, political reaction, statistical bungling, and scholarly fraud. Many of you have reached this page assuming that these accusations are proved. In such a context comes this book, blithely proceeding on the assumption that intelligence is a reasonably well-understood construct, measured with accuracy and fairness by any number of standardized mental tests. The rest of this book can be better followed if you first understand why we can hold such apparently heterodox views, and for this it is necessary to know something about the story of measured intelligence.
[The Introduction then spends 18 pages on a history of the development of the IQ construct and the tests to measure it, beginning with Francis Galton in the last half of the nineteenth century and carrying it through to the hostility that IQ tests attracted in the last half of the twentieth century. We then described the current alternative constructions of “intelligence.” The following is a portion of the concluding section.]
Given these different ways of understanding intelligence, you will naturally ask where our sympathies lie and how they shape this book.
We will be drawing most heavily from the classical tradition. That body of scholarship represents an immense and rigorously analyzed body of knowledge. By accepted standards of what constitutes scientific evidence and scientific proof, the classical tradition has in our view given the world a treasure of information that has been largely ignored in trying to understand contemporary policy issues. Moreover, because our topic is the relationship of human abilities to public policy, we will be dealing in relationships that are based on aggregated data, which is where the classical tradition has the most to offer. Perhaps an example will illustrate what we mean.
Suppose that the question at issue regards individuals: “Given two 11-year-olds, one with an IQ of 110 and one with an IQ of 90, what can you tell us about the differences between those two children?” The answer must be phrased very tentatively. On many important topics, the answer must be, “We can tell you nothing with any confidence.” It is well worth a guidance counselor’s time to know what these individual scores are, but only in combination with a variety of other information about the child’s personality, talents, and background. The individual’s IQ score all by itself is a useful tool but a limited one.
Suppose instead that the question at issue is: “Given two sixth-grade classes, one for which the average IQ is 110 and the other for which it is 90, what can you tell us about the difference between those two classes and their average prospects for the future?” Now there is a great deal to be said, and it can be said with considerable confidence—not about any one person in either class but about average outcomes that are important to the school, educational policy in general, and society writ large. The data accumulated under the classical tradition are extremely rich in this regard, as will become evident in subsequent chapters.
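The individual-versus-aggregate distinction can be made concrete with a small simulation. This is an illustrative sketch, not an analysis from the book: the class sizes, standard deviation, and means are assumptions chosen only to show that heavily overlapping distributions still yield a stable gap between group averages.

```python
import random
import statistics

random.seed(42)

def simulate_class(mean_iq, n=1000, sd=15):
    """Draw IQ scores for a (hypothetical) class from a normal distribution."""
    return [random.gauss(mean_iq, sd) for _ in range(n)]

class_a = simulate_class(110)  # the class with average IQ 110
class_b = simulate_class(90)   # the class with average IQ 90

# Individual level: the distributions overlap heavily -- a sizable share of
# students in the 90-mean class score above the 110-mean class's average.
overlap = sum(1 for score in class_b if score > statistics.mean(class_a))
print(f"share of 90-class students above the 110-class mean: {overlap / len(class_b):.0%}")

# Aggregate level: the difference between the two class means is stable
# and close to the true 20-point gap, run after run.
print(f"gap between class means: {statistics.mean(class_a) - statistics.mean(class_b):.1f} points")
```

Knowing a single student's score tells you little about that student, but the 20-point gap between the class averages is a reliable fact about the groups, which is why aggregate data can support confident statements that individual data cannot.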
If instead we were more concerned with the development of cognitive processes than with aggregate social and economic outcomes, we would correspondingly spend more time discussing the work of the revisionists [Robert Sternberg and colleagues]. That we do not reflects our focus, not a dismissal of their work.
With regard to the radicals [Howard Gardner and colleagues] and the theory of multiple intelligences, we share some common ground. Socially significant individual differences include a wide range of human talents that do not fit within the classical conception of intelligence. For certain spheres of life, they matter profoundly. And even beyond intelligence and talents, people vary temperamentally, in personality, style, and character. But we confess to reservations about using the word intelligence to describe such factors as musical abilities, kinesthetic abilities, or personal skills. It is easy to understand how intelligence (ordinarily understood) is part of some aspects of each of those human qualities—obviously, Bach was engaging in intelligent activity, and so was Ted Williams, and so is a good used-car salesman—but the part intelligence plays in these activities is captured fairly well by intelligence as the classicists and revisionists conceive of it. In the case of music and kinesthetics, talent is a word with a domain and weight of its own, and we are unclear why we gain anything by discarding it in favor of another word, intelligence, that has had another domain and weight. In the case of intrapersonal and interpersonal skills, conventional intelligence may play some role, and, to the extent that other human qualities matter, words like sensitivity, charm, persuasiveness, insight—the list could go on and on—have accumulated over the centuries to describe them. We lose precision by using the word intelligence to cover them all. Similarly, the effect that an artist or an athlete or a salesman creates is complex, with some aspects that may be dominated by specific endowments or capacities, others that may be the product of learned technique, others that may be linked to desires and drives, and still others that are characteristic of the kind of cognitive ability denoted by intelligence. Why try to make intelligence do triple or quadruple duty?
We agree emphatically with Howard Gardner, however, that the concept of intelligence has taken on a much higher place in the pantheon of human virtues than it deserves. One of the most insidious but also widespread errors regarding IQ, especially among people who have high IQs, is the assumption that another person’s intelligence can be inferred from casual interactions. Many people conclude that if they see someone who is sensitive, humorous, and talks fluently, the person must surely have an above-average IQ.
This identification of IQ with attractive human qualities in general is unfortunate and wrong. Statistically, there is often a modest correlation with such qualities. But modest correlations are of little use in sizing up other individuals one by one. For example, a person can have a terrific sense of humor without giving you a clue about where he is within thirty points on the IQ scale. Or a plumber with a measured IQ of 100—only an average IQ—can know a great deal about the functioning of plumbing systems. He may be able to diagnose problems, discuss them articulately, make shrewd decisions about how to fix them, and, while he is working, make some pithy remarks about the president’s recent speech.
At the same time, high intelligence has earmarks that correspond to a first approximation to the commonly understood meaning of smart. In our experience, people do not use smart to mean (necessarily) that a person is prudent or knowledgeable but rather to refer to qualities of mental quickness and complexity that do in fact show up in high test scores. To return to our examples: Many witty people do not have unusually high test scores, but someone who regularly tosses off impromptu complex puns probably does (which does not necessarily mean that such puns are very funny, we hasten to add). If the plumber runs into a problem he has never seen before and diagnoses its source through inferences from what he does know, he probably has an IQ of more than 100 after all. In this, language tends to reflect real differences: In everyday language, people who are called very smart tend to have high IQs.
All of this is another way of making a point so important that we will italicize it now and repeat elsewhere: Measures of intelligence have reliable statistical relationships with important social phenomena, but they are a limited tool for deciding what to make of any given individual. Repeat it we must, for one of the problems of writing about intelligence is how to remind readers often enough how little an IQ score tells about whether the human being next to you is someone whom you will admire or cherish. This thing we know as IQ is important but not a synonym for human excellence.
Howard Gardner has also convinced us that the word intelligence carries with it undue affect and political baggage. It is still a useful word, but we shall subsequently employ the more neutral term cognitive ability as often as possible to refer to the concept that we have hitherto called intelligence, just as we will use IQ as a generic synonym for intelligence test score. Since cognitive ability is an uneuphonious phrase, we lapse often so as to make the text readable. But at least we hope that it will help you think of intelligence as just a noun, not an accolade.
We have said that we will be drawing most heavily on data from the classical tradition. That implies that we also accept certain conclusions undergirding that tradition. To draw the strands of our perspective together and to set the stage for the rest of the book, let us set them down explicitly. Here are six conclusions regarding tests of cognitive ability, drawn from the classical tradition, that are by now beyond significant technical dispute:
- There is such a thing as a general factor of cognitive ability on which human beings differ.
- All standardized tests of academic aptitude or achievement measure this general factor to some degree, but IQ tests expressly designed for that purpose measure it most accurately.
- IQ scores match, to a first degree, whatever it is that people mean when they use the word intelligent or smart in ordinary language.
- IQ scores are stable, although not perfectly so, over much of a person’s life.
- Properly administered IQ tests are not demonstrably biased against social, economic, ethnic, or racial groups.
- Cognitive ability is substantially heritable, apparently no less than 40 percent and no more than 80 percent.
[The rest of the Introduction elaborates on these six points. After The Bell Curve was published, the American Psychological Association created a task force of 11 leading scholars of cognitive ability with the assignment of compiling a unanimous statement of the state of knowledge. The task force’s report, “Intelligence: Knowns and Unknowns” includes “knowns” that correspond to all six of these conclusions.]
The twentieth century dawned on a world segregated into social classes defined in terms of money, power, and status. The ancient lines of separation based on hereditary rank were being erased, replaced by a more complicated set of overlapping lines. Social standing still played a major role, if less often accompanied by a sword or tiara, but so did out-and-out wealth, educational credentials, and, increasingly, talent.
Our thesis is that the twentieth century has continued the transformation, so that the twenty-first will open on a world in which cognitive ability is the decisive dividing force. The shift is more subtle than the previous one but more momentous. Social class remains the vehicle of social life, but intelligence now pulls the train.
Cognitive stratification takes different forms at the top and the bottom of the scale of intelligence. Part II will look at the bottom. In Part I, we look at the top. Its story line is that modern societies identify the brightest youths with ever increasing efficiency and then guide them into fairly narrow educational and occupational channels. These channels are increasingly lucrative and influential, leading to the development of a distinct stratum in the social hierarchy, which we hereby dub the Cognitive Elite. The isolation of the brightest from the rest of society is already extreme; the forces driving it are growing stronger rather than weaker.
Governments can influence these forces but cannot neutralize them.
This does not mean that a member of the cognitive elite never crosses paths with a person with a low IQ, but the encounters that matter tend to be limited. The more intimate or more enduring the human relationship is, the more likely it is to be among people similar in intellectual level. That the brightest are identified has its benefits. That they become so isolated and inbred has its costs. Some of these costs are already visible in American society, while others lie over the horizon.
In the course of the twentieth century, America opened the doors of its colleges wider than any previous generation of Americans, or other society in history, could have imagined possible. This democratization of higher education has raised new barriers between people that may prove to be more divisive and intractable than the old ones.
The growth in the proportion of people getting college degrees is the most obvious result, with a fifteen-fold increase from 1900 to 1990. Even more important, the students going to college were being selected ever more efficiently for their high IQ. The crucial decade was the 1950s, when the percentage of top students who went to college rose by more than it had in the preceding three decades. By the beginning of the 1990s, about 80 percent of all students in the top quartile of ability continued to college after high school. Among the high school graduates in the top few percentiles of cognitive ability, the chances of going to college already exceeded 90 percent.
Perhaps the most important of all the changes was the transformation of America’s elite colleges. As more bright youngsters went off to college, the colleges themselves began to sort themselves out. Starting in the 1950s, a handful of institutions became magnets for the very brightest of each year’s new class. In these schools, the cognitive level of the students rose far above the rest of the college population.
Taken together, these trends have stratified America according to cognitive ability.
People in different jobs have different average IQs. Lawyers, for example, have higher IQs on the average than bus drivers. Whether they must have higher IQs than bus drivers is a topic we take up in detail in the next chapter. Here we start by noting simply that people from different ranges on the IQ scale end up in different jobs.
Whatever the reason for the link between IQ and occupation, it goes deep. If you want to guess an adult male’s job status, the results of his childhood IQ test help you as much as knowing how many years he went to school. IQ becomes more important as the job gets intellectually tougher. To be able to dig a ditch, you need a strong back but not necessarily a high IQ score. To be a master carpenter, you need some higher degree of intelligence along with skill with your hands. To be a first-rate lawyer, you had better come from the upper end of the cognitive ability distribution. The same may be said of a handful of other occupations, such as accountants, engineers and architects, college teachers, dentists and physicians, mathematicians, and scientists. The mean IQ of people entering those fields is in the neighborhood of 120. In 1900, only one out of twenty people in the top 10 percent in intelligence was in any of these occupations, a figure that did not change much through 1940. But after 1940, more and more people with high IQs flowed into those jobs, and by 1990 the same handful of occupations employed about 25 percent of all the people in the top tenth of intelligence.
During the same period, IQ became more important for business executives. In 1900, the CEO of a large company was likely to be a WASP born into affluence. He may have been bright, but that was not mainly how he was chosen. Much was still the same as late as 1950. The next three decades saw a great social leveling, as the executive suites filled with bright people who could maximize corporate profits, and never mind if they came from the wrong side of the tracks or worshipped at a temple instead of a church. Meanwhile, the college degree became a requirement for many business positions, and graduate education went from a rarity to a commonplace among senior executives.
When one combines the people known to be in high IQ professions with estimates of the numbers of business executives who are drawn from the top tenth in cognitive ability, the results do not leave much room for maneuver. The specific proportions are open to argument, but the main point seems beyond dispute: Even as recently as midcentury, America was still a society in which most bright people were scattered throughout the wide range of jobs. As the century draws to a close, a very high proportion of that same group is now concentrated within a few occupations that are highly screened for IQ.
What accounts for the way that people with different levels of IQ end up in different occupations? The fashionable explanation has been education. People with high SAT scores get into the best colleges; people with the high GRE, MCAT, or LSAT test scores get into professional and graduate schools; and the education defines the occupation. The SAT score becomes unimportant once the youngster has gotten into the right college or graduate school.
Without doubt, education is part of the explanation; physicians need a high IQ to get into medical school, but they also need to learn the material that medical school teaches before they can be physicians. Plenty of hollow credentialing goes on as well, if not in medicine then in other occupations, as the educational degree becomes a ticket for jobs that could be done just as well by people without the degree.
But the relationship of cognitive ability to job performance goes beyond that. A smarter employee is, on the average, a more proficient employee. This holds true within professions: Lawyers with higher IQs are, on the average, more productive than lawyers with lower IQs. It holds true for skilled blue-collar jobs: Carpenters with high IQs are also (on average) more productive than carpenters with lower IQs. The relationship holds, although weakly, even among people in unskilled manual jobs.
The magnitude of the relationship between cognitive ability and job performance is greater than once thought. A flood of new analyses during the 1980s established several points with large economic and policy implications: Test scores predict job performance because they measure g, Spearman’s general intelligence factor, not because they identify “aptitude” for a specific job. Any broad test of general intelligence predicts proficiency in most common occupations, and does so more accurately than tests that are narrowly constructed around the job’s specific tasks.
The advantage conferred by IQ is long-lasting. Much remains to be learned, but usually the smarter employee tends to remain more productive than the less smart employee even after years on the job.
An IQ score is a better predictor of job productivity than a job interview, reference checks, or college transcript.
Most sweepingly important, an employer that is free to pick among applicants can realize large economic gains from hiring those with the highest IQs. An economy that lets employers pick applicants with the highest IQs is a significantly more efficient economy. Herein lies the policy problem: Since 1971, Congress and the Supreme Court have effectively forbidden American employers from hiring based on intelligence tests. How much does this policy cost the economy? Calculating the answer is complex, so estimates vary widely, from what one authority thinks was a lower-bound estimate of $80 billion in 1980 to what another authority called an upper-bound estimate of $13 billion for that year.
Our main point has nothing to do with deciding how large the loss is or how large the gain would be if intelligence tests could be freely used for hiring. Rather, it is simply that intelligence itself is importantly related to job performance. Laws can make the economy less efficient by forbidding employers to use intelligence tests, but laws cannot make intelligence unimportant.
Cognitive partitioning through education and occupations will continue, and there is not much that the government or anyone else can do about it. Economics will be the main reason. At the same time that elite colleges and professional schools are turning out brighter and brighter graduates, the value of intelligence in the marketplace is rising. Wages earned by people in high-IQ occupations have pulled away from the wages in low-IQ occupations, and differences in education cannot explain most of this change.
Another force for cognitive partitioning is the increasing physical segregation of the cognitive elite from the rest of society. Members of the cognitive elite work in jobs that usually keep them off the shop floor, away from the construction site, and close to others who also tend to be smart. Computers and electronic communication make it increasingly likely that people who work mainly with their minds collaborate only with other such people. The isolation of the cognitive elite is compounded by its choices of where to live, shop, play, worship, and send its children to school.
Its isolation is intensified by an irony of a mobile and democratic society like America’s. Cognitive ability is a function of both genes and environment, with implications for egalitarian social policies. The more we succeed in giving every youngster a chance to develop his or her latent cognitive ability, the more we equalize the environmental sources of differences in intelligence. The irony is that as America equalizes the circumstances of people’s lives, the remaining differences in intelligence are increasingly determined by differences in genes. Meanwhile, high cognitive ability means, more than ever before, that the chances of success in life are good and getting better all the time. Putting it all together, success and failure in the American economy, and all that goes with it, are increasingly a matter of the genes that people inherit.
Add to this the phenomenon known as assortative mating. Likes attract when it comes to marriage, and intelligence is one of the most important of those likes. When this propensity to mate by IQ is combined with increasingly efficient educational and occupational stratification, assortative mating by IQ has more powerful effects on the next generation than it had on the previous one. This process too seems to be getting stronger, part of the brew creating an American class system.
Whereas Part I dealt with positive outcomes—attainment of high educational levels, prestigious occupations, high incomes—Part II presents our best estimate of how much intelligence has to do with America’s most pressing social problems. The short answer is “quite a lot,” and the reason is that different levels of cognitive ability are associated with different patterns of social behavior. High cognitive ability is generally associated with socially desirable behaviors, low cognitive ability with socially undesirable ones.
“Generally associated with” does not mean “coincident with.” For virtually all of the topics we will be discussing, cognitive ability accounts for only small to middling proportions of the variation among people. It almost always explains less than 20 percent of the variance, to use the statistician’s term, usually less than 10 percent and often less than 5 percent. What this means in English is that you cannot predict what a given person will do from his IQ score—a point that we have made in Part I and will make again, for it needs repeating. On the other hand, despite the low association at the individual level, large differences in social behavior separate groups of people when the groups differ intellectually on the average.
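The arithmetic behind "less than 20 percent of the variance" is worth making explicit. The sketch below is illustrative, not book data: a correlation r between IQ and an outcome accounts for r squared of the variance, yet under a simple linear model the same modest r still translates into a visible gap between the averages of groups that differ in mean IQ.

```python
def variance_explained(r):
    """Share of variance in an outcome accounted for by a correlate
    with correlation coefficient r (the familiar r-squared)."""
    return r ** 2

def expected_outcome_gap(r, iq_gap_in_sds):
    """Expected gap in the average outcome (in standard-deviation units)
    between two groups whose mean IQs differ by iq_gap_in_sds,
    assuming a simple linear relationship."""
    return r * iq_gap_in_sds

# A "middling" correlation of 0.3 explains under 10 percent of the variance,
# so individual prediction from IQ alone is weak...
print(f"variance explained at r=0.3: {variance_explained(0.3):.0%}")

# ...but two groups a full standard deviation apart in mean IQ still differ
# by 0.3 SD in their average outcome, a substantial gap at the group level.
print(f"expected group gap: {expected_outcome_gap(0.3, 1.0):.2f} SD")
```

This is the statistical sense in which a variable can be nearly useless for sizing up one person yet consequential when comparing groups.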
We will argue that intelligence itself, not just its correlation with socioeconomic status, is responsible for these group differences. Our thesis appears to be radical, judging from its neglect by other social scientists. Could low intelligence possibly be a cause of irresponsible childbearing and parenting behaviors, for example? Scholars of childbearing and parenting do not seem to think so. The 850 double-column pages of the authoritative Handbook of Marriage and the Family, for example, allude to intelligence about half a dozen times, always in passing. Could low intelligence possibly be a cause of unemployment or poverty? Only a scattering of economists have broached the subject.
This neglect points to a gaping hole in the state of knowledge about social behavior. It is not that cognitive ability has been considered and found inconsequential but that it has barely been considered at all. The chapters in Part II add cognitive ability to the mix of variables that social scientists have traditionally used, clearing away some of the mystery that has surrounded the nation’s most serious social problems.
We will also argue that cognitive ability is an important factor in thinking about the nature of the present problems, whether or not cognitive ability is a cause. For example, if many of the single women who have babies also have low IQ, it makes no difference (in one sense) whether the low IQ caused them to have the babies or whether the path of causation takes a more winding route. The reality that less intelligent women have most of the out-of-wedlock babies affects and constrains public policy, whatever the path of causation. The simple correlation, unadjusted for other factors—what social scientists call the zero-order correlation—between cognitive ability and social behaviors is socially important. [The introduction to Part II continues for several more pages, describing the quantitative analyses that make up its core. The main thing to keep in mind as you read the chapter summaries is that the quantitative analyses are based on a sample composed exclusively of non-Latino whites.]
Who becomes poor? One familiar answer is that people who are unlucky enough to be born to poor parents become poor. There is some truth to this. Whites, the focus of our analyses in the chapters of Part II, who grew up in the worst 5 percent of socioeconomic circumstances are eight times more likely to fall below the poverty line than those growing up in the top 5 percent of socioeconomic circumstances. But low intelligence is a stronger precursor of poverty than low socioeconomic background. Whites with IQs in the bottom 5 percent of the distribution of cognitive ability are fifteen times more likely to be poor than those with IQs in the top 5 percent.
How does each of these causes of poverty look when the other is held constant? Or to put it another way: If you have to choose, is it better to be born smart or rich? The answer is unequivocally “smart.” A white youth reared in a home in which the parent or parents were chronically unemployed, worked at only the most menial of jobs, and had not gotten past ninth grade, but of just average intelligence—an IQ of 100—has nearly a 90 percent chance of being out of poverty by his or her early 30s. Conversely, a white youth born to a solid middle-class family but with an IQ equivalently below average faces a much higher risk of poverty, despite his more fortunate background.
When the picture is complicated by adding the effects of sex, marital status, and years of education, intelligence remains more important than any of them, with marital status running a close second. Among people who are both smart and well educated, the risk of poverty approaches zero. But it should also be noted that young white adults who marry are seldom in poverty, even if they are below average in intelligence or education. Even in these more complicated analyses, low IQ continues to be a much stronger precursor of poverty than the socioeconomic circumstances in which people grow up.
Leaving school before getting a high school diploma in the old days was usually not a sign of failure. The youngster had not dropped out but simply moved on. As late as 1940, fewer than half of 18-year-olds got a high school diploma. But in the postwar era, the high school diploma became the norm. Now, not having one is a social disability of some gravity.
The usual picture of high school dropouts focuses on their socioeconomic circumstances. It is true that most of them are from poor families, but the relationship of socioeconomics to school dropout is not simple. Among whites, almost no one with an IQ in the top quarter of the distribution fails to get a high school education, no matter how poor their families. Dropout is extremely rare throughout the upper half of the IQ distribution. Socioeconomic background has its most powerful effect at the lowest end of the social spectrum, among students who are already below average in intelligence. Being poor has a small effect on dropping out of school independent of IQ; it has a sizable independent effect on whether a person finishes school with a regular diploma or a high school equivalency certificate.
To raise the chances of getting a college degree, it helps to be in the upper half of the distribution for either IQ or socioeconomic status. But the advantage of a high IQ outweighs that of high status. Similarly, the disadvantage of a low IQ outweighs that of low status. Youngsters from poor backgrounds with high IQs are likely to get through college these days, but those with low IQs, even if they come from well-to-do backgrounds, are not.
Economists distinguish between being unemployed and being out of the labor force. The unemployed are looking for work unsuccessfully. Those out of the labor force are not looking, at least for the time being. Among young white men in their late 20s and early 30s, both unemployment and being out of the labor force are strongly predicted by low cognitive ability, even after taking other factors into account.
Many of the white males in the NLSY [National Longitudinal Survey of Youth, the database used for the analyses in Part II] who were out of the labor force had the obvious excuse: They were still in college or graduate school. Of those not in school, 15 percent spent at least a month out of the labor force in 1989. The proportion was more than twice as high in cognitive Class V as in Class I. [Class V is people with IQs no higher than 75. Class I is people with IQs of at least 125.] Socioeconomic background was not the explanation. After the effects of IQ were taken into account, the probability of spending time out of the labor force went up, not down, as parental SES rose.
Why are young men out of the labor force? One obvious possibility is physical disability. Yet here too cognitive ability is a strong predictor: Of the men who described themselves as being too disabled to work, more than nine out of ten were in the bottom quarter of the IQ distribution; fewer than one in twenty were in the top quarter. A man’s IQ predicted whether he described himself as disabled better than the kinds of job he had held.
We do not know why intelligence and physical problems are so closely related, but one possibility is that less intelligent people are more accident prone.
The results are similar for unemployment. Among young white men who were in the labor market, the likelihood of unemployment for high school graduates and college graduates was equally dependent on cognitive ability. Socioeconomic background was irrelevant once intelligence was taken into account.
Most men, whatever their intelligence, are working steadily. However, for that minority of men who are either out of the labor force or unemployed, the primary risk factor seems to be neither socioeconomic background nor education but low cognitive ability.
Rumors of the death of the traditional family have much truth in them for some parts of white American society—those with low cognitive ability and little education—and much less truth for the college educated and very bright Americans of all educational levels. In this instance, cognitive ability and education appear to play mutually reinforcing but also independent roles.
For marriage, the general rule is that the more intelligent get married at higher rates than the less intelligent. This relationship, which applies across the range of intelligence, is obscured among people with high levels of education because college and graduate school are powerful delayers of marriage.
Divorce has long been more prevalent in the lower socioeconomic and educational brackets, but this turns out to be explained better by cognitive level than by social status. Once the marriage-breaking impact of low intelligence is taken into account, people of higher socioeconomic status are more likely to get divorced than people of lower status.
Illegitimacy, one of the central social problems of the times, is strongly related to intelligence. White women in the bottom 5 percent of the cognitive ability distribution are six times as likely to have an illegitimate first child as those in the top 5 percent. One out of five of the legitimate first babies of women in the bottom 5 percent was conceived prior to marriage, compared to fewer than one out of twenty of the legitimate babies to women in the top 5 percent. Even among young women who have grown up in broken homes and among young women who are poor—both of which foster illegitimacy—low cognitive ability further raises the odds of giving birth illegitimately. Low cognitive ability is a much stronger predisposing factor for illegitimacy than low socioeconomic background.
At lower educational levels, a woman's intelligence best predicts whether she will bear an illegitimate child. Toward the higher reaches of education, almost no white women are having illegitimate children, whatever their family background or intelligence.
People have had reason to assume for many years that welfare mothers are concentrated at the low end of the cognitive ability distribution, if only because they have generally done poorly in school. Beyond that, it makes sense that smarter women can more easily find jobs and resist the temptations of welfare dependency than duller ones, even if they have given birth out of wedlock.
The link is confirmed in the NLSY. Over three-quarters of the white women who were on welfare within a year of the birth of their first child came from the bottom quartile of IQ, compared to 5 percent from the top quartile. When we subdivide welfare recipients into two groups, “temporary” and “chronic,” the link persists, though differently for the two groups.
Among women who received welfare temporarily, low IQ is a powerful risk factor even after the effects of marital status, poverty, age, and socioeconomic background are statistically extracted. For chronic welfare recipiency, the story is more complicated. For practical purposes, white women with above-average cognitive ability or above-average socioeconomic background do not become chronic welfare recipients. Among the restricted sample of low-IQ, low-SES, and relatively uneducated white women who are chronically on welfare, low socioeconomic background is a more powerful predictor than low IQ, even after taking account of whether they were themselves below the poverty line at the time they had their babies.
The analyses provide some support for those who argue that a culture of poverty tends to transmit chronic welfare dependency from one generation to the next. But if a culture of poverty is at work, it seems to have influence primarily among women who are of low intelligence.
Everyone agrees, in the abstract and at the extremes, that there is good parenting and poor parenting. This chapter addresses the uncomfortable question: Is the competence of parents at all affected by how intelligent they are?
It has been known for some time that socioeconomic class and parenting are linked, both to disciplinary practices and to the many ways in which the intellectual and emotional development of the child are fostered. On both counts, parents with higher socioeconomic status look better. At the other end of the parenting continuum, neglect and abuse are heavily concentrated in the lower socioeconomic classes.
Whenever an IQ measure has been introduced into studies of parent-child relationships, it has explained away much of the differences that otherwise would have been attributed to education or social class, but the examples are sparse. The NLSY provides an opportunity to fill in a few of the gaps.
With regard to prenatal and infant care, low IQ among the white mothers in the NLSY sample was related to low birth weight, even after controlling for socioeconomic background, poverty, and age of the mother. In the NLSY's surveys of the home environment, mothers in the top cognitive classes provided, on average, better environments for children than the mothers in the bottom cognitive classes. Socioeconomic background and current poverty also played significant roles, depending on the specific type of measure and the age of the children.
In the NLSY's measures of child development, low maternal IQ was associated with problematic temperament in the baby and with low scores on an index of "friendliness," with poor motor and social development of toddlers and with behavioral problems from age 4 on up. Poverty usually had a modest independent role but did not usually diminish the contribution of IQ (which was usually also modest). Predictably, the mother's IQ was also strongly related to the IQ of the child. Taking these data together, the NLSY results say clearly that high IQ is by no means a prerequisite for being a good mother. The disquieting finding is that the worst environments for raising children, of the kind that not even the most resilient children can easily overcome, are concentrated in the homes in which the mothers are at the low end of the intelligence distribution.
Among the most firmly established facts about criminal offenders is that their distribution of IQ scores differs from that of the population at large. Taking the scientific literature as a whole, criminal offenders have average IQs of about 92, eight points below the mean. More serious or chronic offenders generally have lower scores than more casual offenders. The relationship of IQ to criminality is especially pronounced in the small fraction of the population, primarily young men, who constitute the chronic criminals that account for a disproportionate amount of crime. Offenders who have been caught do not score much lower, if at all, than those who are getting away with their crimes. Holding socioeconomic status constant does little to explain away the relationship between crime and cognitive ability.
High intelligence also provides some protection against lapsing into criminality for people who otherwise are at risk. Those who have grown up in turbulent homes, have parents who were themselves criminal, or who have exhibited the childhood traits that presage crime are less likely to become criminals as adults if they have high IQ.
These findings from an extensive research literature are supported by the evidence from white males in the NLSY. Low IQ was a risk factor for criminal behavior, whether criminality was measured by incarceration or by self-acknowledged crimes. The socioeconomic background of the NLSY's white males was a negligible risk factor once their cognitive ability was taken into account.
A free society demands a citizenry that willingly participates in the civic enterprise, in matters as grand as national elections and as commonplace as neighborliness. Lacking this quality—civility, in its core meaning—a society must replace freedom with coercion if it is to maintain order. This chapter examines the contribution of cognitive ability to the capacity for civility and citizenship.
Most manifestations of civility are too fleeting to be measured and studied. One realm of activity that does leave measurable traces is political involvement, which includes both participation in political activities and some knowledge and sophistication about them.
For assessing any relationship between political involvement and IQ, the best data, surprisingly, are from studies of children, and the results are consistent: Brighter children of all socioeconomic classes, including the poorest, learn more rapidly about politics and how government works, and are more likely than duller children to read about, discuss, and participate in political activities. The gap between brighter and duller children in political development widens with age, unlike the static gap across socioeconomic classes.
For adults, the standard theory of political involvement for many years has assumed that socioeconomic status is the vital link. People at higher-status levels vote more, and they know and care more about political matters than do people at lower levels of status. But the available research offers ample evidence that the key element for predicting political involvement is educational level. The people who vote least and who care the least about political issues are not so much the poor as the uneducated, whatever their income or occupation. Why does education matter so much? The fragmentary studies available indicate that education predicts political involvement in America because it is primarily a proxy for cognitive ability.
The NLSY does not have the data for pursuing this manifestation of civility, but it permits us to explore another aspect of it: To what extent is high intelligence associated with the behaviors that make up "middle-class values"? The answer is that the brighter young people of the NLSY are also the ones whose lives most resemble a sometimes disdained stereotype: They stick with school, are plugging away in the workforce, and are loyal to their spouse. Insofar as intelligence helps lead people to behave in these ways, it is also a force for maintaining a civil society.
Part II was circumscribed, taking on social behaviors one at a time, focusing on causal roles, with the analysis restricted to whites wherever the data permitted. We now turn to the national scene. This means considering all races and ethnic groups, which leads to the most controversial issues we will discuss: ethnic differences in cognitive ability and social behavior, the effects of fertility patterns on the distribution of intelligence, and the overall relationship of low cognitive ability to what has become known as the underclass. As we begin, perhaps a pact is appropriate. The facts about these topics are not only controversial but exceedingly complex. For our part, we will undertake to confront all the tough questions squarely. We ask that you read carefully.
Despite the forbidding air that envelops the topic, ethnic differences in cognitive ability are neither surprising nor in doubt. Large human populations differ in many ways, both cultural and biological. It is not surprising that they might differ at least slightly in their cognitive characteristics. That they do is confirmed by the data on ethnic differences in cognitive ability from around the world. One message of this chapter is that such differences are real and have consequences. Another is that the facts are not as alarming as many people seem to fear.
East Asians (e.g., Chinese, Japanese), whether in America or in Asia, typically earn higher scores on intelligence and achievement tests than white Americans. The precise size of their advantage is unclear; estimates range from just a few to ten points. A more certain difference between the races is that East Asians have higher nonverbal intelligence than whites while being equal, or perhaps slightly lower, in verbal intelligence.
The difference in test scores between African-Americans and European-Americans as measured in dozens of reputable studies has converged on approximately a one standard deviation difference for several decades.
Translated into centiles, this means that the average white person tests higher than about 84 percent of the population of blacks and that the average black person tests higher than about 16 percent of the population of whites.
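[The centile figures above follow directly from the shape of the normal curve, which gives the book its title: if two distributions have equal spread and their means sit one standard deviation apart, the higher mean falls at z = +1 in the lower distribution, which is roughly the 84th percentile, and the lower mean falls at z = -1 in the higher, roughly the 16th. A minimal sketch of that arithmetic, using only the Python standard library (the function name `normal_cdf` is ours, for illustration):]

```python
from math import erf, sqrt

def normal_cdf(z: float) -> float:
    """Cumulative distribution function of the standard normal."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

# A one-standard-deviation gap between group means puts the higher
# group's mean at z = +1 within the lower group's distribution,
# and the lower group's mean at z = -1 within the higher group's.
gap_in_sd = 1.0
print(round(100 * normal_cdf(gap_in_sd)))   # 84th percentile
print(round(100 * normal_cdf(-gap_in_sd)))  # 16th percentile
```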
The average black and white differ in IQ at every level of socioeconomic status (SES), but they differ more at high levels of SES than at low levels. Attempts to explain the difference in terms of test bias have failed. The tests have approximately equal predictive force for whites and blacks. In the past few decades, the gap between blacks and whites narrowed by perhaps three IQ points. The narrowing appears to have been mainly caused by a shrinking number of very low scores in the black population rather than an increasing number of high scores. Improvements in the economic circumstances of blacks, in the quality of the schools they attend, and in public health, and perhaps also diminishing racism, may be narrowing the gap.
The debate about whether and how much genes and environment have to do with ethnic differences remains unresolved. The universality of the contrast in nonverbal and verbal skills between East Asians and European whites suggests, without quite proving, genetic roots. Another line of evidence pointing toward a genetic factor in cognitive ethnic differences is that blacks and whites differ most on the tests that are the best measures of g, or general intelligence. On the other hand, the scores on even highly g-loaded tests can be influenced to some extent by changing environmental factors over the course of a decade or less. Beyond that, some social scientists have challenged the premise that intelligence tests have the same meaning for people who live in different cultural settings or whose forebears had very different histories.
Nothing seems more fearsome to many commentators than the possibility that ethnic and race differences have any genetic component at all. This belief is a fundamental error. Even if the differences between races were entirely genetic (which they surely are not), it should make no practical difference in how individuals deal with each other. The real danger is that the elite wisdom on ethnic differences—that such differences cannot exist—will shift to opposite and equally unjustified extremes. Open and informed discussion is the one certain way to protect society from the dangers of one extreme view or the other.
Ethnic differences in education, occupations, poverty, unemployment, illegitimacy, crime, and other signs of inequality preoccupy scholars and thoughtful citizens. In this chapter, we examine these differences after cognitive ability is taken into account.
We find that Latinos and whites of similar cognitive ability have similar social behavior and economic outcomes. Some differences remain, and a few are substantial, but the overall pattern is similarity. For blacks and whites, the story is more complicated. On two vital indicators of success—educational attainment and entry into prestigious occupations—the black-white discrepancy reverses. After controlling for IQ, larger numbers of blacks than whites graduate from college and enter the professions. On a third important indicator of success, wages, the black-white difference for year-round workers shrinks from several thousand to a few hundred dollars.
In contrast, the B/W gap in annual family income or in persons below the poverty line narrows after controlling for IQ but still remains sizable. Similarly, differences in unemployment, labor force participation, marriage, and illegitimacy get smaller but remain significant after extracting the effect of IQ. These inequalities must be explained by other factors in American life. Scholars have advanced many such explanations; we will not try to adjudicate among them here, except to suggest that in trying to understand the cultural, social, and economic sources of these differences, understanding how cognitive ability plays into the mix of factors seems indispensable. The role of cognitive ability has seldom been considered in the past. Doing so in future research could clarify issues and focus attention on the factors that are actually producing the more troubling inequalities.
When people die, they are not replaced one for one by babies who will develop identical IQs. If the new babies grow up to have systematically higher or lower IQs than the people who die, the national distribution of intelligence changes. Mounting evidence indicates that demographic trends are exerting downward pressure on the distribution of cognitive ability in the United States and that the pressures are strong enough to have social consequences.
Throughout the West, modernization has brought falling birth rates. The rates fall faster for educated women than the uneducated. Because education is so closely linked with cognitive ability, this tends to produce a dysgenic effect, or a downward shift in the ability distribution. Furthermore, education leads women to have their babies later—which alone also produces additional dysgenic pressures.
The professional consensus is that the United States has experienced dysgenic pressures throughout either most of the century (the optimists) or all of the century (the pessimists). Women of all races and ethnic groups follow this pattern in similar fashion. There is some evidence that blacks and Latinos are experiencing even more severe dysgenic pressures than whites, which could lead to further divergence between whites and other groups in future generations. The rules that currently govern immigration provide the other major source of dysgenic pressure. It appears that the mean IQ of immigrants in the 1980s works out to about 95. The low IQ may not be a problem; in the past, immigrants have sometimes shown large increases on such measures. But other evidence indicates that the self-selection process that used to attract the classic American immigrant—brave, hardworking, imaginative, self-starting, and often of high IQ—has been changing, and with it the nature of some of the immigrant population.
Putting the pieces together, something worth worrying about is happening to the cognitive capital of the country. Improved health, education, and childhood interventions may hide the demographic effects, but that does not reduce their importance. Whatever good things we can accomplish with changes in the environment would be that much more effective if they did not have to fight a demographic head wind.
In this chapter, the question is not whether low cognitive ability causes social problems but the prevalence of low cognitive ability among people who have those problems. It is an important distinction. Causal relationships are complex and hard to establish definitively. The measure of prevalence is more straightforward. For most of the worst social problems of our time, the people who have the problem are heavily concentrated in the lower portion of the cognitive ability distribution. Any practical solution must therefore be capable of succeeding with such people.
Our analysis provides few clear and decisive solutions to the major domestic issues of the day. But, at the same time, there is no major domestic issue for which the news we bring is irrelevant.
Do we want to persuade poor single teenagers not to have babies? The knowledge that 95 percent of poor teenage women who have babies are also below average in intelligence should prompt skepticism about strategies that rely on abstract and far-sighted calculations of self-interest. Do we favor job training programs for chronically unemployed men? Any program is going to fail unless it is designed for a target population half of which has IQs below 80. Do we wish to reduce income inequality? If so, we need to understand how the market for cognitive ability drives the process. Do we aspire to a “world class” educational system for America? Before deciding what is wrong with the current system, we had better think hard about how cognitive ability and education are linked. Part IV tries to lay out some of these connections.
Raising intelligence significantly, consistently, and affordably would circumvent many of the problems that we have described. Furthermore, the needed environmental improvements—better nutrition, stimulating environments for preschool children, good schools thereafter—seem obvious. But raising intelligence is not easy.
Nutrition may offer one of the more promising approaches. Height and weight have increased markedly with better nutrition. The rising IQs in many countries suggest that better nutrition may be increasing intelligence too. Controlled studies have made some progress in uncovering a link between improved nutrition and elevated cognitive ability as well, but it remains unproved and not well understood.
Formal schooling offers little hope of narrowing cognitive inequality on a large scale in developed countries, because so much of its potential contribution has already been realized with the advent of universal twelve-year systems. Special programs to improve intelligence within the school have had minor and probably temporary effects on intelligence. There is more to be gained from educational research to find new methods of instruction than from more interventions of the type already tried.
Preschool has borne many of the recent hopes for improving intelligence. However, Head Start, the largest program, does not improve cognitive functioning. More intensive, hence more costly, preschool programs may raise intelligence, but both the size and the reality of the improvements are in dispute. The one intervention that works consistently is adoption at birth from a bad family environment to a good one. The average gains in childhood IQ associated with adoption are in the region of six points—not spectacular but not negligible either.
Taken together, the story of attempts to raise intelligence is one of high hopes, flamboyant claims, and disappointing results. For the foreseeable future, the problems of low cognitive ability are not going to be solved by outside interventions to make children smarter.
Most people think that American public education is in terrible shape, and any number of allegations seem to confirm it. But a search of the data does not reveal that the typical American school child in the past would have done any better on tests of academic skills. An American youth with average IQ is probably better prepared academically now than ever before. The problem with American education is confined mainly to one group of students, the cognitively gifted. Among the most gifted students, SAT scores started falling in the mid-1960s, and the verbal scores have not recovered since.
One reason is that disadvantaged students have been “in” and gifted students “out” for thirty years. Even in the 1990s, only one-tenth of one percent of all the federal funds spent on elementary and secondary education go to programs for the gifted. Because success was measured in terms of how well the average and below-average children performed, American education was dumbed down: Textbooks were made easier, and requirements for courses, homework, and graduation were relaxed. These measures may have worked as intended for the average and below-average students, but they let the gifted get away without ever developing their potential.
In thinking about policy, the first step is to realize where we are. In a universal education system, many students will fall short of basic academic competence. Most American parents say they are already satisfied with their local school. The average student has little incentive to work hard in high school. Getting into most colleges is easy, and achievement in high school does not pay off in higher wages or better jobs for those who do not go to college. On a brighter note, realism also leads one to expect that modest improvements in the education of average students will continue as they have throughout the century except for the aberrational period from the mid-1960s to mid-1970s.
In trying to build on this natural improvement, the federal government should support greater flexibility for parents to send their children to schools of their choosing, whether through vouchers, tax credits, or choice within the public schools. Federal scholarships should reward academic performance. Some federal funds now so exclusively focused on the disadvantaged should be reallocated to programs for the gifted.
We urge primarily not a set of new laws but a change of heart within the ranks of educators. Until the latter half of this century, it was taken for granted that one of the chief purposes of education was to educate the gifted—not because they deserved it through their own merit but because, for better or worse, the future of society was so dependent on them. It was further understood that this education must aim for more than technical facility. It must be an education that fosters wisdom and virtue through the ideal of the “educated man.” Little will change until educators once again embrace this aspect of their vocation.
Affirmative action on the campus needs, at last, to be discussed as it is actually practiced, not as the rhetoric portrays it. Our own efforts to assemble data on a secretive process lead us to conclude that affirmative action as it is practiced cannot survive public scrutiny.
The edge given to minority applicants to college and graduate school is not a nod in their favor in the case of a close call but an extremely large advantage that puts black and Latino candidates in a separate admissions competition. On elite campuses, the average black freshman is in the region of the 10th to 15th percentile of the distribution of cognitive ability among white freshmen. Nationwide, the gap seems to be at least that large, perhaps larger. The gap does not diminish in graduate school. If anything, it may be larger.
In the world of college admissions, Asians are a conspicuously unprotected minority. At the elite schools, they suffer a modest penalty, with the average Asian freshman being at about the 60th percentile of the white cognitive ability distribution. Our data from state universities are too sparse to draw conclusions. In all the available cases, the difference between white and Asian distributions is small (either plus or minus) compared to the large differences separating blacks and Latinos from whites.
The edge given to minority candidates could be more easily defended if the competition were between disadvantaged minority youths and privileged white youths. But nearly as large a cognitive difference separates disadvantaged black freshmen from disadvantaged white freshmen. Still more difficult to defend, blacks from affluent socioeconomic backgrounds are given a substantial edge over disadvantaged whites.
There is no question that affirmative action has “worked” in the sense that it has put more blacks and Latinos on college campuses than would otherwise have been there. But this success must be measured against costs.
When students look around them, they see that blacks and Latinos constitute small proportions of the student population but high proportions of the students doing poorly in school. The psychological consequences of this disparity may be part of the explanation for the increasing racial animosity and the high black dropout rates that have troubled American campuses. In society at large, a college degree does not have the same meaning for a minority graduate and a white one, with consequences that reverberate in the workplace and continue throughout life.
It is time to return to the original intentions of affirmative action: to cast a wider net, to give preference to members of disadvantaged groups, whatever their skin color, when qualifications are similar. Such a change would accord more closely with the logic underlying affirmative action, with the needs of today’s students of all ethnic groups, and with progress toward a healthy multiracial society.
Employers want to hire the best workers; employment tests are one of the best and cheapest selection tools at their disposal. Since affirmative action began in the early 1960s, and especially since a landmark decision by the Supreme Court in 1971, employers have been tightly constrained in the use they may make of tests. The most common solution is for employers to use them but to hire enough protected minorities to protect themselves from prosecution and lawsuits under the job discrimination rules.
The rules that constrain employers were developed by Congress and the Supreme Court based on the assumptions that tests of general cognitive ability are not a good way of picking employees, that the best tests are ones that measure specific job skills, that tests are biased against blacks and other minorities, and that all groups have equal distributions of cognitive ability. These assumptions are empirically incorrect. Paradoxically, job hiring and promotion procedures that are truly fair and unbiased will produce the racial disparities that public policy tries to prevent.
Have the job discrimination regulations worked? The scholarly consensus is that they had some impact, on some kinds of jobs, in some settings, during the 1960s and into the 1970s, but have not had the decisive impact that is commonly asserted in political rhetoric. It also appears, however, that since the early 1960s blacks have been overrepresented in white collar and professional occupations relative to the number of candidates in the IQ range from which these jobs are usually filled, suggesting that the effects of affirmative action policy may be greater than usually thought.
The successes of affirmative action have been much more extensively studied than the costs. One of the most understudied areas of this topic is job performance. The scattered data suggest that aggressive affirmative action does produce large racial discrepancies in job performance in a given workplace. It is time that this important area be explored systematically.
In coming to grips with policy, a few hard truths have to be accepted. First, there are no good ways to implement current job discrimination law without incurring costs in economic efficiency and fairness to both employers and employees. Second, after controlling for IQ, it is hard to demonstrate that the United States still suffers from a major problem of racial discrimination in occupations and pay.
As we did for affirmative action in higher education, we present the case for returning to the original conception of affirmative action. This means scrapping the existing edifice of job discrimination law. We think the benefits to productivity and to fairness of ending the antidiscrimination laws are substantial. But our larger reason is that this nation does not have the option of ethnic balkanization.
In this penultimate chapter we speculate about the impact of cognitive stratification on American life and government. Predicting the course of society is chancy, but certain tendencies seem strong enough to worry about:
- An increasingly isolated cognitive elite.
- A merging of the cognitive elite with the affluent.
- A deteriorating quality of life for people at the bottom end of the cognitive ability distribution.
Unchecked, these trends will lead the U.S. toward something resembling a caste society, with the underclass mired ever more firmly at the bottom and the cognitive elite ever more firmly anchored at the top, restructuring the rules of society so that it becomes harder and harder for them to lose. Among the other casualties of this process would be American civil society as we have known it. Like other apocalyptic visions, this one is pessimistic, perhaps too much so. On the other hand, there is much to be pessimistic about.
Hundreds of pages ago, in the Preface, we reflected on the question that we have been asked so often, “What good can come from writing this book?” We have tried to answer it in many ways.
Our first answer has been implicit, scattered in material throughout the book. For thirty years, vast changes in American life have been instituted by the federal government to deal with social problems. We have tried to point out what a small segment of the population accounts for such a large proportion of those problems. To the extent that the problems of this small segment are susceptible to social-engineering solutions at all, they should be highly targeted. The vast majority of Americans can run their own lives just fine, and policy should above all be constructed so that it permits them to do so.
Our second answer, also implicit, has been that just about any policy in any area—education, employment, welfare, criminal justice, or the care of children—can profit if its designers ask how the policy accords with the wide variation in cognitive ability. Policies may fail not because they are inherently flawed but because they do not make allowances for how much people vary. There are hundreds of ways to frame bits and pieces of public policy so that they are based on a realistic appraisal of the responses they will get not from people who think like Rhodes scholars but from people who think in simpler ways.
Our third answer has gone to specific issues in raising the cognitive functioning of the disadvantaged (Chapter 17) and in improving education for all (Chapter 18). Part of our answer has been cautionary: Much of public policy toward the disadvantaged starts from the premise that interventions can make up for genetic or environmental disadvantages, and that premise is overly optimistic. Part of our answer has been positive: Much can and should be done to improve education, especially for those who have the greatest potential.
Our fourth answer has been that group differences in cognitive ability, so desperately denied for so long, can best be handled—can only be handled—by a return to individualism. A person should not be judged as a member of a group but as an individual. With that cornerstone of the American doctrine once again in place, group differences can take their appropriately insignificant place in affecting American life. But until that cornerstone is once again in place, the anger, the hurt, and the animosities will continue to grow.
In this closing chapter, we have focused on another aspect of what makes America special. This most individualistic of nations contains one of the friendliest, most eager to oblige, neighborly peoples in all the world.
Visitors to America from Tocqueville on down have observed it. As a by-product of this generosity and civic mindedness, America has had a genius for making valued places, for people of all kinds of abilities, given only that they played by a few basic rules.
Once we as a nation absorbed people of different cultures, abilities, incomes, and temperaments into communities that worked. The nation was good at it precisely because of, not in spite of, the freedom that American individuals and communities enjoyed. Have there been exceptions to that generalization? Yes, predominantly involving race, and the nation rightly moved to rid itself of the enforced discrimination that lay behind those exceptions. Is the generalization nonetheless justified? Overwhelmingly so, in our judgment. Reducing that freedom has enervated our national genius for finding valued places for everyone; the genius will not be revitalized until the freedom is restored.
Cognitive partitioning will continue. It cannot be stopped, because the forces driving it cannot be stopped. But America can choose to preserve a society in which every citizen has access to the central satisfactions of life.
Its people can, through an interweaving of choice and responsibility, create valued places for themselves in their worlds. They can live in communities—urban or rural—where being a good parent, a good neighbor, and a good friend will give their lives purpose and meaning. They can weave the most crucial safety nets together, so that their mistakes and misfortunes are mitigated and withstood with a little help from their friends.
All of these good things are available now to those who are smart enough or rich enough—if they can exploit the complex rules to their advantage, buy their way out of the social institutions that no longer function, and have access to the rich human interconnections that are growing, not diminishing, for the cognitively fortunate. We are calling upon our readers, so heavily concentrated among those who fit that description, to recognize the ways in which public policy has come to deny those good things to those who are not smart enough and rich enough.
At the heart of our thought is the quest for human dignity. The central measure of success for this government, as for any other, is to permit people to live lives of dignity—not to give them dignity, for that is not in any government’s power, but to make it accessible to all. That is one way of thinking about what the Founders had in mind when they proclaimed, as a truth self-evident, that all men are created equal. That is what we have in mind when we talk about valued places for everyone.
Inequality of endowments, including intelligence, is a reality. Trying to pretend that inequality does not really exist has led to disaster. Trying to eradicate inequality with artificially manufactured outcomes has led to disaster. It is time for America once again to try living with inequality, as life is lived: understanding that each human being has strengths and weaknesses, qualities we admire and qualities we do not admire, competencies and incompetencies, assets and debits; that the success of each human life is not measured externally but internally; that of all the rewards we can confer on each other, the most precious is a place as a valued fellow citizen.
From THE BELL CURVE by Richard J. Herrnstein and Charles Murray. Copyright © 1994 by Richard J. Herrnstein and Charles Murray. Reprinted by permission of Free Press, an imprint of Simon & Schuster, Inc. All rights reserved.