AEI Bradley Lecture Series
“The past is a foreign country,” it has been said. But it is not an unfamiliar country. One does not need a Victorian grandmother, like Margaret Thatcher’s, to be reminded of “Victorian values.” One does not even have to be English; “Victorian America,” as it has been called, was not all that different, at least in terms of “values,” from Victorian England. And vestiges of those values remain, in memory if not reality.
When Mrs. Thatcher, during her election campaign in 1983, first raised the issue of “Victorian values,” she said that she was grateful to have been brought up by a Victorian grandmother who taught her those values: hard work, self-reliance, self-respect, cleanliness, neighborliness, pride in country. “All of these things,” she said, “are Victorian values. They are also perennial values.”
Well, not quite. Lady Thatcher’s grandmother would not have spoken of them as “values”; she would have spoken of them as “virtues.” Moreover they were not, as it happened, “perennial” virtues. Certainly they were not the virtues of the classical philosophers. The cardinal virtues celebrated by Plato–wisdom, justice, temperance, courage–do not appear in the litany of Lady Thatcher’s grandmother. Nor were her virtues Aristotle’s (although some of them might be subsumed under his categories). “Family values” (an expression Margaret Thatcher also used) do not figure among the classical virtues. Plato, of course, would have utterly rejected them, as he rejected the very idea of the family. And even Aristotle, who gave the family the distinction of being “the first community,” did not go so far as to elevate what we would regard as family values to the rank of virtues (except, perhaps, household management, which was largely a matter of finances and property).
This is not to say that the Victorians would have spurned any of the classical virtues. On the contrary, they would have approved of them. If they did not assign to some of them (courage, perhaps, or such lesser virtues as munificence or magnanimity) a high priority, it was because they would not have thought them the most essential virtues for most people in their own times. They may even have thought them more appropriate to a heroic, aristocratic age than to a bourgeois, democratic one.
Nor were the Victorian virtues the Christian ones–faith, hope, and charity (the latter in its original meaning of the love of God)–although, again, the Victorians would not have belittled these virtues. The Victorian virtues were more domesticated than the classical ones and more secular than the Christian ones. (Not entirely secular, however, as witness the familiar terms used to describe them: the “Puritan” or “Judaic-Christian” ethic.) But whatever their lineage, those virtues were deemed essential, not only for the good life of individuals but for the well-being of society.
And they were “virtues,” not “values” that the Victorians cherished. It was not until the present century that morality became so thoroughly relativized and subjectified that virtues ceased to be “virtues” and became “values.”
This transmutation is the great philosophical revolution of our time, comparable to the late-seventeenth century revolt of the “Moderns” against the “Ancients”–modern science and learning against classical philosophy. Yet unlike the earlier rebels, who were fully conscious of the import of their rebellion, the later ones (with the notable exception of Nietzsche) seemed almost unaware of what they were doing. There was no “Battle of the Books” to sound the alarm and rally the troops. Even the new vocabulary–“values” in place of “virtues”–which was so radical a departure from the old, and which in itself constituted a revolution in thought, passed without notice.
This is all the more curious because the inspirer of the revolution and the creator of the new language was acutely aware of the significance of it all. It was in the 1880s that Nietzsche began to speak of “values” in its present sense, connoting the moral beliefs and attitudes of a society. He used that word consciously and repeatedly, to signify what he took to be the most momentous fact in human history. His “transvaluation of values” was to be the final, ultimate revolution, a revolution against both the classical virtues and the Judaic-Christian ones–indeed, against the very idea of virtue, of a transcendent morality.
When early in the twentieth century, shortly after Nietzsche’s death, the sociologist Max Weber borrowed the word “values,” he had no such nihilistic intentions, which is perhaps why he did not comment on the novelty of the term, still less attribute it to Nietzsche (although we know that he read Nietzsche and was much influenced by him). Instead he used the word matter-of-factly, as if it were part of the accepted vocabulary and of no great moment. Perhaps for that reason, because it seemed so familiar and unthreatening, it was all the more effective, for it was absorbed, gradually and unconsciously, into the ethos of modern society, as it was absorbed into the vocabulary.
“Values” brings with it the assumptions that all moral ideas are subjective and relative, that they are mere customs and conventions, that they have a purely instrumental, utilitarian purpose, and that they adhere to particular peoples–or, as we now say, they are race-, class-, and gender-specific. So long as morality was couched in the language of “virtue,” it had a firm, resolute character. Philosophers might argue about the source of virtues, their relative importance, or the relation between moral and intellectual virtues, between classical and religious ones, or between private and public ones. They might even, like Montesquieu, “historicize” virtues by attributing different virtues to different peoples and polities. But for a particular people at a particular time, the word “virtue” carried with it a sense of gravity and authority, as “values” does not.
Values, as we now understand that word, do not have to be virtues; they can be beliefs, opinions, attitudes, feelings, habits, preferences–whatever any individual, group, or society happens to value, at any time, for any reason. One cannot say of virtues, as one can of values, that anyone’s virtues are as good as anyone else’s, or that everyone has a right to his own virtues. Only values can lay that claim to moral equality and neutrality. This impartial, “non-judgmental” as we now say, sense of values–values as “value-free”–is now so firmly entrenched in our vocabulary and sensibility that one can hardly imagine a time without it.
To speak of Victorian values (as I sometimes do, out of deference to common usage) is not merely a semantical anachronism; it is a distortion of the Victorian ethos. For the Victorians understood them as “virtues,” not “values.” Most Victorians even believed them to be, as Margaret Thatcher once said, “perennial virtues”–or if not perennial, then, for their own time and place at least, sufficiently fixed and certain to have the practical status of “perennial.”
For the Victorians, these virtues were fixed and certain, not in the sense of governing the actual behavior of all people all the time (or even, it may be, of most people most of the time). Plato and Aristotle did not assume that of their virtues; nor did Augustine and Aquinas of theirs. But all of them did believe that they were the standards against which behavior could and should be judged. The standards were firm even if the behavior of individuals did not always measure up to them. And when conduct fell short of those standards, it was deemed to be immoral–bad, wrong, evil–not, as is more often the case today, as misguided, undesirable, or (the most recent corruption of our moral vocabulary) “inappropriate.”
The shift from “virtue” to “values” has had other unfortunate consequences. Having displaced virtue from the central position it once occupied, as the defining attribute of the good life and the good society, we have relegated it to the bedroom and boudoir. When we now speak of virtue, we no longer think of the classical virtues of wisdom, justice, temperance, and courage, or the Christian ones of faith, hope, and charity, or even such Victorian ones as work, thrift, cleanliness, and self-reliance. Instead virtue is now understood in its sexual connotation, as chastity and marital fidelity. Leo Strauss once remarked that one of the great mysteries of Western thought is “how a word which used to mean the manliness of man has come to mean the chastity of women.”
This mutation in the word “virtue” has the effect first of narrowing the meaning of the word, reducing it to a matter of sexuality alone; and then of belittling and disparaging the sexual virtues themselves. These virtues, chastity and fidelity, have been further trivialized by the popular conception of Victorians as pathologically inhibited and repressed. Thus “Victorian values” have been associated with piano legs modestly sheathed in pantaloons, human as well as table legs referred to as “limbs,” and books by men and women authors dwelling chastely on separate shelves in country-house libraries.
In fact, these were not the normal (or even abnormal) practices of real Victorians. They were often the inventions of contemporary satirists (writers in Punch, for example), which have been perpetuated by gullible historians. “The woman who draped the legs of her piano,” one historian solemnly informs us, “so far from concealing her conscious and unconscious exhibitionism, ended by sexualising the piano; no mean feat.” In fact, it is this historian who has sexualized the piano and has imposed his own sexual fantasies upon the Victorians.
Why have values become a subject of such intense discussion? And why do the Victorian values–or virtues–loom so large today?
The answer to both questions lies, in part, in statistics. Thomas Carlyle once rebuked his countrymen for being obsessed with “figures of arithmetic”–about wages and prices, the cost of food and the standard of living. The more important issue, he insisted, was the “condition” and “disposition” of the people: their beliefs and feelings, their sense of right and wrong, the attitudes and habits that would dispose them either to a “wholesome composure, frugality, and prosperity,” or to an “acrid unrest, recklessness, gin-drinking, and gradual ruin.”
In fact, the Victorians had “figures of arithmetic” about these matters as well–about crime, drunkenness, pauperism, vagrancy, illiteracy, illegitimacy; “moral statistics,” they called them. It is instructive–and disquieting–to compare their moral statistics with ours.
In Victorian England, the illegitimacy ratio–the proportion of illegitimate births to total births–fell from 7 percent in 1845 to less than 4 percent by the end of the century. In East London, the poorest section of the city, it was less than that: 4.5 percent in midcentury and 3 percent by the end of the century. Apart from a temporary increase during both world wars, the ratio continued to hover around 5 percent until well into the middle of the twentieth century. In 1960 it began to rise, to 12 percent by 1980, and to 32 percent by the end of 1992–a two-and-a-half times increase in the last decade alone and a sixfold rise in three decades.
In the United States, the figures are no less dramatic. Starting at 3 percent in 1920 (the first year for which there are national statistics), the illegitimacy ratio rose gradually to slightly over 5 percent by 1960 (the same figure as England), after which it grew rapidly, more than doubling by 1970, and reaching 30 percent by 1991–a tenfold increase from 1920 and a sixfold increase from 1960. For whites alone the figures are 1.5 percent in 1920, slightly over 2 percent in 1960, almost 6 percent in 1970, and nearly 22 percent in 1991–fourteen times the 1920 figure and eleven times that of 1960. Black illegitimacy in this period went from 12 percent in 1920 to 22 percent in 1960, and to 68 percent by 1991. In 1964, when Daniel Patrick Moynihan wrote his percipient report about the breakdown of the black family, the black ratio was 24.5 percent; the white ratio now is 22 percent. In 1964, 50 percent of black teenage mothers were single; in 1991, 55 percent of white teenage mothers were single.
Or let us take another “moral statistic”: crime. In England between 1857 and 1901, the rate of indictable offenses (serious offenses, not including simple assault, drunkenness, or vagrancy) declined by almost 50 percent. The absolute numbers are even more graphic: while the population grew from 19 million to 33 million, the number of serious crimes fell from 92,000 to 81,000. The year 1857, by the way, was not the peak; it is simply the year when the most reliable series of statistics starts. The decline (earlier statistics suggest) actually started in the mid or late 1840s, about the same time as the decline in illegitimacy.
The low crime rate persisted until the mid 1920s, when it started to rise and continued to do so through the war years, levelling off or declining slightly in the early 1950s. A dramatic rise started in the mid fifties, increasing more than fivefold by 1981 and almost doubling in the following decade. By 1991 the rate was ten times that of 1955 and forty times that of 1901. (In 1955, the anthropologist Geoffrey Gorer remarked upon the extraordinary degree of civility exhibited in England, where “football crowds are as orderly as church meetings.” Within a few years, those games had become notorious as the scene of mayhem and riots.)
National crime statistics for the United States start only in 1960, but local statistics suggest that, as in England, the decrease of crime began in the latter half of the nineteenth century and, except for a few years following the Civil War, continued into the early twentieth century. A rapid increase started in 1960, the rate doubling within the decade and tripling by 1980. A decline in the early 1980s was followed by another rise, bringing the 1992 rate to a level somewhat lower than its peak in 1980. The rate of violent crime (murder, rape, robbery, and aggravated assault) followed a similar pattern, except that the increase after 1985 was more precipitous and continued until 1992, making for an almost fivefold rise from 1960.
For all kinds of crimes the figures for blacks are far higher than for whites–for blacks both as the victims and as the perpetrators of crime. (Homicide is now the leading cause of death among black youths.) Criminologists have coined the term “criminogenic” to describe a community where “the social forces that create predatory criminals are far more numerous and overwhelmingly stronger than the social forces that create virtuous citizens.”
I could go on to regale you with other statistics: about divorce, single-parent families, welfare dependency, “functionally illiterate” teenagers and adults, or crime in the schools (the first exercise of the morning in many inner-city schools is a search for guns–guns, no longer knives). This litany, I am afraid, could go on ad tedium.
The English sociologist Christie Davies has described a “U-curve model of deviance,” which applies both to Britain and the United States. The curve shows the drop in crime, violence, illegitimacy, and alcoholism in the last half of the nineteenth century, reaching a low at the turn of the century, and a sharp rise in the latter part of the twentieth century. In fact, the U-curve is more skewed than this image suggests. It might more accurately be described as a “J-curve,” for the height of deviancy in the nineteenth century was considerably lower than it is today–an illegitimacy ratio, for example, of 7 percent in England in the mid nineteenth century, compared with over 32 percent today.
But, as Carlyle would have reminded us, statistics, even moral statistics, do not tell the whole story. In his essay, “Defining Deviancy Down,” Senator Moynihan describes the downward curve of the idea of deviancy. What was once regarded as deviant behavior is no longer so regarded; what was once regarded as abnormal has been normalized. As deviancy is defined downward, so the threshold of deviancy rises: behavior once stigmatized as deviant is now tolerated and even sanctioned. Mental patients, no longer institutionalized, are now treated, and appear in the statistics, not as mentally incapacitated but as “homeless.” Divorce and illegitimacy, once seen as betokening the breakdown of the family, are now viewed more benignly, with divorced and unmarried mothers lumped together in the category of “single parent families.” And violent crime has become so endemic that we have almost become inured to it. The St. Valentine’s Day Massacre in Chicago in 1929, when four gangsters killed seven other gangsters, shocked the nation and became legendary; in Los Angeles, James Q. Wilson points out, as many people are killed every weekend. (In England, Jack the Ripper was the sensation of the century. Today serial killings are standard fare as much in newspapers as on TV.)
Charles Krauthammer has proposed a complementary concept: “Defining Deviancy Up.” As deviancy is normalized, so what was once normal becomes deviant. The kind of family that has been regarded for centuries as natural and moral–the “bourgeois” family, as it is invidiously called–is now seen as pathological, concealing behind the facade of respectability the new “original sin,” child abuse. Thus, while crime is underreported because we have become desensitized to it, child abuse is overreported, including cases (often inspired by therapists) recalled long after the supposed events. Similarly, rape has been “defined up” as “date rape,” referring to situations which the participants themselves did not perceive as rape, or even all heterosexual relations, which are seen as inherently coercive and violent.
The combined effect of defining deviancy up and defining it down has been to normalize and legitimize what was once regarded as abnormal and illegitimate, and, conversely, to denigrate and discredit what was once normal and respectable. This process too has occurred with startling rapidity. One might expect that attitudes and values would lag behind the reality, that people would continue to pay lip service to the moral principles they were brought up with, even while violating those principles in practice. What is startling about the 1960s “sexual revolution” is how revolutionary it was, in sensibility as well as reality. In 1965, 69 percent of American women and 65 percent of men under the age of thirty said that premarital sex was always or almost always wrong; in 1972, those figures plummeted to 24 percent and 21 percent–this in seven short years. Thus it is that language, sensibility, and social policy conspire together to redefine deviancy.
For a long time social critics and policy makers found it hard to face up to the realities of our moral condition, in spite of the statistical evidence. The realities are difficult to confront because they violate the dominant liberal ethos, which assumes that moral progress is a necessary by-product of material progress. It seems incomprehensible that in this age of free, compulsory education, illiteracy should be a problem even among native-born Americans; or illegitimacy, at a time when sex education, birth control, and abortion are widely available.
More important than the illusion of moral progress is the distrust of the very idea of morality. Moral principles, still more moral judgments, are thought to be illiberal and coercive. Most of us are uncomfortable with the idea of making moral judgments even in our private lives, let alone with the “intrusion,” as we say, of moral judgments into public affairs. We are uncomfortable not only because we have come to feel that we have no right to make such judgments and impose them upon others, but because we have no confidence in the judgments themselves, no assurance that our principles are true and right for us, let alone for others. We are constantly beseeched to be “nonjudgmental,” to be wary of crediting our beliefs with any greater validity than anyone else’s, to be conscious of how “Eurocentric” and “culture-bound” we are. “Chacun à son goût” (each to his own taste), we say of morals, as of taste; indeed, morals have become a matter of taste.
More than any specific values or virtues, it is this reluctance to speak the language of morality, and to apply moral ideas to social policies, that separates us from the Victorians. In Victorian England, moral principles were as much a part of public discourse as of private discourse, and as much a part of social policy as of personal life. They were not only deeply ingrained in tradition; they were also embedded in two powerful strains of Victorian thought: Utilitarianism on the one hand, Evangelicalism and Methodism on the other. These may not have been compatible philosophically, but in practice they complemented and reinforced each other, the utilitarian calculus of pleasure and pain, rewards and punishments, being the secular equivalent of the religious gospel of virtues and vices. It was this alliance of a secular ethos and a religious one that provided the practical basis for social policy, so that every measure of poor relief, for example, had to justify itself by showing that it would promote the moral as well as the material well-being of the poor–and not only of the pauper receiving relief but of the independent laboring poor as well.
This was the rationale behind the Victorian principle of “less-eligibility”: the idea that the “able-bodied pauper” should be in a less “eligible”–that is, less desirable, less favorable–condition than the independent laborer. Less-eligibility meant not only that the pauper should receive less by way of relief than the laborer did from his wages, but also that he receive it in a manner (in the workhouse, for example) that made pauperism less desirable, less respectable than work. This principle (which applied only to the able-bodied, not to the sick, aged, or children) was designed to prevent the “pauperization of the poor,” as was said–to discourage the independent laborer from lapsing into pauperism, and to encourage the able-bodied pauper to become independent.
In recent times we have so completely rejected any kind of moral principle that we have deliberately, systematically divorced poor relief from moral sanctions and incentives. This reflects in part the theory that society is responsible for all social problems and should therefore assume the task of solving them; and in part the prevailing spirit of relativism, which makes it difficult to pass any moral judgments or impose any moral conditions upon the recipients of relief.
We are now confronting the consequences of this policy of moral neutrality. Having made the most valiant attempt to “objectify” the problem of poverty, to see it as the product of impersonal economic and social forces, we are discovering that the economic and social aspects of that problem are inseparable from the moral and personal ones. And having made the most determined effort to devise policies that are “value-free,” that do not stigmatize the recipients of relief or their “style of life,” we find that these policies imperil both the moral and the material well-being of their intended beneficiaries.
In de-moralizing social policy–divorcing it from any moral criteria, requirements, even expectations–we have demoralized, in the more familiar sense, both the individuals receiving relief and society as a whole. We are, in fact, operating on something like a principle of “more-eligibility.” People on welfare often receive more, by way of allowances, food stamps, housing subsidies, and medical benefits, than workers earning a minimum or modest wage, thus providing incentives to go on relief rather than seek work. Or we give unmarried mothers (including teenagers) benefits and services that married mothers do not have, thus penalizing marriage and rewarding illegitimacy. Or we define drug addiction and alcoholism as “disabilities,” thus making chronic addicts and alcoholics eligible for relief, while the addict or alcoholic who makes a serious effort to be cured is removed from the relief rolls. And in a myriad other ways, our policies have the unwitting effect of favoring (making “more eligible”) what the Victorians called the “undeserving” poor over the “deserving.”
“Deserving” and “undeserving”–these terms epitomize the difference between the Victorians and ourselves. The Victorians spoke the language of morality, and acted on that language by devising social policies in accord with it. We too have created a language consistent with our actions; we have de-moralized our rhetoric together with our policies. We go to great lengths to avoid any suspicion of moral disapprobation or stigmatization (itself a taboo word). Relief has become “welfare.” “Illegitimacy” is officially known as “non-marital child-bearing” or “alternative mode of parenting.” Promiscuous teenagers are said to be “sexually active.” Juvenile criminals are “delinquents.” And cold-blooded murderers are the victims of a society-induced “rage.”
We think we have devised a “value-free” vocabulary in keeping with our “value-free” policies. In fact, we have substituted one value-laden rhetoric and policy for another. We have legitimized illegitimacy by calling it an “alternative mode of parenting,” and then by rewarding that alternative mode with material benefits unavailable to married, self-supporting parents. Similarly, we have legitimized teenage promiscuity by labelling it “sexually active” (implying that the unpromiscuous are deficient in the normal complement of hormones), and then by providing teenagers with condoms to indulge their socially sanctioned “activity.”
In retrospect, we can see that the “social pathology”–“moral pathology,” I would call it–of crime, violence, illegitimacy, welfare dependency, drug addiction, is intimately related to the “counterculture” of the 1960s. That counterculture was intended to liberate us from the stultifying influence of “bourgeois values”–from the Victorian virtues, in fact. It is no accident, as a Marxist would say, that the rapid acceleration of illegitimacy and crime started at just the time that the counterculture got under way.
In a powerfully argued book, Myron Magnet has analyzed the symbiotic relationship between what he calls the “Haves” and the “Have-Nots.” It was the Haves, the cultural elites, that legitimized and glamorized the counterculture, which dislocated their own lives, for the most part, temporarily and peripherally, but which had a disastrous effect on the poor. For it disparaged the Puritan ethic–deferral of gratification, thrift, work, self-discipline–that had made for social mobility and economic improvement.
The underclass is thus not only the victim of its own “culture of poverty.” It is also the victim of the upper-class culture around it. The kind of “delinquency” that a white suburban teenager can absorb with relative impunity may be literally fatal to a black inner-city teenager. Similarly, the child of a single, affluent professional woman (a Murphy Brown) is obviously in a very different condition from the child (more often, children) of a woman on welfare. The effects of the culture, however, are felt at all levels. It was only a matter of time before there emerged, as Charles Murray has pointed out, a white underclass with much the same pathology as the black. And not only a white underclass but a white upper class; the most affluent suburbs are beginning to exhibit the same pathological symptoms: teenage alcoholism, drug addiction, crime, promiscuity.
By now this “liberated,” anti-bourgeois ethic no longer seems so liberating. It is finally permissible to speak of the need for “family values.” President Clinton himself has put the official seal of approval on family values, even going so far as to concede–a year after the event–that there were “a lot of very good things” in Quayle’s speech about family values.
It is perhaps also time to rehabilitate some of the Victorian virtues that have been for so long derided. Take cleanliness, for example. When Margaret Thatcher quoted the adage, “Cleanliness is next to godliness,” one historian said that it was “so much pious nonsense” in Mrs. Thatcher’s own youth, let alone in Victorian England: “a dirty, smelly age in which a largely dirty, smelly population was sorely afflicted by all manner of diseases rooted in a chronic lack of hygiene at all levels.” How could even the middle classes be clean, this historian asked, sweating in all those layers of thick clothing, and “without much in the way of regular dry cleaning!”–or with all that elaborate furniture in “the pre-vacuum cleaner age”?
Historians often accuse the Victorians of imposing their middle-class values upon the poor. But if anyone is guilty of imposing his values upon the Victorians, it is surely a historian who cannot conceive of cleanliness, either as a value or as a reality, in the absence of dry-cleaning and vacuum-cleaners. (It was, by the way, John Wesley, the founder of Methodism, who popularized the motto, “Cleanliness is next to godliness,” in the early eighteenth century, at a time when sanitary conditions were even more primitive than in Victorian times.)
Of course, the Victorians, and the working classes especially, were dirty and smelly by our standards. But what is impressive in reading working-class memoirs is the enormous effort made to be clean: the scouring of scullery floors and doorsteps, the blackleading of stoves and grates, the ritual of weekly baths (carefully planned and timed, so that girls and boys, mother and father, could bathe separately in a tub in the kitchen); the washing of clothes and linens (which involved heating the water in buckets on the stove, transferring the water to the tub, scrubbing the clothes, emptying the buckets, refilling and reheating them for the rinse, wringing out the clothes, drying them on lines in the yard, or when it rained, as it often did, in the kitchen, and finally ironing them).
Foreigners regarded the obsession with cleanliness as yet another English eccentricity. The German historian Heinrich Treitschke once observed, “The English think soap is civilization.” But that idea is not as fatuous as it may seem; soap does signify civilization, in contrast to the filth of animality and barbarity. Nor does the adage, “Cleanliness is next to godliness,” deserve the derision it receives; cleanliness does connote purity, of body and soul. Nor need we mock the Victorians who made a virtue of being clean (rather than, like the French, well-perfumed), and who did so at such great cost of time and labor.
And so with the other Victorian virtues: work, thrift, temperance, respectability. These are modest, mundane virtues, dependent upon no special breeding, or status, or talent, or wisdom, or grace, or money. They are, one might say, democratic virtues. Those who denigrate Victorian reformers and philanthropists for trying to impose these so-called “bourgeois” values upon the working classes do not realize, first, that these were as much the values of the working classes as of the middle classes (the working classes aspired to those values even if they could not always realize them in practice); and, more important, that it was a great tribute to the working classes to assume that they were capable of realizing the same virtues that the middle classes valued for themselves.
This, you must remember, was a time of great class disparities–in income, work, housing, education, manners, speech, and political and legal rights. It could only be because of a powerful sense of spiritual and moral equality, the sense of a common human nature transcending all those economic and social inequalities, that the working classes could be credited with those “bourgeois values.” And it was because of those values, I would argue, that many of the other inequalities (the political, most notably) were eventually mitigated or eliminated.
The Victorian virtues were democratic virtues–and also liberal virtues. By putting a premium on ordinary virtues attainable by ordinary people, the Victorian ethos located responsibility within each individual. In an aristocratic age, only the exceptional, privileged individual had been seen as a free moral agent, the master of his fate. In the evolving democracy that was Victorian England, all individuals were assumed to be free moral agents, hence, potentially at least, their own masters.
This is why the Victorians put such a premium on the self–not only on self-help and self-interest, but also self-control, self-discipline, self-respect. A liberal society, they believed, required a moral citizenry. The more effective the voluntary exercise of morality on the part of each individual, the more internalized that morality in the self (in the form of conscience, character, habit, or religion), the less need there would be for the external, coercive, punitive instruments of the state. It was the great mentor of the Victorians, Edmund Burke, who enunciated this principle:
Men are qualified for civil liberty in exact proportion to their disposition to put moral chains upon their own appetites…. Society cannot exist unless a controlling power upon will and appetite be placed somewhere, and the less of it there is within, the more there must be without.
The French historian Elie Halévy, the most acute observer of nineteenth-century England (he was to England what that other great Frenchman, de Tocqueville, was to America), posed the question of the “miracle of modern England.” Why was England spared the bloody political revolutions that convulsed the continent? His answer, which ran to seven substantial volumes, covered such subjects as the political and constitutional system of England, its social structure, economy, culture, and religion.
But there was another “miracle” of Victorian England: the fact that it was spared the kind of moral revolution we have recently undergone, and which might have been expected in that period of momentous economic and social changes. It is often said that the process of modernization–industrialism, capitalism, individualism, liberalism, secularism, democracy–inevitably undermines the respect for authority, tradition, religion, and thus morality. But Victorian England went through that period of modernization without experiencing such a moral revolution. Indeed it emerged from it not in a state of de-moralization but of re-moralization.
Victorian England also survived Darwinism, which some contemporaries feared would subvert both religion and morality. In fact, most people did not experience a loss of religious faith as a result of Darwinism, and those who did substituted for it a secular morality that was no less rigorous than the religious morality it displaced. George Eliot uttered the classic expression of this secular faith: God was “inconceivable,” immortality “unbelievable,” but duty nonetheless “peremptory and absolute.” Darwin put it more prosaically: God was “beyond the scope of man’s intellect,” but man’s moral obligations were what they had always been, to “do his duty.”
In retrospect, one might say that Victorian England was living off the moral capital of religion, and that post-Victorian England, well into the twentieth century, was living off the capital of a secularized morality. Perhaps what we are now witnessing is the moral bankruptcy that comes with the depletion of both the religious and the secular capital.
In 1888, Nietzsche sneered at those “English flatheads” like John Stuart Mill, those “little moralistic females à la Eliot,” who thought it possible to have morality without religion. “They are rid of the Christian God and now believe all the more firmly that they must cling to Christian morality.” For the moment, he predicted, “morality is not yet a problem,” but it would become a problem when the people discovered that without religion there was no morality. “When one gives up the Christian faith, one pulls the right to Christian morality out from under one’s feet.”
One can think of other reasons why today morality is a problem: the affluence of the young, stimulating the demand for material goods and for the immediate gratification of that demand; technology (TV, transistor radios) making the popular culture, including the most degrading forms of that culture, widely accessible; the birth control pill, facilitating the sexual revolution; the expansion of higher education, bringing the “adversary culture,” or counterculture (once confined to Bloomsbury or Greenwich Village), to every college campus; and perhaps the sense of moral lassitude that comes with the release from the more urgent problems that preoccupied earlier generations–depression, war, the perils of Nazism and Communism. But foremost among these, I believe, is the Nietzschean explanation: the death of God and the death of morality.
Before concluding, I must say, what I should have said at the outset (and perhaps at five-minute intervals throughout this talk): that in inviting a more respectful attention to the Victorians, I do not mean to condone everything in the Victorian ethos, still less in Victorian society. Late-Victorian England was more open, liberal, and humane than early-Victorian England, but it was less open, liberal, and humane than we today would think desirable. Social and sexual discriminations, class rigidities and political inequalities, autocratic men, submissive women, and overly disciplined children, constraints, inhibitions, abuses of all kinds–there is enough to give pause to the most ardent Victoriaphile. In any case, one could not, even if one so desired, seek to emulate a society at so different a stage of economic, technological, social, political, and cultural development.
But some things we can surely learn from the Victorians: not only the importance of such virtues as work, temperance, self-discipline, self-reliance, but the importance of the idea of virtue as governing both public and private affairs. The Victorians were, candidly and proudly, “moralists.” In recent years that has almost become a term of derision. Yet contemplating our own society, we may be prepared to take a more favorable view of Victorian moralism. If the Victorians, at the height of the industrial revolution, could retain and even strengthen an ethos that had its roots in religion and tradition, it may be that we are not as constrained by the material conditions of our own time as we have thought. A postindustrial economy, we may conclude, does not necessarily entail a postmodernist society or culture, still less a de-moralized society or culture.
We may, in fact, already be witnessing the beginnings of a moral reformation. One of the many ironies in the current debate about values, as James Q. Wilson has observed, is that the word has taken on something of the connotation of the older “virtues.” In a thoroughly relativistic climate such as ours, even “values” are seen as a retreat from relativism and a reassertion of moral principles. More remarkable is the fact that “virtue” itself is beginning to emerge as a respectable word. William Bennett’s The Book of Virtues, which has sold almost two million copies and is still going strong, celebrates such familiar Victorian virtues as self-discipline, work, responsibility, perseverance, and honesty. (Not cleanliness or chastity, but perhaps those will appear in a sequel.)
The word “moral” has also been rehabilitated; one can hardly pick up a newspaper without reading of the moral crisis of our time. A few days ago the Washington Post (hardly a right-wing journal) featured two articles on its op-ed page: one by the liberal columnist William Raspberry on the need for a new “moral center,” and another by Joseph Califano, formerly in the Carter administration, deploring the “medicalization” of teenage pregnancy and explaining that it is a moral, not a medical, problem. And then there was the remarkable sight, the other week, of the cover of Newsweek emblazoned with the word “Shame,” and below it the subtitle, “How Do We Bring Back a Sense of Right and Wrong?”
Even an inveterate pessimist like myself may be forgiven for thinking that the day of redemption is nigh.
Gertrude Himmelfarb is a Distinguished Professor of History Emeritus at City University of New York.