July 1, 2013

In defense of culture

We live in an era in which immigration, urbanization, the lingering effects of the 1960s counterculture, government policy, and various other factors are driving rapid changes in American society. These changes are not distributed uniformly across the country; they vary by region, by the urban-rural divide, and by race, ethnicity, and age. America is now, in effect, an assemblage of separate nations. I am skeptical that there ever was a truly unified American identity, but it is obvious that there isn’t one now. The old metaphor of the melting pot – in which, after a generation or two, immigrants assimilated themselves to the general culture – certainly does not apply today. Rather, in contemporary America, people fracture along a variety of lines. Even people of a common general background now find themselves in different and often antithetical cultures.

So – what is a culture, really?

A culture is a set of constraints imposed on the individual as a prerequisite for acceptance in a given group.

Some cultural constraints are the product of environmental conditions. Traditional Inuit culture, for example, is characterized by a high degree of patience and cheerfulness. If you live in close proximity to your family in an igloo for several months of the year, moodiness and negativity are likely to have disastrous consequences. Inuit must either be nice or suffer miserably with one another. The famous politeness and formality of the Japanese, however, have little to do with the mountains and climate of Japan. Rather, these traits are essentially arbitrary. Japanese culture carries them by the usual mechanisms of societal tradition and individual upbringing, but Japan could just as easily have been populated by a race of rude, obnoxious slobs. For the actual Japanese, politeness and formality are self-defining rather than physically necessary characteristics.

Very many cultural constraints, perhaps even most of them, are arbitrary. For example, apart from climatic considerations at higher latitudes, there is no physical reason that we couldn’t all be nudists. The fact that some hunter-gatherer groups do live without clothing proves conclusively that it is possible. Still, if you live in virtually any modern nation, East or West, you probably find the notion of universal nudism rather unpleasant. You really wouldn’t want to see your ugly neighbors mowing the lawn or standing in a checkout line in the nude, and, unless you are a deviant, you would not want to dispense with all of your own clothes either. You feel this way because you have certain arbitrary cultural standards. You have been taught to believe certain behaviors are acceptable while others aren’t – and you know that the great majority of others in your society share the same beliefs. Wearing clothing, often quite specific clothing, is a defining feature of your culture. While it is not a physical necessity, it is a social necessity.

The consequences of violating social constraints are many and varied, ranging from mild rebuke to capital punishment, but one broad consequence is constant. If you violate a cultural taboo, you weaken the bonds you have with other members of your group. A Japanese who decides to be a rude, obnoxious slob gives up a large share of his Japaneseness. To share a culture with another person is precisely to share certain fairly reliable expectations about one another’s possible behaviors. It may be quite possible for you to attend your aunt Margaret’s funeral in the nude, but everyone else in attendance is likely to become rather anxious about what you might do next. You will have branded yourself as someone who cannot be expected to behave within the usual social bounds. You will be treated as irresponsible, erratic, and mentally unstable.

Multiculturalism, as a culture in itself, makes hash of any coherent system of social limitations. Rational people can learn a certain degree of tolerance – which is to say, they can accept a rather broad range of behaviors, or at least suppress their learned revulsion to certain things – but one cannot accept two or more incompatible cultural systems simultaneously. Contrary to what multiculturalists might hope, one cannot find homosexuality acceptable and also find Islam, which considers homosexuality a serious or even fatal transgression, acceptable. You can find one, or neither, acceptable – but not both. When the Saudi or Iranian government puts a human being to death for the crime of homosexuality you cannot consider it a tragic injustice and a reasonable expression of another culture’s beliefs at the same time. What multiculturalists usually do to defend their worldview is either to ignore those aspects of reality that make their position self-contradictory, or to gloss them over with a comforting narrative. I once heard of a feminist who argued that the burka – the bag-like garment under which many Muslim women are compelled to conceal themselves – should properly be viewed as a symbol of female empowerment. This would be like calling the shackles that bound a chain-gang together instruments for promoting Afro-American unity, or like calling Buchenwald a weight-loss spa for Jews.

Multiculturalism, as actually practiced in America, suffers the further problem of simple hypocrisy. To continue with the convenient example above, it is irrational to stretch one’s tolerance of one conservative religion, Islam, to absurd lengths while showing so little tolerance for the much less militant followers of Catholicism.1 How can tolerance of some cultures and intolerance of others be multiculturalism in any robustly meaningful sense? If a person is intolerant of relatively benign cultures nearby, but tolerant of hostile cultures he or she rarely encounters, that isn’t much different from being intolerant in general. I can say that I am tolerant of the culture of New Guinea headhunters – but that’s not much of a claim if I don’t live in New Guinea.

If we attempt to examine the phenomenon of culture as objectively as possible, it should be plain that any viable culture must have at least two traits. First, a culture must have a set of characteristics that define it. A culture has to be something other than a boundary drawn around some random group of human beings. It can be defined by subjectively good characteristics (such as patience) or subjectively nasty ones (such as racism) but it must have some consistent nature – something to bind its members in a sense of solidarity. Second, perhaps as a corollary to the first, a culture must have a means of recognizing what a non-member is – or, perhaps more to the point, of recognizing what an enemy is. Islam, it happens, does both of these things extremely well. Muslims know what being a Muslim means. They know what they should and should not do, what observances they need to make, and what they are prescribed to believe. Likewise, they know what is not Islam, what threatens Islam, and they are unapologetic in this knowledge. Western multiculturalists, on the other hand, have no idea who they are. By attempting to throw their arms around a large assortment of alien cultures, they leave themselves without any common characteristics. The multiculturalist cannot even claim tolerance as a dogma, for the reasons I’ve already outlined: some of the people they’ve included under their imaginary umbrella are decidedly intolerant, and most multiculturalists harbor hatreds of their own conservative brethren. Further, the multiculturalist, in a fantasy of universal inclusion, finds it impossible to fully reconcile himself (or herself) to the very concept of an out-group, let alone an enemy. The political and religious conservatives of their own ethnicity can be the target of enmity precisely because they share a common background with the multiculturalist. Conservatives are a sort of alter-ego, symbolic of what the multiculturalist strives to reject. True enemies, outsiders who despise the multicultural anti-culture for its flaccid tolerance and amorality, must be embraced, placated, or imagined not to exist.

A very recent example of an in-group / out-group disconnect is to be found in the case of Paula Deen, the once-popular cooking show host. Deen was discovered to have used the word “nigger” in a private conversation many years ago, and, as a consequence, was dropped by her network and is being actively harassed by MSNBC and others. I don’t doubt that many people find the “n-word” offensive, but why is it any less offensive when wielded casually by hip-hop performers in an entirely public context? Deen is vilified because, as a successful, Christian, heterosexual Caucasian, she is perceived as part of the evil conservative alter-ego that must be constrained. This, despite her actual political affiliations. Hip-hop performers, as symbolic victims of conservative oppression, are exempt from the standards of political correctness that apply to Deen.

To be clear, I am not saying that some degree of tolerance isn’t nice, or that traditional standards are necessarily laudable. What I am saying is that in a conflict between cultures, one that is coherent and cohesive has a substantial advantage over one that is inconsistent and heterogeneous. I’m not saying xenophobia is good – I’m saying xenophobia has often proven successful.

Another thing I am not doing is fully equating western liberalism with multiculturalism. The latter is a dominant meme of the former, but the two are not synonymous. In America, at least, the government has usually been able to recognize out-groups so long as they remained beyond the US border. There are interesting parallels between 19th century gunboat diplomacy and the policy of drone warfare. Ethical considerations aside, American political leadership is not completely unwilling, even now, to conduct international affairs with blunt instruments in the time-honored human way. What our leaders do not seem to understand anymore is what the in-group is – or, more romantically, what a citizen is.

One need look no further than the illegal immigration problem to grasp the growing irrelevance of American citizenship. Many Americans, some of them in positions of high authority, take it for granted that the circumstances and sufferings of illegal aliens are our responsibility. Being within the US border makes them automatically part of the in-group. Imagine entertaining this attitude on a personal basis. What if a squatter entered your house through an open window, sat down on your couch, and asked, in an alien language, to be treated as one of the family? Wouldn’t you ask him to leave? Wouldn’t you call the police if he refused? Would you feel the slightest guilt to see him ejected? Almost all of us would be indignant at the effrontery. Could you imagine sneaking across the border of a foreign country, with no intention of learning that country’s language or adapting to that country’s culture – but simply showing up there with the expectation of carving out a niche?

Immigrants can approach their relocation to a new country in one of two ways. First, they can assimilate – adopting the language, customs and other cultural aspects of their new country. With some notable exceptions, this is what most 19th century immigrants to the United States eventually did, if not fully in the first generation then certainly in the second.2 Alternatively, immigrants can colonize – which is to say, they can settle in discrete and permanent enclaves, keeping their old culture and rejecting that of the natives. 19th century European colonists to Africa and Asia obviously did this, remaining British, French, Belgians, Germans and Italians in new exotic surroundings. Muslim immigrants in America and Europe have done largely the same, not merely rejecting but often despising the cultures of the countries they inhabit. Recent waves of Latinos, too, seem reluctant to assimilate – although it remains to be seen whether or not they will follow the path of the Irish in time. Language may be key in this process – both as an indicator of an immigrant’s intentions and as an actual cultural barrier. It is obvious that a person who moves to another country with no intention of learning the language has no intention of assimilating either, but intends to live in an enclave with fellow members of the same culture. When the dominant culture accommodates multilingualism it actually encourages a continued and entrenched sense of separateness, aiding and abetting a sort of neo-colonialism in reverse. Add to such excessive accommodation the nebulosity of the multiculturalist anti-culture itself, and assimilation becomes all the more unlikely. Which is easier for the immigrant: to learn a new language and adopt a new culture with a bewildering lack of defining characteristics, or to keep the old language and old customs in a new context? The multiculturalist may consider his or her society a cornucopia of attractive possibilities, but it can equally be perceived as a weak and decadent mess with something for everyone to find revolting.

One point I have already alluded to needs emphasis. Considering the remorseless Darwinian processes that actually shape history, it may not be the best-educated or most intellectually sophisticated culture that endures. A few years ago there was a minor stir over whether or not President Obama believed in American exceptionalism. When asked, his somewhat nuanced answer amounted to “no.” Broadly speaking, “no” is the “correct” answer. All nations have a certain uniqueness within the greater context of history – some are more powerful than others; some put more emphasis on particular rights; others are more elaborate in their artistic expression; and so on. Most human beings love their own nation (if perhaps not their own government) more than any other. In this context, American exceptionalism seems little more than an expression of one particular bias. To understand that we are nothing intrinsically special is a rational achievement. It is also a serious impediment to cultural cohesion. People can identify with a high ideal, even a fictitious one, in a way that they cannot identify with a prosaic fact. Perhaps it is naïve to stand in reverence before Ronald Reagan’s “shining city on a hill” – but it is hard to imagine anyone standing in reverence before a historically accidental superpower whose behavior is sometimes good and sometimes bad. Even the narrow aspects of multiculturalism that are actually rational are culturally corrosive. Many people have died for Christianity, Islam, France, Japan, and even for Communism or Nazism – but no one ever risked life or limb for the greater glory of relativism. To the contrary – cynicism and intellectual sophistication may well go hand-in-hand. Much of the corruption we now see in government may simply be the product of a progressive erosion of coherent cultural standards and myth-infused ideals.

Anti-intellectual as the above argument may sound, it is by no means self-evidently false. No law of nature promises us a stable society at the end of our efforts to sort the world out objectively. It may well be that our pursuit of open inquiry will turn out disastrously in the end. I am not suggesting that we surrender ourselves to tradition and superstition, but I am pointing out that tradition and superstition do yield certain strengths which may turn out to be necessary to cultural survival. It is not by accident that they have survived.3

1 Not all cultures are religions, but all religions are cultures.

2 The Amish did not, of course. The Jews retain a considerable distinctiveness although they have adapted to a high degree. Blacks were handicapped in assimilating by the peculiarities of their history. The Irish required more time than most groups, but now retain only a nominal separate identity.

3 There is a further irony here. Progressive anti-theists, like Richard Dawkins and Daniel Dennett, make the implicit assumption that an empirical pursuit of truth will, in fact, continue to make a better and more stable society. While one could produce a body of interesting if inconclusive evidence to support this, they tend simply to take the assertion for granted. The belief is so deeply embedded in their particular culture that it functions very much like an article of faith. Theirs is a culture that excludes religions and seeks to convert everyone to a certain epistemic schema – while rarely bothering to apply that schema to their own core principles. Thus, they have a dogma despite themselves. Unfortunately for them, it is not a very unifying dogma. The low number of people willing to self-identify as “brights” reveals this. (See: “Why I am not a Bright” [ http://cadwaladr.blogspot.com/2011/08/why-i-am-not-bright.html ] )
