Several years ago, I did some minor volunteer work for a local sheltered workshop, a facility that gives the mentally or physically disabled the opportunity to do light assembly work for local businesses. The work typically consists of stuffing envelopes or constructing simple machines – whatever is within the capacity of this most marginal group of workers. Despite the necessary simplicity of the work involved, you shouldn’t imagine that employing these people is an act of charity. In fact, with the assistance of good organizers and a few physical aids to steady shaky hands, their output is good and their quality is often better than that of people with abler bodies and minds. They are happy to get a paycheck – to be adult, productive members of society. There is a lesson here for all of us.
As a culture, we don’t respect work much anymore. Many people, rich or poor, do their best to avoid it. The trick is to make money without working, and whether one does that by siphoning wealth out of other people’s misfortune through a hedge fund or by mooching off an undeserved public entitlement, the motive is the same. Work is for suckers. The awkward people at the sheltered workshop, struggling to assemble mailers and put little plastic parts together, are the biggest suckers of all. Not only do they work more than they have to, they have the naïve belief that it gives their lives some measure of meaning and dignity. What a bunch of retards.
Like it or not, the very basis of society is reciprocity. In our personal relationships, most of us recognize that a person who takes from us without giving anything in return is not our friend. Unless you are possessed by pathological self-loathing, you’ll do your best to avoid people whose goal is simply to use you. You will also do your best to be at least decent to the people who matter to you. In a healthy society, this simple understanding – that we have responsibilities as well as needs – is widespread, shaping most of our relationships with others. Work, in almost every society on the planet up until now, has been the default form of reciprocity. To work is to make things or do things for others. To be paid for working is to earn the right to have things made and done for you. There have always been inequalities in this exchange, just as there have always been inequalities in personal relationships, but reciprocity has always bound us together nonetheless.
A society that encourages stealing, cheating, swindling and mooching will not be a society very long. To be fair, neither will one that cannot provide enough real jobs for most of its citizens. The official unemployment rate of around 8% is a widely acknowledged joke. As the Federal government sees it, if you’ve been out of work for a few months you’re not unemployed anymore. You’re “discouraged”. You don’t count in the statistics. If you are collecting disability because you have some back pain, you are not unemployed either – even though you could work at a desk, answer a phone, stuff envelopes, or do a million other things. All of the “disabled” I know are far more “abled” than any of the people I saw working in the sheltered workshop.
The real unemployment number is hard to estimate, but it is probably more than double that laughable 8%. If all of the real unemployed got up one morning and decided to look for jobs, it is obvious that most of them would not be able to find one. We live in a society in which working and acquiring a means of support are not reliably connected things. We departed from that path a long time ago. Instead, we went down a path of increasing automation (euphemistically touted as “high worker productivity”), outsourcing, low-wage part-time service jobs, and more-or-less permanent positions on the dole. If you have no means of support you can get help from the government, which will take the money (despite what anybody tells you) from the pockets of the middle class. It is nice, of course, that we do not let people starve – but is it fair that fewer and fewer people put in all the productive effort?
If there is a solution to the current structural problems of the economy, it will have to be more imaginative than either “just get off your ass and get a job” or “let’s raise taxes on the greedy rich”. It will have to include, among other things, both the idea that work is worth doing, and the idea that satisfied employees are worth having. It is going to require people who know how to think of themselves as citizens, rather than as elites or victims.
September 3, 2013
August 19, 2013
On yachts and human beings
I spent some time recently in the harbor town of Saugatuck, Michigan. The land is awash with upscale boutiques and restaurants, and the harbor is awash in cabin cruisers and yachts. I don’t spend much time among the rich, but when I do it tends to make me reflect. I neither love nor hate them. Taken as a group, there isn’t much collective virtue there to love, but neither is there much virtue in grumbling about those who, through hard work, driving ambition, or the blind luck of heredity happen to be materially well off. It is usually better not to make sweeping judgments about other people, and it is almost never good to make self-serving ones. I stand on the shore; an edifice of fiberglass and dreams rumbles up the channel. That is all. The world is as it is.
A modern yacht, taken as an object, looks like nothing so much as an enormous wedding cake in a hurricane. It is a streamlined pile of white layers with a little man in a golf shirt standing on top. For all I know there may be women who own such objects too, but I have never seen a woman feel the need to steer one. The women are either inside or in the stern with a collection of underdressed teenagers. The teenagers are still caught up in the business of showing themselves off, rather than in displaying their limited possessions. A yacht is just a setting for this sort of human activity – a thing neither beautiful nor ugly in itself.
There is nothing very different from one yacht to the next apart from size. They are too big for their clean lines to convey any plausible sense of speed. If they are fast, they can only be so in open water – not lumbering and rumbling around in the shallows among other boats. They engage in a slow ritual, going up and down the channel. The smaller boat must make way for the larger. Thus, even in leisure there is competition. The man who proudly steers the 70-foot yacht must pull aside for the 80-foot yacht, and must feel the eyes of the shore-bound spectators shift from his boat to the even more impressive one. The captain of the 80-footer, I can only guess, lives with a quiet dread that something even bigger may come grunting into port tomorrow. This concern, I know, is not so hard a fate as struggling against an ordinary mortgage. Nor is an ordinary mortgage as bitter as the struggle for a meal that many people on the planet have to go through every day. Still, human beings have an unerring capacity to scale their emotions to their circumstances, and can be miserable amid plenty or content with only a little more than bare subsistence. To see yachts and cabin cruisers coming and going is to see only those things that people have acquired in the attempt to please themselves – they are not guarantors of anything, least of all happiness.
I watched, amused, in this intermission between my own concerns, which swell or ebb in accordance with forces I can only now and then control. I envy not the yachtsman, but the water on which he rides – which parts in his wake but always returns to its serenity. Which neither minds being the wave, nor attaches itself to its interlude of calmness. Which makes the yacht, the harbor, and the reflection of the seagull possible. Even to envy such a thing, of course, is to disturb the very peace one seeks. When I am in my right mind, I only watch, enjoy, and laugh.
Posted by
E.M. Cadwaladr
July 8, 2013
The Prospect of Nuclear Terrorism
In order to justify various activities which have undermined some of the Constitutional protections Americans once enjoyed, certain elements of the political establishment (notably Dick Cheney) have raised the terrifying prospect of nuclear terrorism. Since no one in the media, left or right, has adequately addressed the likelihood of such an event, it seems well worth a look.
Let’s begin by allowing that there are terrorists who would certainly be willing to use a nuclear weapon if they had one. I see no reason to think, given the nature of Al Qaida’s various smaller-scale actions, that the zealots of that movement would have any qualms about destroying a city in the name of their cause. Less fanatical sovereign governments have been willing to bomb and incinerate civilians by the tens of thousands, as the citizens of London, Hamburg and Tokyo discovered seventy years ago – thus, it would be deeply naïve to believe that Islamic extremists would show moral restraint in this regard. The restraint the nuclear powers have shown over the last seven decades has had more to do with self-preservation than morality. The doctrine of mutual assured destruction (MAD) has worked. The nuclear powers have understood that starting a nuclear war is likely to bring about their own demise. It is not clear that stateless terrorists, scattered across the globe and willing, in many cases, to die for their cause, are deterred by anything. So, we can dispense with the question of willingness and proceed to the question of capability.
A scenario often cited or implied is one in which a rogue nuclear state – Iran, or more imaginatively, North Korea – might supply a terrorist organization with a bomb. While this might make an interesting movie plot, there are reasons to believe it would not be likely. Consider, to begin with, that nuclear weapons are a hard-won technology that states develop at enormous expense. Not only is there a huge material and financial cost, but also the diplomatic and economic hardship of possible sanctions. At the end of the long development process, middle-tier nations take years to produce even a small stockpile of bombs, perhaps sufficient to survive an enemy airstrike and thus provide a viable deterrent. Is it plausible that a nation, having suffered much to attain such weapons, would turn any number of them over to a vaguely sympathetic organization of fanatics over which that nation exercises little control? While it is at least conceivable that a council of radical mullahs in Iran or the “dear leader” of North Korea might decide to employ a bomb to please Allah or fulfill some twisted notion of personal destiny, it is very unlikely they would turn the matter over to some even more unstable middlemen. If a bomb explodes in Tel Aviv or New York, it isn’t likely the originator of the device will go long without suffering retaliation in kind. The origin of these devices can be deduced with considerable technical accuracy, and even if it couldn’t, there would be an enormous impulse on the part of the victim to lash out at the most likely suspects. Thus, while Iran might conceivably use a bomb, handing one over to Hamas would gain them nothing. Doing so would also entail secondary risk. He who has the bomb has power, and there is always the risk a nuclear-equipped terrorist might turn that power against his own benefactors.
It is notable that, in the entire history of the atomic age, every nation has produced its own weapons – and even such close allies as the US and the UK have shared the actual weapon technology only very sparingly and reluctantly.
A more plausible scenario in which stateless terrorists acquire the bomb, or the fissionable uranium to make a bomb, is based on the collapse of an existing nuclear power. This happened when the Soviet Union collapsed, and could happen if Pakistan disintegrates in civil war. In the former case, while no whole weapons went missing (as far as open sources can tell us), a considerable amount of fissionable material was smuggled out, and only some of it has been retrieved. What will happen in Pakistan in the next few years is anybody’s guess. One can only hope they manage to hold their shaky state together somehow. Still, as real as such possible threats might be, it is far from obvious that they are substantially mitigated by dismantling the civil rights of Americans within the borders of the United States. If 100 kilos of marijuana can be smuggled into the country by sea, smuggling a bomb across the border should be only slightly more difficult. Once here, no further communication traffic of the kind the NSA routinely intercepts would be required. In other words, while the threat is plausible, the efficacy of the intelligence countermeasures deemed necessary for our safety is marginal. Surely, a focus on intelligence gathering closer to the sources of the fissionable material would do more good.
In the final analysis, it is true that an utterly closed police state would be all but immune to terrorism – nuclear or otherwise. The Soviet Union, throughout its illustrious history, had little to fear from even the most dedicated groups of foreign radicals. A xenophobic police state is, however, a very high price to pay for security.
Posted by
E.M. Cadwaladr
July 1, 2013
In defense of culture
We live in an era in which immigration, urbanization, the lingering effects of the 1960s counterculture, government policy, and various other factors are driving rapid changes in American society. These changes have not distributed themselves uniformly across the country, but vary by region, by the urban-rural divide, by race, ethnicity, and age. America is now, in effect, an assemblage of separate nations. I am skeptical that there ever was a truly unified American identity, but it is obvious that there isn’t one now. The old metaphor of the melting pot – in which, after a generation or two, immigrants assimilated themselves to the general culture – is certainly not currently applicable. Rather, in contemporary America, people fracture along a variety of lines. Even people of a common general background now find themselves in different and often antithetical cultures.
So – what is a culture, really?
A culture is a set of constraints imposed on the individual as a prerequisite for acceptance in a given group.
Some cultural constraints are the product of environmental conditions. Traditional Inuit culture, for example, is characterized by a high degree of patience and cheerfulness. If you live in close proximity to your family in an igloo for several months of the year, moodiness and negativity are likely to have disastrous consequences. Inuit must either be nice or suffer miserably with one another. The famous politeness and formality of the Japanese, however, has little to do with the mountains and climate of Japan. Rather, the politeness and formality of the Japanese is essentially arbitrary. Their culture carries these traits by the usual mechanisms of societal tradition and individual upbringing, but Japan could just as easily have been populated by a race of rude, obnoxious slobs. For the actual Japanese, politeness and formality are self-defining rather than physically necessary characteristics.
Very many cultural constraints, perhaps even most of them, are arbitrary. For example, apart from climatic considerations at higher latitudes, there is no physical reason that we couldn’t all be nudists. The fact that some hunter-gatherer groups do live without clothing proves conclusively that it is possible. Still, if you live in virtually any modern nation, east or west, you probably find the notion of universal nudism rather unpleasant. You really wouldn’t want to see your ugly neighbors mowing the lawn or standing in a checkout line in the nude, and, unless you are a deviant, you would not want to dispense with all of your own clothes either. You feel this way because you have certain arbitrary cultural standards. You have been taught to believe certain behaviors are acceptable while others aren’t – and you know that the great majority of others in your society share the same beliefs. Wearing clothing, often quite specific clothing, is a defining feature of your culture. While it is not a physical necessity, it is a social necessity.
The consequences of abrogating social constraints are many and varied, from mild rebuke to capital punishment, but one broad consequence is a constant. If you violate a cultural taboo, you weaken the bonds you have with other members of your group. A Japanese who decides to be a rude, obnoxious slob gives up a large share of his Japaneseness. To share a culture with another person is precisely to share certain fairly reliable expectations about one another’s possible behaviors. It may be quite possible for you to attend your aunt Margaret’s funeral in the nude, but everyone else in attendance is likely to become rather anxious about what you might do next. You will have branded yourself as someone who cannot be expected to behave within the usual social bounds. You will be treated as irresponsible, erratic, and mentally unstable.
Multiculturalism, as a culture in itself, makes hash of any coherent system of social limitations. Rational people can learn a certain degree of tolerance – which is to say, to accept a rather broad range of behaviors or at least suppress one’s learned revulsion to certain things – but one cannot accept two or more incompatible cultural systems simultaneously. Contrary to what multiculturalists might hope, one cannot find homosexuality acceptable and also find Islam, which considers homosexuality a serious or even fatal transgression, acceptable. You can find one, or neither, acceptable – but not both. When the Saudi or Iranian government puts a human being to death for the crime of homosexuality you cannot consider it a tragic injustice and a reasonable expression of another culture’s beliefs at the same time. What multiculturalists usually do to defend their worldview is to either ignore those aspects of reality that make their position self-contradictory, or to gloss them over with comforting narrative. I once heard of a feminist who argued that the burka – the bag-like garment under which many Muslim women are compelled to conceal themselves – should be properly viewed as a symbol of female empowerment. This would be like calling the shackles that bound a chain-gang together instruments for promoting Afro-American unity, or like calling Buchenwald a weight-loss spa for Jews.
Multiculturalism, as actually practiced in America, suffers the further problem of simple hypocrisy. To continue with the convenient example above, it is irrational to stretch one’s tolerance of one conservative religion, Islam, to absurd lengths – but show so little tolerance for the much less militant followers of Catholicism.1 How can tolerance of some cultures and intolerance of others be multiculturalism in any robustly meaningful sense? If a person is intolerant of relatively benign cultures nearby, but tolerant of hostile cultures he or she rarely encounters, that isn’t much different from being intolerant in general. I can say that I am tolerant of the culture of New Guinea headhunters – but that’s not really much of a claim if I don’t live in New Guinea.
If we attempt to examine the phenomenon of culture as objectively as possible, it should be plain that any viable culture must have at least two traits. First, a culture must have a set of characteristics that define it. A culture has to be something other than a boundary drawn around some random group of human beings. It can be defined by subjectively good characteristics (such as patience) or subjectively nasty ones (such as racism) but it must have some consistent nature – something to bind its members in a sense of solidarity. Second, perhaps as a corollary to the first, a culture must have a means of recognizing what a non-member is – or, perhaps more to the point, of recognizing what an enemy is. Islam, it happens, does both of these things extremely well. Muslims know what being a Muslim means. They know what they should and should not do, what observances they need to make, and what they are prescribed to believe. Likewise, they know what is not Islam, what threatens Islam, and they are unapologetic in this knowledge. Western multiculturalists, on the other hand, have no idea who they are. By attempting to throw their arms around a large assortment of alien cultures, they leave themselves without any common characteristics. The multiculturalist cannot even claim tolerance as a dogma, for the reasons I’ve already outlined: some of the people they’ve included under their imaginary umbrella are decidedly intolerant, and most multiculturalists harbor hatreds of their own conservative brethren. Further, the multiculturalist, in a fantasy of universal inclusion, finds it impossible to fully reconcile himself (or herself) to the very concept of an out-group, let alone an enemy. The political and religious conservatives of their own ethnicity can be the target of enmity precisely because they share a common background with the multiculturalist. Conservatives are a sort of alter-ego, symbolic of what the multiculturalist strives to reject. 
True enemies, outsiders who despise the multicultural anti-culture for its flaccid tolerance and amorality, must be embraced, placated, or imagined not to exist.
A very recent example of an in-group / out-group disconnect is to be found in the case of Paula Deen, the once-popular cooking show host. Deen was discovered to have used the word “nigger” in a private conversation many years ago, and, as a consequence, was dropped by her network and is being actively harassed by MSNBC and others. I don’t doubt that many people find the “n-word” offensive, but why is it any less offensive when wielded casually by hip-hop performers in an entirely public context? Deen is vilified because, as a successful, Christian, heterosexual Caucasian, she is perceived as part of the evil conservative alter-ego that must be constrained. This, despite her actual political affiliations. Hip-hop performers, as symbolic victims of conservative oppression, are exempt from the standards of political correctness that apply to Deen.
To be clear, I am not saying that some degree of tolerance isn’t nice, or that traditional standards are necessarily laudable. What I am saying is that in a conflict between cultures, one that is coherent and cohesive has a substantial advantage over one that is inconsistent and heterogeneous. I’m not saying xenophobia is good – I’m saying xenophobia has often proven successful.
Another thing I am not doing is fully equating western liberalism with multiculturalism. The latter is a dominant meme of the former, but the two are not synonymous. In America, at least, the government has usually been able to recognize out-groups so long as they remained beyond the US border. There are interesting parallels between 19th century gunboat diplomacy and the policy of drone warfare. Ethical considerations aside, American political leadership is not completely unwilling, even now, to conduct international affairs with blunt instruments in the time-honored human way. What our leaders do not seem to understand anymore is what the in-group is – or, more romantically, what a citizen is.
One need look no further than the illegal immigration problem to grasp the growing irrelevance of American citizenship. Many Americans, some of them in positions of high authority, take it for granted that the circumstances and sufferings of illegal aliens are our responsibility. Being within the US border makes them automatically part of the in-group. Imagine you were to entertain this attitude on a personal basis. What if a squatter entered your house through an open window, sat down on your couch and asked, in an alien language, to be treated as one of the family? Wouldn’t you ask him to leave? Wouldn’t you call the police if he refused? Would you feel the slightest guilt to see him ejected? Almost all of us would be indignant at the effrontery. Could you imagine sneaking across the border of a foreign country, with no intention of learning that country’s language or adapting to that country’s culture – but simply showing up there with the expectation of carving out a niche?
Immigrants can approach their relocation to a new country in one of two ways. First, they can assimilate – adopting the language, customs and other cultural aspects of their new country. With some notable exceptions, this is what most 19th century immigrants to the United States eventually did, if not fully in the first generation then certainly in the second.2 Alternatively, immigrants can colonize – which is to say, they can settle in discrete and permanent enclaves, keeping their old culture and rejecting that of the natives. 19th century European colonists in Africa and Asia obviously did this, remaining British, French, Belgians, Germans and Italians in new exotic surroundings. Muslim immigrants in America and Europe have done largely the same, not merely rejecting but often despising the cultures of the countries they inhabit. Recent waves of Latinos, too, seem reluctant to assimilate – although it remains to be seen whether or not they will follow the path of the Irish in time. Language may be key in this process – both as an indicator of an immigrant’s intentions and as an actual cultural barrier. It is obvious that a person who moves to another country with no intention of learning the language has no intention of assimilating either, but intends to live in an enclave with fellow members of the same culture. When the dominant culture accommodates multilingualism it actually encourages a continued and entrenched sense of separateness, aiding and abetting a sort of neo-colonialism in reverse. Add to such excessive accommodation the nebulosity of the multiculturalist anti-culture itself, and assimilation becomes all the more unlikely. Which is easier for the immigrant: to learn a new language and adopt a new culture with a bewildering lack of defining characteristics, or to keep the old language and old customs in a new context?
The multiculturalist may consider his or her society a cornucopia of attractive possibilities, but it can equally be perceived as a weak and decadent mess with something for everyone to find revolting.
One point I have already alluded to needs emphasis. Considering the remorseless Darwinian processes that actually shape history, it may not be the best educated or intellectually sophisticated culture that endures. A few years ago there was a minor stir over whether or not President Obama believed in American exceptionalism. When asked, his somewhat nuanced answer amounted to “no.” Broadly speaking, “no” is the “correct” answer. All nations have a certain uniqueness within the greater context of history – some are more powerful than others; some put more emphasis on particular rights; others are more elaborate in their artistic expression; etc. Most human beings love their own nation (if perhaps not their own government) more than any other. In this context, American exceptionalism seems little more than an expression of one particular bias. To understand that we are nothing intrinsically special is a rational achievement. It is also a serious impediment to cultural cohesion. People can identify with a high ideal, even a fictitious one, in a way that they cannot identify with a prosaic fact. Perhaps it is naïve to stand in reverence to Ronald Reagan’s “shining city on a hill” – but it is hard to imagine anyone standing in reverence to a historically accidental superpower whose behavior is sometimes good and sometimes bad. Even the narrow aspects of multiculturalism that are actually rational are culturally corrosive. Many people have died for Christianity, Islam, France, Japan, and even for Communism or Nazism – but no one ever risked life or limb for the greater glory of relativism. To the contrary – cynicism and intellectual sophistication may well go hand-in-hand. Much of the corruption we now see in government may simply be the product of a progressive erosion of coherent cultural standards and myth-infused ideals.
Anti-intellectual as the above argument may sound, it is by no means self-evidently false. No law of nature promises us a stable society at the end of our efforts to sort the world out objectively. It may well be that our pursuit of open inquiry will turn out disastrously in the end. I am not suggesting that we surrender ourselves to tradition and superstition, but I am pointing out that tradition and superstition do yield certain strengths which may turn out to be necessary to cultural survival. It is not by accident that they have survived.3
-----------------------------------------------------------
1 Not all cultures are religions, but all religions are cultures.
2 The Amish did not, of course. The Jews retain a considerable distinctiveness although they have adapted to a high degree. Blacks were handicapped from assimilating through the peculiarities of their history. The Irish required more time than most groups, but now retain only a nominal separate identity.
3 There is a further irony here. Progressive anti-theists, like Richard Dawkins and Daniel Dennett, make the implicit assumption that an empirical pursuit of truth will, in fact, continue to make a better and more stable society. While one could produce a body of interesting if inconclusive evidence to support this, they tend to just take the assertion for granted. The belief is so deeply imbedded in their particular culture that it functions very much like an article of faith. Theirs is a culture that excludes religions and seeks to convert everyone to a certain epistemic schema – while rarely bothering to apply that schema to their own core principles. Thus, they have a dogma despite themselves. Unfortunately for them, it is not a very unifying dogma. The low number of people willing to self-identify as “brights” reveals this. (See: “Why I am not a Bright” [ http://cadwaladr.blogspot.com/2011/08/why-i-am-not-bright.html ] )
So – what is a culture, really?
A culture is a set of constraints imposed on the individual as a prerequisite for acceptance in a given group.
Some cultural constraints are the product of environmental conditions. Traditional Inuit culture, for example, is characterized by a high degree of patience and cheerfulness. If you live in close proximity to your family in an igloo for several months of the year, moodiness and negativity are likely to have disastrous consequences. Inuit must either be nice or suffer miserably with one another. The famous politeness and formality of the Japanese, however, have little to do with the mountains and climate of Japan. Rather, the politeness and formality of the Japanese are essentially arbitrary. Their culture carries these traits by the usual mechanisms of societal tradition and individual upbringing, but Japan could just as easily have been populated by a race of rude, obnoxious slobs. For the actual Japanese, politeness and formality are self-defining rather than physically necessary characteristics.
Very many cultural constraints, perhaps even most of them, are arbitrary. For example, apart from climatic considerations at higher latitudes, there is no physical reason that we couldn’t all be nudists. The fact that some hunter-gatherer groups do live without clothing proves conclusively that it is possible. Still, if you live in virtually any modern nation, east or west, you probably find the notion of universal nudism rather unpleasant. You really wouldn’t want to see your ugly neighbors mowing the lawn or standing in a checkout line in the nude, and, unless you are a deviant, you would not want to dispense with all of your own clothes either. You feel this way because you have certain arbitrary cultural standards. You have been taught to believe certain behaviors are acceptable while others aren’t – and you know that the great majority of others in your society share the same beliefs. Wearing clothing, often quite specific clothing, is a defining feature of your culture. While it is not a physical necessity, it is a social necessity.
The consequences of abrogating social constraints are many and varied, from mild rebuke to capital punishment, but one broad consequence is a constant. If you violate a cultural taboo, you weaken the bonds you have with other members of your group. A Japanese who decides to be a rude, obnoxious slob gives up a large share of his Japaneseness. To share a culture with another person is precisely to share certain fairly reliable expectations about one another’s possible behaviors. It may be quite possible for you to attend your aunt Margaret’s funeral in the nude, but everyone else in attendance is likely to become rather anxious about what you might do next. You will have branded yourself as someone who cannot be expected to behave within the usual social bounds. You will be treated as irresponsible, erratic, and mentally unstable.
Multiculturalism, as a culture in itself, makes hash of any coherent system of social limitations. Rational people can learn a certain degree of tolerance – which is to say, to accept a rather broad range of behaviors or at least suppress one’s learned revulsion to certain things – but one cannot accept two or more incompatible cultural systems simultaneously. Contrary to what multiculturalists might hope, one cannot find homosexuality acceptable and also find Islam, which considers homosexuality a serious or even fatal transgression, acceptable. You can find one, or neither, acceptable – but not both. When the Saudi or Iranian government puts a human being to death for the crime of homosexuality you cannot consider it a tragic injustice and a reasonable expression of another culture’s beliefs at the same time. What multiculturalists usually do to defend their worldview is to either ignore those aspects of reality that make their position self-contradictory, or to gloss them over with comforting narrative. I once heard of a feminist who argued that the burka – the bag-like garment under which many Muslim women are compelled to conceal themselves – should be properly viewed as a symbol of female empowerment. This would be like calling the shackles that bound a chain-gang together instruments for promoting Afro-American unity, or like calling Buchenwald a weight-loss spa for Jews.
Multiculturalism, as actually practiced in America, suffers the further problem of simple hypocrisy. To continue with the convenient example above, it is irrational to stretch one’s tolerance of one conservative religion, Islam, to absurd lengths – but to show so little tolerance for the much less militant followers of Catholicism.1 How can tolerance of some cultures and intolerance of others be multiculturalism in any robustly meaningful sense? If a person is intolerant of relatively benign cultures nearby, but tolerant of hostile cultures he or she rarely encounters, that isn’t much different from being intolerant in general. I can say that I am tolerant of the culture of New Guinea headhunters – but that’s not really much of a claim if I don’t live in New Guinea.
If we attempt to examine the phenomenon of culture as objectively as possible, it should be plain that any viable culture must have at least two traits. First, a culture must have a set of characteristics that define it. A culture has to be something other than a boundary drawn around some random group of human beings. It can be defined by subjectively good characteristics (such as patience) or subjectively nasty ones (such as racism) but it must have some consistent nature – something to bind its members in a sense of solidarity. Second, perhaps as a corollary to the first, a culture must have a means of recognizing what a non-member is – or, perhaps more to the point, of recognizing what an enemy is. Islam, it happens, does both of these things extremely well. Muslims know what being a Muslim means. They know what they should and should not do, what observances they need to make, and what they are prescribed to believe. Likewise, they know what is not Islam, what threatens Islam, and they are unapologetic in this knowledge. Western multiculturalists, on the other hand, have no idea who they are. By attempting to throw their arms around a large assortment of alien cultures, they leave themselves without any common characteristics. The multiculturalist cannot even claim tolerance as a dogma, for the reasons I’ve already outlined: some of the people they’ve included under their imaginary umbrella are decidedly intolerant, and most multiculturalists harbor hatreds of their own conservative brethren. Further, the multiculturalist, in a fantasy of universal inclusion, finds it impossible to fully reconcile himself (or herself) to the very concept of an out-group, let alone an enemy. The political and religious conservatives of their own ethnicity can be the target of enmity precisely because they share a common background with the multiculturalist. Conservatives are a sort of alter-ego, symbolic of what the multiculturalist strives to reject. 
True enemies, outsiders who despise the multicultural anti-culture for its flaccid tolerance and amorality, must be embraced, placated, or imagined not to exist.
A very recent example of an in-group / out-group disconnect is to be found in the case of Paula Deen, the once-popular cooking show host. Deen was discovered to have used the word “nigger” in a private conversation many years ago, and, as a consequence, was dropped by her network and is being actively harassed by MSNBC and others. I don’t doubt that many people find the “n-word” offensive, but why is it any less offensive when wielded casually by hip-hop performers in an entirely public context? Deen is vilified because, as a successful, Christian, heterosexual Caucasian, she is perceived as part of the evil conservative alter-ego that must be constrained. This, despite her actual political affiliations. Hip-hop performers, as symbolic victims of conservative oppression, are exempt from the standards of political correctness that apply to Deen.
To be clear, I am not saying that some degree of tolerance isn’t nice, or that traditional standards are necessarily laudable. What I am saying is that in a conflict between cultures, one that is coherent and cohesive has a substantial advantage over one that is inconsistent and heterogeneous. I’m not saying xenophobia is good – I’m saying xenophobia has often proven successful.
Another thing I am not doing is fully equating western liberalism with multiculturalism. The latter is a dominant meme of the former, but the two are not synonymous. In America, at least, the government has usually been able to recognize out-groups so long as they remained beyond the US border. There are interesting parallels between 19th century gunboat diplomacy and the policy of drone warfare. Ethical considerations aside, American political leadership is not completely unwilling, even now, to conduct international affairs with blunt instruments in the time-honored human way. What our leaders do not seem to understand anymore is what the in-group is – or, more romantically, what a citizen is.
One need look no further than the illegal immigration problem to grasp the growing irrelevance of American citizenship. Many Americans, some of them in positions of high authority, take it for granted that the circumstances and sufferings of illegal aliens are our responsibility. Being within the US border makes them automatically part of the in-group. Imagine entertaining this attitude on a personal basis. What if a squatter entered your house through an open window, sat down on your couch and asked, in an alien language, to be treated as one of the family? Wouldn’t you ask him to leave? Wouldn’t you call the police if he refused? Would you feel the slightest guilt to see him ejected? Almost all of us would be indignant at the effrontery. Could you imagine sneaking across the border of a foreign country, with no intention of learning that country’s language or adapting to that country’s culture – but simply showing up there with the expectation of carving out a niche?
Immigrants can approach their relocation to a new country in one of two ways. First, they can assimilate – adopting the language, customs and other cultural aspects of their new country. With some notable exceptions, this is what most 19th century immigrants to the United States eventually did, if not fully in the first generation then certainly in the second.2 Alternatively, immigrants can colonize – which is to say, they can settle in discrete and permanent enclaves, keeping their old culture and rejecting that of natives. 19th century European colonists to Africa and Asia obviously did this, remaining British, French, Belgians, Germans and Italians in new exotic surroundings. Muslim immigrants in America and Europe have done largely the same, not merely rejecting but often despising the cultures of the countries they inhabit. Recent waves of Latinos, too, seem reluctant to assimilate – although it remains to be seen whether or not they will follow the path of the Irish in time. Language may be key in this process – both as an indicator of an immigrant’s intentions and as an actual cultural barrier. It is obvious that a person who moves to another country with no intention of learning the language has no intention of assimilating either, but intends to live in an enclave with fellow members of the same culture. When the dominant culture accommodates multilingualism it actually encourages a continued and entrenched sense of separateness, aiding and abetting a sort of neo-colonialism in reverse. Add to such excessive accommodation the nebulosity of the multiculturalist anti-culture itself, and assimilation becomes all the more unlikely. Which is easier for the immigrant: to learn a new language and adopt a new culture with a bewildering lack of defining characteristics, or to keep the old language and old customs in a new context? 
The multiculturalist may consider his or her society a cornucopia of attractive possibilities, but it can equally be perceived as a weak and decadent mess with something for everyone to find revolting.
One point I have already alluded to needs emphasis. Considering the remorseless Darwinian processes that actually shape history, it may not be the best educated or intellectually sophisticated culture that endures. A few years ago there was a minor stir over whether or not President Obama believed in American exceptionalism. When asked, his somewhat nuanced answer amounted to “no.” Broadly speaking, “no” is the “correct” answer. All nations have a certain uniqueness within the greater context of history – some are more powerful than others; some put more emphasis on particular rights; others are more elaborate in their artistic expression; etc. Most human beings love their own nation (if perhaps not their own government) more than any other. In this context, American exceptionalism seems little more than an expression of one particular bias. To understand that we are nothing intrinsically special is a rational achievement. It is also a serious impediment to cultural cohesion. People can identify with a high ideal, even a fictitious one, in a way that they cannot identify with a prosaic fact. Perhaps it is naïve to stand in reverence to Ronald Reagan’s “shining city on a hill” – but it is hard to imagine anyone standing in reverence to a historically accidental superpower whose behavior is sometimes good and sometimes bad. Even the narrow aspects of multiculturalism that are actually rational are culturally corrosive. Many people have died for Christianity, Islam, France, Japan, and even for Communism or Nazism – but no one ever risked life or limb for the greater glory of relativism. To the contrary – cynicism and intellectual sophistication may well go hand-in-hand. Much of the corruption we now see in government may simply be the product of a progressive erosion of coherent cultural standards and myth-infused ideals.
Anti-intellectual as the above argument may sound, it is by no means self-evidently false. No law of nature promises us a stable society at the end of our efforts to sort the world out objectively. It may well be that our pursuit of open inquiry will turn out disastrously in the end. I am not suggesting that we surrender ourselves to tradition and superstition, but I am pointing out that tradition and superstition do yield certain strengths which may turn out to be necessary to cultural survival. It is not by accident that they have survived.3
-----------------------------------------------------------
1 Not all cultures are religions, but all religions are cultures.
2 The Amish did not, of course. The Jews retain a considerable distinctiveness although they have adapted to a high degree. Blacks were handicapped from assimilating through the peculiarities of their history. The Irish required more time than most groups, but now retain only a nominal separate identity.
3 There is a further irony here. Progressive anti-theists, like Richard Dawkins and Daniel Dennett, make the implicit assumption that an empirical pursuit of truth will, in fact, continue to make a better and more stable society. While one could produce a body of interesting if inconclusive evidence to support this, they tend to just take the assertion for granted. The belief is so deeply imbedded in their particular culture that it functions very much like an article of faith. Theirs is a culture that excludes religions and seeks to convert everyone to a certain epistemic schema – while rarely bothering to apply that schema to their own core principles. Thus, they have a dogma despite themselves. Unfortunately for them, it is not a very unifying dogma. The low number of people willing to self-identify as “brights” reveals this. (See: “Why I am not a Bright” [ http://cadwaladr.blogspot.com/2011/08/why-i-am-not-bright.html ] )
Posted by E.M. Cadwaladr, June 18, 2013
The high cost of a false sense of security
For the last week I have been listening to establishment politicians, Republican and Democrat, attempting to justify the NSA’s PRISM program [ http://en.wikipedia.org/wiki/PRISM_(surveillance_program) ] on the grounds that it helps prevent acts of terrorism. No doubt it does. What we should be asking ourselves, however, is how much security we are gaining for the surrender of both our immediate privacy and the possible eventual surrender of the rest of our freedom.
The total number of American civilians killed by Al Qaida-related terrorists thus far has amounted to a little over 3000 – almost all of them killed during the attacks of 9/11. The US population stands at over 300,000,000, so, in rough terms, your odds of having been an American civilian killed by Al Qaida terrorists stand at about 1-in-100,000. To put this into perspective, nearly 500,000 Americans have died in auto accidents since 2001 – about 1-in-600 of the population.
If we assume the elimination of PRISM and other forms of mass surveillance would increase our individual risk by a factor of ten – to 1-in-10,000 – I would still quite willingly accept that risk against the unknown but historically plausible risk of seeing our republic degenerate into a totalitarian police state. Against that latter risk, Edward Snowden has staked his life. Like Patrick Henry, he has thrown down the gauntlet to the rest of us – “Give me liberty, or give me death.”
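The back-of-envelope odds above are easy to reproduce. A minimal check, using the essay’s own round figures rather than official statistics:

```python
# Rough odds calculation using the essay's round numbers (not official data).
us_population = 300_000_000
terror_deaths = 3_000      # approx. American civilians killed by Al Qaida-related terrorism
auto_deaths = 500_000      # approx. US auto fatalities since 2001

terror_odds = us_population / terror_deaths   # one death per ~100,000 people
auto_odds = us_population / auto_deaths       # one death per ~600 people

print(f"Terrorism: about 1-in-{terror_odds:,.0f}")      # -> 1-in-100,000
print(f"Auto accidents: about 1-in-{auto_odds:,.0f}")   # -> 1-in-600
```

Even the hypothetical tenfold increase in risk discussed below (1-in-10,000) would still leave terrorism an order of magnitude less lethal, per capita, than driving.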
Posted by E.M. Cadwaladr, June 14, 2013
Science, Authority and Narrative (oh my!)
It is common for human beings to mistake narrative for fact. We are often attracted to explanations that actually have little going for them except some measure of coherence. To illustrate this for yourself, consider any of the major, popular religions that you don’t happen to believe in. In such a religion, you will find a deity (or deities) with an interesting but largely unverifiable history, possessing vast powers that can either be attributed to normal physical causes or, at least, to the powers of someone else’s deities. Religion is one kind of narrative – one attempt to explain the behavior of the world. Whatever you happen to believe, you have to admit that many people in the world have erroneous beliefs.1 Even atheists and theists can at least agree that they cannot both be right.
I used to think, rather naïvely, that science was a cure for the ailment of magical narrative thinking. It is not. To show this I will need to define “science” for the purposes of my argument – or perhaps to isolate the abstraction of science from the institutional practice of science. Much worthwhile effort has already been made in this area by Thomas Kuhn and others. I don’t intend to spend much time reiterating the work of examining the internal processes of science; rather, I am interested primarily in the way that science (and scientists) influence what non-scientists believe.
Let’s begin with the abstract notion of science. Dig into any text on the philosophy of science, and you’ll find out pretty quickly that “science” is surprisingly difficult to define. I’m going to dispense with most of the subtleties and propose a rough definition that works for my purposes:
Science is the formalized process of determining the truth of assertions solely by empirical means.
Let’s parse this.
It is important to note that science is a formalized process to distinguish it from all sorts of common behavior. When one of my cats gets hungry and walks to the kitchen to examine his bowl for food, he is testing a mental proposition by empirical means – but I wouldn’t say he’s doing science. Science is formal in the sense that scientists are conscious not only of the subject of their inquiries, but also of a set of rules stipulating how those inquiries may be conducted. To do science one must have rules regarding what constitutes acceptable evidence, and consciously and consistently adhere to those rules.
Science is a pursuit of truth.2 To pursue truth scientifically is to presume that a context-independent form of truth exists. It is to assume there are states of affairs in nature which are not subject to our biases and desires, even if they are sometimes beyond the reach of our unaided senses.
Lastly, science in the abstract is a method of determining truth solely by empirical means – observation (whether direct or augmented with instruments), measurement, and experiment. Scientific truth is the antithesis of belief arrived at through persuasion, or belief based on acquiescence to accepted authority. To put this another way, scientific truth is that truth based on actual observation of events, rather than on traditional belief or other social mechanisms.3
Given this definition of what science is ideally, two problems are apparent with the proposition that science has supplanted narrative or magical beliefs in contemporary society. The first is that the systems by which scientists themselves actually operate tend to deviate from the ideal. The second is that the public’s faith in science is itself unscientific. I will address the second first.
It is safe to say that the average person who venerates science is not very knowledgeable about the accumulated body of scientific knowledge. Rather, the typical admirer of science knows a few interesting scientific facts and has a sort of vague notion of something popularly called “the scientific method.” What most people believe, really, is that a faith in science is justified because technology works – the two being bound together in the public imagination. In earlier times, when technology was not a transformative force in ordinary people’s lives, the general public had little awareness of science. They respect it now because it conjures iPads and other nifty shiny things into being from time to time. People would respect wizards or witchdoctors who had the same capability. Their faith in science as a whole stems from the great practical successes in physics and chemistry. That this faith has spilled over into fields like psychology and sociology has little to do with great successes in those fields, but depends on a generalization of scientists as a class. Once the witches and priests worked miracles; now the Ph.D.s do. Popular belief is not in a body of facts, but in a body of people who use special words, work in special places, and have special titles. While such belief is not altogether divorced from empirical justification, it rests on the same sort of authoritarian trappings any medieval peasant would have recognized.
The other problem with science as a working institution – the problem that the practice of science is corrupted by non-empirical factors – is best illustrated with a few examples.
Consider Climategate, a good summary of which appears at the following link: http://www.guardian.co.uk/environment/2010/jul/07/climate-emails-question-answer . An even shorter summary of Climategate is that an important body of climate scientists were engaged in restricting access to data that might throw doubt on the man-made global warming hypothesis, and at least discussed the possibility of smearing their critics in the press rather than attempting to refute them scientifically. While a significant portion of the general public believes that the Climategate emails prove that the man-made global warming hypothesis is false, the published evidence actually does not accomplish that. A scientific theory is not disproved by showing that some of its proponents were practicing bad science. A scientific theory can only be utterly disproven in the same way that it can be proven – which is to say, with empirically derived facts. However, by showing that a body of scientists were willing to defend their theory in explicitly non-scientific ways, it does throw reasonable doubt on the credibility of contemporary science as a public institution. This is really neither new nor surprising. If you believe that science is immune to political influence, Google the name Trofim Lysenko. In fact, scientists are human beings, subject to the influence of their particular cultures, the social acceptance of their peers, and the power of the educational, commercial or political institutions that support them. When their inquiries have obvious social consequences, empirical evidence ceases to be the sole requisite of what they promote as scientific truth.
I used to think, rather naïvely, that science was a cure for the ailment of magical narrative thinking. It is not. To show this I will need to define “science” for the purposes of my argument – or perhaps to isolate the abstraction of science from the institutional practice of science. Much worthwhile effort has already been made in this area by Thomas Kuhn and others. I don’t intend to spend much time reiterating the work of examining the internal processes of science; rather, I am interested primarily in the way that science (and scientists) influence what non-scientists believe.
Let’s begin with the abstract notion of science. Dig into any text on the philosophy of science, and you’ll find out pretty quickly that “science” is surprisingly difficult to define. I’m going to dispense with most of the subtleties and propose a rough definition that works for my purposes:
Science is the formalized process of determining the truth of assertions solely by empirical means.
Let’s parse this.
It is important to note that science is a formalized process, which distinguishes it from all sorts of common behavior. When one of my cats gets hungry and walks to the kitchen to examine his bowl for food, he is testing a mental proposition by empirical means – but I wouldn’t say he’s doing science. Science is formal in the sense that scientists are conscious not only of the subject of their inquiries, but also of a set of rules stipulating how those inquiries may be conducted. To do science one must have rules regarding what constitutes acceptable evidence, and consciously and consistently adhere to those rules.
Science is a pursuit of truth.2 To pursue truth scientifically is to presume that a context-independent form of truth exists. It is to assume there are states of affairs in nature which are not subject to our biases and desires, even if they are sometimes beyond the reach of our unaided senses.
Lastly, science in the abstract is a method of determining truth solely by empirical means – observation (whether direct or augmented with instruments), measurement, and experiment. Scientific truth is the antithesis of belief arrived at through persuasion, or belief based on acquiescence to accepted authority. To put this another way, scientific truth is that truth based on actual observation of events, rather than on traditional belief or other social mechanisms.3
Given this definition of what science is ideally, two problems are apparent with the proposition that science has supplanted narrative or magical beliefs in contemporary society. The first is that the systems by which scientists themselves actually operate tend to deviate from the ideal. The second is that the public’s faith in science is itself unscientific. I will address the second first.
It is safe to say that the average person who venerates science is not very knowledgeable about the accumulated body of scientific knowledge. Rather, the typical admirer of science knows a few interesting scientific facts and has a sort of vague notion of something popularly called “the scientific method.” What most people believe, really, is that a faith in science is justified because technology works – the two being bound together in the public imagination. In earlier times, when technology was not a transformative force in ordinary people’s lives, the general public had little awareness of science. They respect it now because it conjures iPads and other nifty shiny things into being from time to time. People would respect wizards or witchdoctors who had the same capability. Their faith in science as a whole stems from the great practical successes in physics and chemistry. That this faith has spilled over into fields like psychology and sociology has little to do with great successes in those fields, but depends on a generalization of scientists as a class. Once the witches and priests worked miracles; now the Ph.D.s do. Popular belief is not in a body of facts, but in a body of people who use special words, work in special places, and have special titles. While such belief is not altogether divorced from empirical justification, it rests on the same sort of authoritarian trappings any medieval peasant would have recognized.
The other problem with science as a working institution – the problem that the practice of science is corrupted by non-empirical factors – is best illustrated with a few examples.
Consider Climategate, a good summary of which appears at the following link: http://www.guardian.co.uk/environment/2010/jul/07/climate-emails-question-answer . An even shorter summary of Climategate is that an important body of climate scientists were engaged in restricting access to data that might throw doubt on the man-made global warming hypothesis, and at least discussed the possibility of smearing their critics in the press rather than attempting to refute them scientifically. While a significant portion of the general public believes that the Climategate emails prove that the man-made global warming hypothesis is false, the published evidence actually does not accomplish that. A scientific theory is not disproved by showing that some of its proponents were practicing bad science. A scientific theory can only be conclusively disproven in the same way that it can be proven – which is to say, with empirically derived facts. However, the spectacle of a body of scientists willing to defend their theory in explicitly non-scientific ways does throw reasonable doubt on the credibility of contemporary science as a public institution. This is really neither new nor surprising. If you believe that science is immune to political influence, Google the name Trofim Lysenko. In fact, scientists are human beings, subject to the influence of their particular cultures, the social acceptance of their peers, and the power of the educational, commercial or political institutions that support them. When their inquiries have obvious social consequences, empirical evidence ceases to be the sole requisite of what they promote as scientific truth.
The picture in the social sciences is even worse. A recent and glaring example is that of Diederik Stapel: http://www.nytimes.com/2013/04/28/magazine/diederik-stapels-audacious-academic-fraud.html?pagewanted=all&_r=0 In brief, Stapel, a renowned Dutch sociologist, was ultimately forced to admit that he invented the data he used in over 50 published studies in sociology. Again, this does not prove that all, or even most, social scientists are that unscrupulous. What it does prove is that the peer review process, whose purpose is to ensure that scientists are actually practicing science, failed to uncover Stapel’s outright fraud at least 55 times. The social sciences, already questionable as “sciences” due to their general lack of predictive success, are doubly burdened by such failures of their own procedural integrity. Sociology, psychology, and related disciplines all struggle with a common problem. A case study of an individual, or of a particular group, does not yield results that can be generalized to all people at all times. A study in the social sciences is, at best, a snapshot of a certain unique set of circumstances. Studies in sociology and psychology are not repeatable in the way experiments in chemistry are. This makes the social sciences all the more prone to the influences of culture, personal prestige, and politics. Stapel was believed because he was, himself, a recognized authority. He was, to some considerable extent, a recognized authority because he understood what his peers and his society wanted to believe.
In 2012, the summarized results of a series of studies from the University of California at Berkeley appeared in the popular press: http://www.theatlantic.com/health/archive/2012/03/are-rich-people-more-ethical/254689/ These studies purported to show a variety of ways in which the rich were less ethical than the rest of us – from a study of traffic behavior to a study of, believe it or not, literally stealing candy from children. These studies were presented as science and conducted, or at least directed, by academics of high standing. Even without examining the methodologies for systematic bias, it is immediately interesting that the studies were specifically targeted to compare the wealthy with an artificially homogeneous class of everybody else. At least in the popular summaries, I have found no reference to any other social divisions. Considering the magnitude of the research, wouldn’t it have been more scientifically enlightening to rank all recognizable classes and groups – instead of just the rich vs. everybody else? But let’s be serious. How likely is it that researchers at UC Berkeley would ever release a set of findings showing that welfare recipients, blacks, or Latinos were less ethical on average than everybody else? I’m not asserting here that welfare recipients, blacks, or Latinos are innately less ethical, but simply that researchers in a liberal university know better than to ask the question. The studies that were actually conducted were tailored to produce findings that are consistent with the dominant ideology at UC Berkeley. In that context, the findings were entirely unsurprising. I am not suggesting, either, that the researchers would have falsified the data if they had gotten results that were surprising (i.e., contrary to their desired outcomes), but I am confident that in such a case the studies would not have been so widely published. Practically speaking, there are some assertions you are allowed to make and others you are not.
In our culture, as in all cultures, certain inquiries are taboo.
An even more culture-bound political exercise masquerading as science is to be found in Right Wing Authoritarian (RWA) research. I wrote a critique of RWA in an earlier post, and append it here in the extended footnote below.4
Science in the sense that I outlined at the beginning – the formalized process of determining the truth of assertions solely by empirical means – is probably the most useful tool the human species has. By testing our beliefs about reality in a broadly scientific way, we can struggle forward with some confidence. With only faith, tradition, and consensus to guide us we hardly struggle forward at all. Science in the sense of the cultural institution embodied in a handful of professionals, however, has as much to do with authority as it does with truth. While it is likely that those with specialized educations will be especially fluent in their chosen disciplines, it does not follow that they will always be right, or that those with less formal training should surrender their skepticism, especially in the arena of the social sciences. Assertions are not true or false in accordance with the status of their originators, but in accordance with their correspondence to material reality.
It should always be remembered that the very core mechanisms of scientific inquiry, far from being the privileged sphere of the academic elites, are nearly universal. All animals with senses use them to uncover truth. If my cat is not exactly doing science when he examines his food bowl, he is at least doing things that science depends on. He makes a hypothesis, albeit an unstated one, that his bowl might contain food. He conducts a kind of experiment by walking to the bowl. He makes an observation. He adjusts his view of reality based on that observation. These rudiments of science are what sense organs and nervous systems evolved to do. It is the social, mystical, and authoritarian roads to belief which are peculiar to us. While most animals, including humans, suffer from certain instinctive reactions which drive us to occasionally behave against our better interests, only the most socially complex animals can summon up real delusions. The capacity to plan is the same capacity that allows us to imagine the world in ways that it is not. The capacity to communicate complex ideas is the same capacity that allows us to extend our imaginary world to others. Thus, the very mechanisms that make science possible put it continually at risk.
----------------------------------------------------------------
1 If there are any true relativists out there who believe that everybody’s truth is just as good as everybody else’s, they have to believe in my epistemic absolutism too, so they do not qualify as an exception.
2 Here I admittedly generalize right over the messy business of philosophy and all of its interesting nuances. Is a scientific theory true because it works – in other words, because it yields correct predictions? Are scientific theories expressions of reality, or merely symbolic analogies of what are ultimately imperceptible states of affairs? Interesting questions, but beyond my current scope.
3 I am tempted to say that tradition- and authority-based forms of “truth” are not truth at all, but this would be begging the question. It is sufficient for my purposes to roughly define what I mean by scientific truth, and leave other definitions of truth alone for now.
4 RWA research ( http://en.wikipedia.org/wiki/Right-wing_authoritarianism ) suffers from the same problem IQ tests do – the test itself becomes the definition of the property you are testing for. This is a hazard with almost all standardized assessments of this nature. You test against the biases of the people who compose the test. If the people who compose the test have an agenda you get a very bad test indeed. Consider what the article cites as the first item on the new RWA scale:
"Our country desperately needs a mighty leader who will do what has to be done to destroy the radical new ways and sinfulness that are ruining us."
The article explains: "People who strongly agree with this are showing a tendency toward authoritarian submission (Our country desperately needs a mighty leader), authoritarian aggression (who will do what has to be done to destroy), and conventionalism (the radical new ways and sinfulness that are ruining us)." Well, that sounds like very frightening stuff. Now, let’s alter the language only slightly, while trying to maintain the same essential content:
"Our country desperately needs a forceful leader who will do what has to be done to stamp out the new extremist policies and runaway corruption that are ruining us."
This still sounds like... authoritarian submission (Our country desperately needs a forceful leader), authoritarian aggression (who will do what has to be done to stamp out), and conventionalism (the new extremist policies and runaway corruption that are ruining us). Of course, this sentence would have dovetailed neatly into any Democratic candidate's nomination speech during the 2008 US election cycle. Well, amusing as it might be, we can't all be Right-wing authoritarians.
What the proponents of RWA have done is to assemble a compact set of stereotypically conservative traits that most liberals find especially abhorrent, then constructed a quite precise linguistic trap that would snare conservatives – and only conservatives – into identifying with that definition.
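The circularity is easy to see if you write the scoring procedure down. Operationally, a scale of this kind is nothing more than a sum of agreement ratings over whatever items its authors chose. The sketch below is purely illustrative – the items, ratings, and function are my own placeholders, not the actual RWA instrument – but it shows how the item list itself becomes the definition of the trait:

```python
def scale_score(responses, items):
    """Score a Likert-style scale as the sum of 1-7 agreement ratings.

    responses: dict mapping item text -> rating (1 = strongly
    disagree, 7 = strongly agree). The construct being "measured"
    is defined entirely by which items appear in `items` -- change
    the item list and you have changed the definition.
    """
    return sum(responses[item] for item in items)

# Two hypothetical one-item "scales" with the same grammatical
# structure but different political targets:
right_coded = ["Our country needs a mighty leader to destroy sinfulness."]
left_coded = ["Our country needs a forceful leader to stamp out corruption."]

# One hypothetical respondent who rejects the first wording but
# endorses the second:
respondent = {right_coded[0]: 2, left_coded[0]: 6}

# The same person scores "low authoritarian" on one wording and
# "high authoritarian" on the other.
print(scale_score(respondent, right_coded))  # 2
print(scale_score(respondent, left_coded))   # 6
```

Nothing in the arithmetic distinguishes a measurement from a word trap; the entire content of the "construct" lives in the wording of the items.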
A common hallmark of good science (though I admit not one that occurs in absolutely all cases) is that it produces some surprising results. The RWA assessment appears to be so carefully crafted that the results are about as surprising as discovering that optometrists write more glasses prescriptions than other people.
The RWA article continues:
“In a study by Altemeyer, 68 authoritarians played a three hour simulation of the Earth's future entitled the Global change game ( http://en.wikipedia.org/wiki/Global_change_game ). Unlike a comparison game played by individuals with low RWA scores, which resulted in world peace and widespread international cooperation, the simulation by authoritarians became highly militarized and eventually entered the stage of nuclear war. By the end of the high RWA game, the entire population of the earth was declared dead.”
Again, if you look at the Global change game objectively you will have to admit the findings are rather problematic. As a socio-economic-military simulation of the world, the game is both crude and overly subjective. The game world is quantified along resource and population lines based on real numbers, but little if any attempt is made to model cultural or historic relationships between nations. Military and economic models are oversimplified for the sake of playability. Assessments of the effects of players’ decisions are often not handled algorithmically (by some neutral mathematical rule) but by the rulings of “facilitators” with their own personal biases. I have no doubt the game is an enjoyable exercise, but it proves little. A global simulation designed and refereed by conservative economists might be equally enjoyable, would probably yield very different results, and would be every bit as useless.
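To see why facilitator adjudication matters so much, consider a deliberately toy model – entirely my own construction, with no relation to the actual Global change game rules – in which each dispute is resolved partly by a fixed rule and partly by a facilitator's bias parameter:

```python
import random

def run_world(rounds, escalation_bias, seed=0):
    """Toy conflict simulation.

    Each round, a dispute either escalates (tension rises) or is
    defused (tension falls). escalation_bias is the facilitator's
    thumb on the scale: a probability added to the fixed base
    chance that a dispute is ruled an escalation. Returns the
    final tension level.
    """
    rng = random.Random(seed)  # fixed seed so runs are comparable
    tension = 0
    for _ in range(rounds):
        base = 0.3  # the fixed, "algorithmic" part of the rule
        if rng.random() < base + escalation_bias:
            tension += 1
        else:
            tension = max(0, tension - 1)
    return tension

# Identical players, identical written rules -- only the
# facilitator's bias differs, yet one world ends calm and the
# other in permanent crisis.
print(run_world(100, escalation_bias=-0.2))  # stays low
print(run_world(100, escalation_bias=+0.4))  # climbs steadily
```

When the adjudication parameter, rather than the players, determines the trajectory, the outcome tells you about the referee, not about the participants.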
A classic study of authority like the Milgram experiment ( http://en.wikipedia.org/wiki/Milgram_experiment ) had real validity because it attempted to hide the game from the experimental subjects. They thought they were doing something real. The Global change game is, straightforwardly, a game – not reality. Further, while the Milgram experiment put people into an unusual situation, it was one that was at least plausible for them to be in. The tiny population of world leaders the Altemeyer game attempts to have players represent is, in the real world, not drawn from some sampling of people from a common culture, screened only in accordance with how they performed on a psychologist’s test. On average, real leaders in the real world are a more cautious and deliberative breed. They have something real to lose. The “global change game” that actually played out over the forty-four years of the Cold War failed to produce a nuclear exchange, even though there were often authoritarians on both sides and always authoritarians on at least one side. Any candidate for a valid simulation of the future ought to also be a credible simulation of the past. While Altemeyer’s game is dramatic and interesting, I don’t see the RAND Corporation seizing on it anytime soon as a means of predicting the future behavior of actual nations.
Posted by
E.M. Cadwaladr
June 10, 2013
What could the government possibly do with phone call metadata?
Here’s a quick scenario based on my own experience. I’ve attended my local Tea Party group’s meetings many times. I know how they are organized. They have a web site with a fair amount of contact information. The organizers are proud of what they do, and make no attempt to hide their identities. Meeting attendees often provide their phone numbers so they can stay informed about future meetings, upcoming speakers, and so forth. The organization’s volunteer secretary feeds these numbers into a robocaller for that purpose. The robocaller makes a series of calls in steady succession, which has a certain pattern no doubt easily identifiable by NSA software. An analyst can easily look up the originator of the calls, a Tea Party group secretary, and reasonably infer that she isn’t robocalling birthday greetings to her grandchildren. Thus, without a warrant, a list of people connected with that Tea Party organization can be arrived at. When people have to worry that participating in a political organization that supports the US Constitution might result in harassment from the IRS or some other government entity, they may decide to stay at home and keep their mouths shut. They may take a step, in fear, away from freedom. That is why a little metadata matters.
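The inference described above is trivial to mechanize. The following is an illustrative sketch only – the record format, thresholds, and function name are my own inventions, not anything any agency is known to use – showing how a robocaller's steady-succession pattern could be flagged in raw call metadata:

```python
from collections import defaultdict

def find_robocall_bursts(records, min_calls=20, max_gap_s=30):
    """Flag originators whose calls form long, steady bursts.

    records: iterable of (caller, callee, start_time) tuples,
    with start_time in epoch seconds. Returns {caller: [callees]}
    for each caller who placed at least min_calls consecutive
    calls spaced no more than max_gap_s apart -- the signature of
    an automated dialer working down a contact list.
    """
    by_caller = defaultdict(list)
    for caller, callee, t in records:
        by_caller[caller].append((t, callee))

    flagged = {}
    for caller, calls in by_caller.items():
        calls.sort()  # order each caller's calls by time
        run = [calls[0]]
        for prev, cur in zip(calls, calls[1:]):
            if cur[0] - prev[0] <= max_gap_s:
                run.append(cur)  # burst continues
            else:
                run = [cur]      # gap too long; burst resets
            if len(run) >= min_calls:
                # Burst found: everyone dialed in it is now a
                # member of the originator's contact list.
                flagged[caller] = [callee for _, callee in run]
                break
    return flagged
```

Run against a day of metadata, the flagged callees for the secretary's number are, in effect, a warrantless membership roster – which is the point of the scenario above.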
Posted by
E.M. Cadwaladr