
Ebola, Ferguson and political narratives

By Lorenzo

The Ebola virus reaching the US and the ongoing troubles and controversy over a police shooting in Ferguson, Missouri display the power and the dangers of political narratives from all sides, both within US politics and more broadly.

Thus, one of the more tired and embarrassing responses to Ebola mis-steps in the US has been to decry “budget cuts” at the National Institutes of Health (NIH) and related agencies, thereby fulfilling two perennial progressivist tropes–there is never enough money and more money makes it better.

Evading responsibility
Embarrassing because:

  • dealing with viral outbreaks is rather their core business [particularly the CDC, but the US Health and Human Services Department generally], and having an appropriate action plan ready to go should not be very expensive [even if implementing it may be]; and
  • the NIH spends a considerable amount of money, an amount which has gone up dramatically over the last decade and a half.

In 2000, the NIH had a total budget of $17.8bn, which rose rapidly to $28.6bn in 2005 and has hovered around $29-$30bn ever since. Quite a lot of money and not subject to any serious cuts. (It is a bigger budget than the Australian Defence Force.) This did not stop the current head of the NIH blaming the failure to come up with an Ebola vaccine on “a decade of stagnant spending”. Yes, that is a bureaucrat evading responsibility, but the Huffington Post headline blames “budget cuts”; and the “budget cuts!” and “more money!” memes are very useful for evading responsibility.

The Centers for Disease Control and Prevention (CDC) has had a budget of around $6.5bn in recent years, also after considerable increases under the Bush II Administration.

Then there is the US Department of Health and Human Services (HHS), of which the NIH and CDC are part. Its total budget, including its subordinate agencies, was $886bn in 2013, $958bn in 2014 and $1trn in 2015. That is a significantly bigger budget than the US Defence Department’s and more than twice the expenditure of the entire Federal Government of the Commonwealth of Australia.

Again, how much does it really cost to have effective plans for dealing with a possible viral outbreak–one which had been raging in West Africa for months? [Including health guidelines one might adopt from people with experience.] The HHS has, for example, enormously more resources than, say, Nigeria or Senegal, both of which successfully dealt with much worse outbreaks and provide learning experiences that a competent bureaucracy might notice. (Though Peter Turchin raises a rather nastier possible explanation for the somewhat lacklustre response.)

If a trillion dollar budget does not generate satisfactory competence in a basic area of responsibility, no amount of money is going to. Indeed, at that sort of scale, more money, and the extra responsibility that goes with it, is almost certainly going to generate less basic competence, not more. It does rather look like something of a failure of the administrative state (though the degree of “failure” is being rather overdone).

Apart from being easy tropes fulfilling a preferred political narrative, “budget cuts!” and “more money!” also do something such narratives are often about–they divert attention from awkward facts likely to cause cognitive dissonance. The cognitive dissonance here being that being a (very well-funded) government bureaucracy provides absolutely no guarantee of effectiveness, or even basic competence. The omni-competent state that progressivist politics implicitly or explicitly postulates will solve problems–if just given the correct goals and the funding-which-is-never-enough–does not really exist.

Worse, as we have seen with the head of the NIH, the memes in question actively work to evade responsibility–and that is precisely the point. Holding government agencies and spending programs genuinely accountable for their competence and effectiveness not only makes “the government will fix it” much more complicated, it can actively and seriously undermine that central presumption.

This is not merely a “political narratives” issue. It goes right to the heart of holding governments and their agencies accountable. Political narratives matter, and in a very direct sense.

President Obama’s response to the agencies on the ground letting him down–in the “embarrassing the President in the news cycle” sense–was to appoint an Ebola “czar”. Both he and his predecessor have been very inclined to such appointments, far more so than previous postwar presidents. That is partly because both President Obama and President Bush II are mediocre administrators, by US Presidential standards. It is also likely to be partly a response to the 24-hour news cycle–President Clinton was much more inclined than his postwar predecessors to appoint such folk, though not nearly as inclined as his two successors. It may also be partly a response to the growth of the US Federal Government–the more it does, the harder it is to coordinate. But I would rate administrative competence as the main driver: Bush II and Obama are simply not very good at it (witness Obama’s appalling failure to appoint people to vacancies on the US Federal Reserve Board), and political officers are what you turn to when you can’t make the ordinary bureaucracy do what you want.

[This piece on problems in the administration of National Security by the Administration is less than reassuring, further indicating a lack of administrative competence. Jeb Bush–who, as a former Governor of Florida, has a lot of experience in crisis-management–has criticised the Administration’s simple message management, contrasting it with his own efforts in somewhat similar circumstances: also not an expensive matter.]

Three languages of politics
Which brings us to Ferguson, Missouri and the police-and-blacks issue that the killing of Michael Brown by police officer Darren Wilson and subsequent riots brought (yet again) to the fore. The controversy over what did and did not happen (the killing itself remains distinctly murky) provides an excellent example of Arnold Kling’s The Three Languages of Politics (which he discusses here; I recommend listening): the progressive oppressor/oppressed axis, the conservative civilisation/barbarism axis and the libertarian freedom/coercion axis.

Reading progressivist and conservative online commentary on matters Ferguson is to enter two different world views that barely interact. Among conservatives, it was about “race baiting”, appropriate behaviour when stopped by a police officer and (lack of) civic engagement–in other words, how progressivists make things worse and the civilisation v. barbarism axis. Among progressivists, it was yet another unarmed black man being killed in a police-initiated or massively over-reacting incident, police incitement and abuse of authority, and narrow and unbalanced reporting of a mainly black community–in other words, a civil rights matter, one of oppressor and oppressed.

Then there was the libertarian commentary, which particularly focused on the militarisation of US police forces–notably in Republican Sen. Rand Paul’s opinion piece in Time. Libertarians have been warning about the militarisation of US police for some time, as in this 2006 article by Glenn Reynolds in Popular Mechanics–a concern that has spread to conservatives, as in this 2013 Heritage Foundation analysis. Sen. Paul managed nods to both the civilisation/barbarism narrative:

The outrage in Ferguson is understandable—though there is never an excuse for rioting or looting. There is a legitimate role for the police to keep the peace, but there should be a difference between a police response and a military response.

And to the oppressor/oppressed narrative:

Given the racial disparities in our criminal justice system, it is impossible for African-Americans not to feel like their government is particularly targeting them. …

Anyone who thinks that race does not still, even if inadvertently, skew the application of criminal justice in this country is just not paying close enough attention. Our prisons are full of black and brown men and women who are serving inappropriately long and harsh sentences for non-violent mistakes in their youth.

While focusing on critiquing the militarisation of US police forces (freedom/coercion):

When you couple this militarization of law enforcement with an erosion of civil liberties and due process that allows the police to become judge and jury—national security letters, no-knock searches, broad general warrants, pre-conviction forfeiture—we begin to have a very serious problem on our hands.

Militarisation combines power and separation: it separates the police from the local citizenry while elevating their sense of power over them, not a happy combination. At which point, (conservative and libertarian) opposition to gun control is surely a factor. A recurring claim in favour of widespread gun ownership is that “an armed society is a polite society”. Historically, that is not true; it more often breeds a violent, honour-obsessed society. What an armed society does apparently breed is not polite police folk, but paranoid ones. And, with the militarisation of US police forces, courtesy of the US Federal Government, ludicrously over-armed paranoid ones; also not a happy combination.

But not randomly paranoid ones. Young men are the most likely corpses from fatal police-initiated/disproportionate reaction shootings, particularly young black men.

To the extent that these shootings become matters of public debate, they tend to disappear into the talking-past-each-other, self-supporting political narratives seen regarding events in Ferguson, Missouri. But freedom is, ultimately, indivisible. A long history of US police forces being able to evade responsibility for how they treat black folk (and other low-status groups, but particularly black folk) turns out not to be something that can be quarantined away from, well, everyone else. As a man whose son was shot by a police officer 10 years ago wrote recently:

Our country is simply not paying enough attention to the terrible lack of accountability of police departments and the way it affects all of us—regardless of race or ethnicity. Because if a blond-haired, blue-eyed boy — that was my son, Michael — can be shot in the head under a street light with his hands cuffed behind his back, in front of five eyewitnesses (including his mother and sister), and his father was a retired Air Force lieutenant colonel who flew in three wars for his country — that’s me — and I still couldn’t get anything done about it, then Joe the plumber and Javier the roofer aren’t going to be able to do anything about it either.

Michael Z. Williamson (website here) is a military SF writer of libertarian views with a strong interest in military history. (His novel Freehold, for example, is basically the Winter War in space.) His recent collection of short stories and other writings, Tour of Duty, contains two pieces which detail his experiences with the IPD (Indianapolis Police Department). His view of the police:

Lesson here: they’re hired goons, not at all concerned with law and order (p.446).

They are mercenary thugs, hired by my tax dollars to oppress me in the name of corporate America. Not even whores, as whores are paid for their work (p.447).

His view of correction officers after being arrested and held overnight:

I have learned that you are petty, gutless Fascists who are so pitiful as to find solace in your own wretched lives in bullying people with problems, helpless to resist you, until they turn into caged animals for your amusement (p.462).

Remember, he is a white US military veteran. (If a conservative is a liberal who has been mugged by reality, perhaps a libertarian is a conservative who has had one too many dealings with government officials.)

[Frank Serpico–yes, from the movie–writes about the continuing problems of (lack of) police accountability in the US.]

Techniques of evasion of accountability can spread–both from bureaucracy to bureaucracy and from low-status groups to, well, anyone and everyone. Which, as this article in the conservative journal National Review sets out, leads to a pattern of inadequately accountable government agencies:

It’s perverse: If an ordinary citizen makes a typo on his 1040EZ, he could be on the hook for untold sums of money, fines, even jail time. When the IRS abuses its power to harass political enemies, nothing happens. A few years ago, an employer of mine entered the wrong Social Security number on my paperwork — I have barbaric handwriting — and the error took months of telephone calls and mail to fix, a period of time over which I was threatened with all sorts of nasty consequences by the Social Security Administration and the IRS. But when the Social Security Administration oversees the payment of millions of dollars in benefits to Nazi war criminals summering on Croatian beaches, nothing happens. If you’re an ordinary schmo, a typo can land you in jail. If you work for the government, you can burn the face off a baby and walk.

The clear and present danger
Discussions of the uncivil tribalism of contemporary US politics and the power of political narratives tend to talk about it as unfortunate, regrettable, be nice if we could do better. But the problem is much deeper than that. The way the tribal narratives are actually operating is to frustrate political accountability and breed dangerously unaccountable government agencies.

If one is trying to deal with the world as it is, rather than as you would like to think of it, then the question becomes: is that an appropriate axis through which to view this problem? Or merely one you find congenial? For example, phenomena such as the Islamic State and al-Qaeda are really not usefully viewed through the oppressor/oppressed axis (unless, perhaps, you realise that their aim is to be oppressors). The civilisation-v-barbarism and even liberty/coercion axes are much more appropriate. (Which is why progressivists tend to end up saying such inane, or worse, things on the issue.) Though that is a relative, rather than absolute, judgement, since blanket condemnations of Islam are not useful either. Conversely, equal rights for queer folk really is not raging barbarism, not a threat to civilised order.

But the virulent political tribalism and war-of-the-narratives of contemporary US politics are having much more invidious effects in fostering a whole lot of distracting delusions about issues that seriously matter:

  • Government agencies are not automatically reliable toys which can be waved to generate social justice.
  • If you are going to be so keen on an armed society, you better think a lot more seriously about the position that puts police officers in.
  • Yes, there is a problem with police treatment of specific groups (particularly young black men) and, no, that is not about keeping you safe; obsessing over not conceding an inch to the concerns of those other folk is just creating homicidally unaccountable police departments and, by copying and contagion, undermining accountability in government agencies generally.
  • No, abusing due process to target those “evil others” really is a big deal.
  • Yes, there does come a point when privileging public sector unions undermines basic effectiveness and accountability.

And so on.

Creating cultures and processes of accountability in government agencies is hard, grinding work. Not least because it means giving up such congenial notions along the way. But if the shouting political tribes of the US do not look up from their status games and start noticing what their cognitive civil war is doing in corrupting basic processes of government and government administration, then the culture of inadequate accountability among US government agencies is just going to get worse and worse. Which can lead to places where few, if any, of the shouting political tribalists want to go.

ADDENDA After I posted this, I came across this comment by SF writer John Scalzi:

Broadly speaking, the Republicans are frothing ideologues, the Democrats are incompetent …

Sounds about right.

A political scientist notes the lack of interest in cooperation embodied in the competing narratives.

Public sector pensions are driving US city and state governments towards bankruptcy.

[Cross-posted from Thinking Out Aloud.]

Orientation and action

By Lorenzo

The case of Gordon College (via) in Massachusetts, which propounds a traditional Christian view of homosexuality with a rather less traditional coda of sympathy, puts into sharp relief the “orientation is not sinful, acts are” position.

The policy of Gordon College is:

The orientation/action distinction has two major problems with it. First, it sets up an utterly unreasonable standard. Homosexuals are not permitted to act upon their erotic desires or to seek intimate companionship. To see how unreasonable this is, consider telling heterosexual people: you cannot have sex with anyone of the opposite sex, but marrying someone of the same sex is just fine.

Clearly, this is a standard that people are (mostly) not going to achieve. When the (predictable) high level of failure to achieve it then occurs, homosexuals are held blameworthy for failing to keep to an utterly unreasonable standard.

This is, of course, very much in the interest of priests and clerics–that a vulnerable minority have this completely unreasonable standard, one they are mostly bound to fail, imposed upon them. (Remembering that queer folk grow up as isolated individuals in overwhelmingly straight families and social milieus.) When you are in the gatekeepers of righteousness business, differentiation, complexity and effortless virtue are very much part of the game. This imposing of an unreasonable standard on a vulnerable minority sells effortless virtue to the overwhelmingly heterosexual majority (imposing a standard that is little or no effort for them, but which they can feel terribly virtuous for keeping and terribly morally superior to those who do not), distinguishes between the “righteous” and the “unrighteous” and establishes a criterion of righteousness that has to be (at least originally) told to folk by said gatekeepers. (Most human societies, at least pre-monotheism, did not find such matters to be of much moral moment. Nowadays, it tends to be a differentiator between the West and much of the Rest, many of whom–outside Islam–were taught that it was of moral moment by European colonial masters. What this piece does not get is that queer folk being a relatively small minority is, and has always been, the point–much like with the Jews, really.)

Devaluing people

Second, the action/orientation distinction wildly devalues the moral fact that there are people–millions and millions of people–with such an orientation. In theist terms it amounts to “God made a mistake, again and again and again; millions upon millions of times, and God keeps making it”. Claims about homosexuals having a “special calling” are nonsense on stilts, as we can see from the (finally now failing) endless efforts to deny homosexuals who act upon their erotic nature access to social goods. (That queer folk grow up as isolated individuals in overwhelmingly straight families and social milieus also means queer folk disproportionately benefit from urbanisation and improved information technology, hence the increased contemporary saliency of queer rights.)

More generally, the action/orientation distinction holds that people (and the moral implications to be drawn from them) are not to be defined by how all people are, only by how some people are. I have been reading Pierre Manent’s The Metamorphoses of the City: On the Western Dynamic, a book I find alternately frustrating and enlightening. Manent spends considerable time on Augustine’s masterwork De Civitate Dei (The City of God), providing a revealing summary of Augustine’s views on nature and will:

We have here a fundamental Christian thesis that Augustine more than anyone else contributed to formulate and sharpen: man’s nature is good; his will is bad or inclined to evil … The very definition of a bad will is that it is the perversion of a nature that is good or capable of good. Augustine explains at some length how the human will, naturally attracted by the good, can nonetheless choose evil. The bad will does not have its cause in good nature; it is some way without cause.

Augustine was not an Aristotelian as such, his philosophical roots were in Neoplatonism, but that in itself is very much a philosophy after Aristotle. Moreover, Augustine looked widely for ideas and was a child of Aristotle in the sense that almost all Westerners are (and Muslims are generally not), accepting that there was a moral realm beyond revelation and that the world has an independent existence beyond the habits of God–so Augustine argued that, being the direct creation of God, the created world had greater authority (if they were in contradiction) than Scripture, which was the word of God mediated by fallible humans. (The Quran, by contrast, is the eternal, direct word of God unmediated by anything.)

We can see here the issue with same-sex attraction in this worldview. On the one hand, it is simply perverted will against nature. On the other hand, it is an orientation–people are strongly inclined to such “perversion”; millions of people. Hence formulations such as being “intrinsically disordered“. In the words of the then Cardinal Ratzinger:

At the same time the Congregation took note of the distinction commonly drawn between the homosexual condition or tendency and individual homosexual actions. These were described as deprived of their essential and indispensable finality, as being “intrinsically disordered”, and able in no case to be approved of (cf. n. 8, §4).

As this author reminds us, Aquinas–the supreme reconciler of Catholicism and Aristotelianism–held that all sexual desire outside marriage was “intrinsically disordered”. But there is a difference between heterosexual eros–which can find an approved outlet in marriage–and homosexual eros, which never has any approved expression ever, but must always be denied and sublimated. The latter is “intrinsically disordered” at a much more basic level.

Presumptions selectively natural

We can also see where the muddle comes from, at least in natural law terms. The typical natural law theorist is a heterosexual male: his sexual desires are directed towards women, and casual empiricism shows that male and female animals mate to produce offspring. So, easy conclusion–his desires define human nature and mating defines the purpose of sex.

Where, in economics, the representative agent can only be properly modelled if they do not know they are the representative agent, the typical natural law theorist is much more arrogant. He, and folk like him, define human nature and what he has noticed about the natural world defines the nature of sex.

But some men have sex with other men and some women have sex with other women. Well, they are unnatural, they are acting against nature. Then folk even notice that some animals do the same (in the medieval period, hares, hyenas and partridges had that reputation). Well, they are being unnatural too, they are also acting against nature.

And so does the conclusion set the ambit of its premises. People who do not conform to the decreed nature do not count (as evidence toward human nature), observations of nature that do not conform to the decreed purpose of sex also do not count (as evidence about the purpose, function or role of sex).

The entire argument about queer emancipation is, at bottom, literally about whether they count as “real people” or not. Hence conservative monotheists define them out of such, and are outraged at any attempt to include them in. It is literally about defining the human and about whether everyone with a human face is “properly” human.

I (mostly) agree with Andrew Sullivan’s plea for genuine liberalism (and Scott Alexander has a helpful post about political tribalism and tolerance which is apposite), especially as Gordon College has a general ban on sexual activity amongst its students. Even more so given the rather repellant “secular commissars” trend identified by Damon Linker:

Contemporary liberals increasingly think and talk like a class of self-satisfied commissars enforcing a comprehensive, uniformly secular vision of the human good. The idea that someone, somewhere might devote her life to an alternative vision of the good — one that clashes in some respects with liberalism’s moral creed — is increasingly intolerable.

As someone has said in a related context, no one expects the Secular Inquisition.

And yet the idea which Gordon College has, in a free society, a right to act upon has a deeply disturbing core. Even with the Christian missionaries in Africa Linker discusses, Christian evangelising has also had repellant consequences, notably in the recent attempts to make homosexuality a capital crime in Uganda. To be fair, it is not heroic doctors but rather spin-offs from tele-evangelising (to which effortless virtue is such an attractive sell) that are responsible, but the latter are partly levering off the former.

The position that Gordon College takes has roots deep in Christian tradition and they are, if anything, being much more liberal than that tradition generally was. But the orientation/action distinction used to make that tradition more palatable remains deeply problematic in ways which very much touch on basic moral protections and participation in society.


[Cross-posted from Thinking Out Aloud.]

Quantity, physicality, source — the origins of currency names

By Lorenzo

The terms we use for units of currency–when they are not named after historical figures, terms for money or items once used as money–often come from one of three origins: quantity (number or, more commonly, weight); physicality (shape or content); or source. That pound (as in pound sterling, the oldest currency still in use) is originally a weight term is obvious–as it still is a weight term (at least, for those still using the old British weights and measures system). But can you tell me which of the three–quantity, physicality or source–the term dollar comes from? (Answer at the end of the post.)

(And if anyone could point me to the derivation of kip, the currency of Laos, that would be appreciated.)

Named after money or items used as money
Sometimes, it is hard to disentangle whether the unit derives from quantity, physicality or source. The dong, the currency of Vietnam, derives from the term for money, referring to Chinese bronze coins, with the Chinese terms it is derived from also referring to weight. So, is dong quantity, shape or source derived? The taka, the currency of Bangladesh, also just means coin. As does the manat, the currency of Azerbaijan and of Turkmenistan. The Gambia dalasi probably derives from a local name for a 5-franc coin. The Peru sol comes from solidus (solid), a Roman coin, but also means sun in Spanish.

The dobra of Sao Tome and Principe comes from to fold; the connection to money is via doubloon, or in Portuguese dobrao.

Currencies named after animals or shells typically have an association with money or trade. The lev, the currency of Bulgaria, comes from lion, as does the leu, the currency of Romania and Moldova–as in the Dutch lion dollar or leeuwendaalder. The Croatia kuna means marten, whose pelts were used as trade items in medieval times. The Ghana cedi derives from a local name for the cowrie shell, the most common money-item across time and space. The Guatemala quetzal is named after the national bird, whose feathers were used as currency in Mayan times. The Papua New Guinea kina is named after a shell used in trade.

The Georgia lari derives from a word meaning hoard or property.

Historical figures
The lek of Albania is named after Alexander the Great, whose name is often shortened to Leka in Albanian. The Costa Rica colón is named after Christopher Columbus (Cristóbal Colón in Spanish). The Honduran lempira is named after Lempira, a folk hero who led native resistance against the Spanish. The Nicaragua cordoba is named after the country’s notional founder, Francisco Hernández de Córdoba. The Panama balboa is named after the Spanish explorer Vasco Núñez de Balboa. The Tajikistan somoni is named after Isma’il ibn Ahmad (also known as Ismoil Somoni), regarded as founder of the Tajik nation. The Venezuelan bolivar is named after Simón Bolívar, the Venezuelan general central to the successful Spanish American wars of independence.

Plants, peoples, places
The gourde of Haiti means gourd, as in the plant. The Tonga pa’anga is named after a vine.

Paraguay‘s guarani comes from an indigenous people whose language is taught in Paraguay. The loti of Lesotho derives from mountains. The kwacha, the currencies of Malawi and Zambia, means dawn.

The nakfa of Eritrea is named after the town that was at the centre of their independence struggle. The kwanza of Angola is named after a river. The pula of Botswana (also a dry country) means rain.

Quantity

The oldest currency terms are almost all weight terms, such as shekel and talent. Shekels (sheqel) are still the currency unit of Israel. Some weight terms used in exchange never got beyond being a weight term–notably the Egyptian deben. Sometimes, the currency term just means weight–such as stater and peso. The currencies of Argentina, Chile, Colombia, Cuba, the Dominican Republic, Mexico, and Uruguay are all pesos. The Philippines also uses the peso, or piso. The Macau pataca comes from the Portuguese for peso.

Athenian “owl”, after 499BC

A (partial) exception on quantity and antiquity is the drachma, which comes from the verb to grasp–implying easy to handle. It is only a partial exception, as a drachma was also a small weight unit. The Athenian “owl” tetradrachm (so called because it had the owl of Athena on it) was perhaps the earliest trade currency coin. Drachma is the source (via Latin) for dirham, currently the currency of Morocco and the United Arab Emirates, and of dram, the currency of Armenia.

Using weight terms for money has been a continuing historical tendency–such as mark (still used in the currency of Bosnia-Hercegovina), the metical of Mozambique and the baht, the currency of Thailand. While the etymology of the Russian ruble is somewhat unclear, in the medieval period a ruble was a weight, the Russian equivalent to the mark. Belarus also has a ruble as its currency. The ouguiya of Mauritania derives from ounce in Arabic.

The tenge of Kazakhstan originally came from (weighing) scales. Apart from pound sterling (and its local derivatives in British territories), there is also the Egyptian pound, the Sudanese pound, the South Sudanese pound (and the former Irish punt).

Livre and lira are both derived from libra, a Roman unit of weight (also the source of the pound sign). The lira is still the currency of Turkey and is the local name for the Lebanese pound and the Syrian pound and colloquially for the Jordanian dinar.

Abbasid dinar 811

The most significant currency term derived from a number rather than a weight was denarius (derived from containing ten), the source for the dinar and denaro, the Italian word for money. The currencies of Algeria, Bahrain, Iraq, Jordan, Kuwait, Libya, Serbia and Tunisia are all dinars, while Macedonia uses the (same derivation) denar. The former Iranian currency unit, the toman, also derives from a number.

The shilling, the currency of Kenya, Somalia, Somaliland, Tanzania and Uganda, derives from an old Anglo-Saxon accounting term.

Shape

Shape terms generally come from the use of coins. Yen, yuan and won (the currency units of Japan, China and Korea respectively) all mean round or round object. The togrog of Mongolia originally meant circle or circular object.

Rupee derives from the Sanskrit rūpá, meaning beautiful form. The currencies of India, Pakistan, Sri Lanka, Nepal, Mauritius and Seychelles are all rupees, while the Maldives uses the rufiyaa and Indonesia the rupiah (same derivations).

Ringgit (the currency of Malaysia, although it may also be used for the Brunei and Singapore dollars) means jagged, and refers to the jagged edges of the Spanish dollar (aka real de a ocho, aka peso de ocho, aka pieces of eight).

Kyat (the currency of Burma) comes from pulled together and apparently refers to the peacock seal of the original issuing King of Burma on the coin. The escudo, the currency of Cape Verde, comes from shield, referring to the heraldic shield on coins.

Other physicality
The material used could also be the origin of currency units; thus guilder derives from the Dutch or German for golden (gulden) and continued to be applied even when the currency was no longer in gold. The Caribbean guilder is due to come into operation, replacing the Netherlands Antilles guilder as the currency of Curacao and Sint Maarten, in the Dutch Caribbean. The zloty of Poland also means golden. The som, used by the Kyrgyz Republic and Uzbekistan, means pure and implies pure gold.

Lübeck gulden 1341

The birr of Ethiopia means silver. The ngultrum, the currency of Bhutan, derives from silver bit.

The hryvnia of Ukraine comes from a word meaning mane, but might also have implied something valuable worn around the neck. The word later came to be associated with silver or gold ingots of a certain weight, but that seems to have flowed from its use as a monetary term.

The earliest source term for currency I am aware of is the daric, named by the original issuer after himself. The most immediately obvious current source-derived currency is the euro, which has replaced quite a range of currencies. Names of countries, or contractions thereof, are used by several countries as their currencies. The currency of Afghanistan is the afghani; that of Bolivia the boliviano; that of Lithuania the litas; that of Nigeria the naira, a contraction of Nigeria; that of Sierra Leone, the leone; that of Vanuatu, the vatu.

fiorino d’oro (florin) 1347

The first post-Roman gold coin minted in commercial quantities in Western Europe was the florin or fiorino d’oro, minted by the city of Florence. The florin is the currency of Aruba. The forint of Hungary is also derived from the fiorino d’oro. The solidus and the hyperpyron (super-refined) of the Eastern Roman Empire were also known as the bezant, after Byzantium, the original name of Constantinople.

Portuguese half real C15th

Ducat came from ducal; real means royal, and is still the currency of Brazil as well as the source for riel, the currency of Cambodia; rial, the currencies of Iran, Oman and Yemen; and riyal, the currencies of Qatar and Saudi Arabia. The ariary of Madagascar also derives from riyal. The Swaziland lilangeni means member of the royal family (i.e. royal).

The Czech Republic‘s koruna means crown, as does Denmark‘s krone, Iceland‘s krona, Norway‘s krone and Sweden‘s krona.

The original franc 1360

The franc originally meant free (and frank), and became associated with coinage through the inscription Rex Francorum (King of the Franks) on early coins. The original franc coin celebrated the freedom of Jean II, captured by the English at the Battle of Poitiers (so it is apparently a pun). Francs are the currencies of Burundi, Comoros, Democratic Republic of Congo, Djibouti, Guinea and Rwanda, plus Switzerland (and Liechtenstein); in the form of the CFA franc, it is the currency of France’s current overseas territories and of its former African empire, either as the West African CFA franc–the currency of Benin, Burkina Faso, Guinea-Bissau, Ivory Coast, Mali, Niger, Senegal and Togo–or as the Central African CFA franc–the currency of Cameroon, Central African Republic, Chad, Republic of Congo, Equatorial Guinea and Gabon.

The rand, the currency of South Africa, comes from the Witwatersrand (“ridge of white waters”), the ridge Johannesburg was built on, and where most of the country’s gold deposits lie.

About the dollar
Dollar is most famously the currency of the US. The formerly US-administered territories of the Marshall Islands, the Federated States of Micronesia and Palau simply stayed on the US$ after independence, while East Timor went straight onto the US$. Various local jurisdictions, such as the British Virgin Islands, also use the US$.

Countries sometimes “dollarise” because the local monetary authority proved spectacularly incompetent at managing the preceding local currency. Ecuador, El Salvador and Zimbabwe fall into the local-mismanagement category (and other countries have also “dollarised” at various times). There are also countries where the US$ is as acceptable as, or more acceptable than, the local currency.

Lots of countries have a dollar as their currency, apart from those already mentioned: Australia, Bahamas, Barbados, Belize, Bermuda, Brunei, Canada, Cayman Islands, Dominica, Fiji, Guyana, Hong Kong, Jamaica, Kiribati, Liberia, Namibia, New Zealand, Singapore, Solomon Islands, Suriname, Taiwan, Trinidad and Tobago and Tuvalu. Various Caribbean nations share the East Caribbean dollar. The tala of Samoa is dollar in Samoan. Most of these are former territories or protectorates of the British Empire–apparently, dollar is the preferred currency term for “not the pound (anymore)”.

Joachimsthaler 1525, the original dollar

Dollar may be the most widely used name for currency units (followed by franc), but it has a remarkably specific origin. In 1520, the Kingdom of Bohemia began to mint coins from silver mined in St Joachim’s valley, or Joachimsthal (modern-day Jáchymov). The coins became known as Joachimsthalers, which was shortened to thaler (thing or person from the valley), which became a very widely used coin name–most famously in the Maria Theresa thaler. Thaler became the Dutch daalder and the English dollar.

So, the tala of Samoa is actually closer to the original derivation than is the English dollar.

It was also a bit surprising to discover how large Rome and Portugal loom in Islamic currency names: between lira, dirham, dinar and the various derivations from real, the glory that was Rome and the brief Portuguese domination of the Indian Ocean seem to have left quite a monetary mark. Though it is less surprising given that Rome loomed so large in Islamic history, that the Ottomans had pretensions to being the Islamic successors to Rome, and that the current-day users of derivatives of real have a long history of not being keen on the Ottomans.


[Cross-posted from Thinking Out Aloud.]

The good people syndrome

By Lorenzo

I doubt that there is any more corrupting element in contemporary public debate than the good people syndrome: talking heads who say things, not because they have any knowledge or understanding, but because it is what good people say.

There are forms of it on a wide range of issues, and on all sides of politics, but it seems unlikely that the public debate about any issue is as thoroughly corrupted by the good people syndrome as that on Islam. 

Ignorant familiarity

Part of the problem is quite straightforward: Islam is a religion which is omnipresent in the news but absent in the shared experience of the overwhelming majority of Westerners. Furthermore, it is not merely a religion, it is also a civilisation; one with superficial similarities to our own but quite deep differences. Faced with the deadly combination of surface familiarity and deep ignorance, the good people syndrome fills the gap. Especially for modern secular folk, who generally just can’t take religious motives seriously. 

To take perhaps the most important difference: we in the West are children of Aristotle and Muslims are mostly not. We are generally not actual Aristotelians (though Aristotelian philosophy is currently enjoying one of its recurring resurgences within Western philosophy). But we do accept two basic Aristotelian ideas–that the world has its own inherent existence and structures, and that the moral realm exists independently of revelation.

These ideas may seem so basic one might wonder how anyone could think otherwise. Well, mainstream Islam thinks otherwise, for it accepts neither idea. This is a consequence of the defeat of Aristotelian ideas in mainstream Islam, particularly due to the efforts and influence of Abū Ḥāmid Muḥammad ibn Muḥammad al-Ghazālī (1058-1111), the most important figure in mainstream Islam after Muhammad himself.

For al-Ghazali, and mainstream Islam ever since, causation is merely the habits of God, which He can change at any time, while there is no good outside the realm of revelation. That is, things are good because God wills them, not–as in Christianity and Judaism, especially after Mosheh ben Maimon aka Maimonides (1138?-1204) and Thomas Aquinas (1225-1274)–God wills them because they are good. “Conversations” between the West and Islam are mostly dialogues of the deaf, because the underlying presumptions are so different.

The golden age of Islamic achievement largely predates al-Ghazali (and that of Arab achievement almost entirely does). Not entirely a coincidence, since causation as the habits of God and revelation as the limits of morality do rather inhibit intellectual effort being put anywhere other than religion. The shock of the Mongol incursions, including the end of the Baghdad Caliphate (1258), reinforced this inward-looking tendency, this entrenched atavism. An atavism that Arab journalist Hisham Melhem identifies as central to the contemporary collapse of Arab civilisation but for which he studiously fails to identify a source.


Islam became a civilisation remarkably uncurious about the outside world, poorly able to mobilise its resources. A civilisation which lacked responsive resilience, and so dealt badly with the challenges of history (as it largely still does, at least in the Middle East–Bengali and Malay Islam does rather better). Thus, Palestinian intellectual Ahmad Y. al-Hassan (1925-2012) can list a whole series of “bad things” which happened to Islam, but entirely fails to ask why Islam so persistently failed to rise to the challenges facing it. For example, Europe learnt far more from its (relatively minor) crusading effort (which al-Hassan paints as far more destructive than it was) than Islam learnt from its centuries of far greater aggression against Europe and Christendom (which al-Hassan entirely ignores), even after Islam began to fall behind European technology and organisational capacity.

Awkward avoidance

One can understand the dilemma of Arab and Muslim intellectuals. It is not merely that not blaming Islam is what “good people” do, it is that opening up that issue makes any such intellectual a target for the homicidally enraged who are both a symptom and a cause of Middle Eastern Islam’s cognitive stagnation and disastrous divisions.

One can understand the dilemma of Western strategists dealing with the jihadis: say that the problem is Islam and that appears to make all Muslims (over a billion of them) the enemy. Yet, say the problem is not Islam, and one is basing one’s strategy on untruth and delusion–not a basis for any sort of success. For the jihadis are very much a product of Islam: indeed, they represent the modern iterations of continuing patterns within Islam.

So the problem is within Islam. Not an ideal rhetorical formulation, but one that has the advantage of being true.

The good person pay-off

But neither of these excuses holds for Western talking heads. They are not responsible for Western strategy and are clearly in minimal danger from enraged jihadis. Alas, that not-being-responsible-for-anything is much of the problem: given the lack of any responsibility (except a clearly somewhat notional one to truth and understanding), aiming to be seen as one of the good people gives by far the best pay-off.

So ignorant nonsense gets spouted because it is established as what good people say.

I was confronted with a particularly egregious example of good people syndrome listening in a waiting room to some talking heads discuss the recent fatal (to the attacker) stabbing at a Melbourne police station. One of the talking heads opined about “disenfranchised youth”. The dead attacker (shot dead with a single bullet after stabbing two counter-terrorism officers at Endeavour Hills police station: a somewhat reassuring contrast to police killings in the US–i.e. not an unarmed man, not shot multiple times) fits in with a much larger pattern. The “disenfranchisement” of such homicidal males being that they are not–given their gender (male) and beliefs (Muslim)–master-belief overlords of what they survey, as promised by God through the Quran, the example of the Prophet and Sharia.

When Israeli PM Benjamin Netanyahu wanted to explain what the jihadis are about in his 29 September 2014 speech to the United Nations General Assembly, all he had to do was quote them. Starting with the self-proclaimed Caliph of the Islamic State, Abu Bakr Al-Baghdadi, two months earlier:

A day will soon come when the Muslim will walk everywhere as a master… The Muslims will cause the world to hear and understand the meaning of terrorism… and destroy the idol of democracy. Now listen to Khaled Meshaal, the leader of Hamas. He proclaims a similar vision of the future: We say this to the West… By Allah you will be defeated. Tomorrow our nation will sit on the throne of the world.

Or, perhaps, General Muhammad Ali Jafari, current commander of Iran’s Revolutionary Guards:

Our Imam did not limit the Islamic Revolution to this country… Our duty is to prepare the way for an Islamic world government…

Or Iran’s current Foreign Minister, Mohammad Javad Zarif, in a book written a few years ago:

We have a fundamental problem with the West, and especially with America. This is because we are heirs to a global mission, which is tied to our raison d’etre… A global mission which is tied to our very reason of being.

… How come Malaysia doesn’t have similar problems? Because Malaysia is not trying to change the international order.

Changing the international order to a Muslim order, of course. Such an order does not require everyone to be Muslim; just have the Muslims in charge and everyone obeying Sharia, the law of God, sovereign of all.

Such ambitions may seem mad–the master-race Nazis only wanted lebensraum; these ambitions are much more grandiose. But the Companions (Sahabah) of the Prophet overthrew the Sasanian Empire–heir to over a millennium of Zoroastrian empires–and half the Roman Empire in a few short decades. Ascribe the 1989-1991 fall of the Soviet Empire to the mujahideen in Afghanistan and the example of the Companions of the Prophet has powerful contemporary as well as religious resonance.

(As an aside, it is also worth remembering that in 1923 Hitler was a beer hall agitator, leader of a small movement, part of a coalition whose attempt to overthrow a provincial government was put down with almost contemptible ease: 18 years later, his armies had occupied Austria and the Czech lands, had conquered Poland, Denmark, Norway, the Low Countries, France, Yugoslavia, Greece and had reached the outskirts of Moscow.)

Besides, the journey itself is enough: die in the service of creating the Muslim World Order and off to Paradise you go. Not to mention a sense of brotherhood, purpose, masterly killing, plus possible rape and pillage on the way through. Hence Islam’s most obvious comparative advantage being in homicidal religious gangsterism.

But, hey, that is not what good people say.  And what they don’t know about Islam is almost everything.


[Cross-posted from Thinking Out Aloud.]

The eternal now of conservatism (3)

By Lorenzo

In my previous two posts, I looked at pieces by two conservatives–James Livingstone on gender and soldiering and Justice O’Scannlain on gender and marriage–who both imagine they are basing their reasoning on history and verities of human nature when they are doing nothing of the kind.

Sodom and genocide

In his 2013 lecture, Justice O’Scannlain alludes to the work of Robert George and associates on the nature of marriage, particularly in the context of US Supreme Court decisions such as Lawrence which, in the Judge’s words:

struck down a Texas criminal prohibition on homosexual sodomy

The term sodomy, or, as Robert George likes to write, sodomitical, alludes to the natural law interpretation (in fact, perversion) of Genesis 19, the story of Sodom and Gomorrah. The terms invoke killing people for their sexual practices. To use the term sodomy and its cognates is to invoke “the people God wants dead”, the people who should be dead–if not literally, at least to any definition of the human.


Ironically, in view of Justice O’Scannlain’s hostile invocation of “abstract theory”, that is precisely what is wrong with the natural law interpretation of Genesis 19, an interpretation that has since become traditional, at least in Christianity and Islam: the imposition of abstract theory to pervert understanding of the original text. The original rabbinical understanding of Genesis 19, based on oral tradition and close reading of Scripture, was that the sin of the Cities of the Plain was that they were anti-moral: that they actively punished those who looked after the weak and vulnerable. Being struck down by God’s wrath for this makes at least some sort of grim sense, especially if Genesis 19 is read as a rape scene attacking that most vulnerable figure–the guest from afar. For a social pattern of stripping the vulnerable of moral and legal protections can go on and on: as the history of the Catholic Church’s treatment of Jews and queers demonstrates.


What makes no sense is God destroying entire cities because He thought that butt sex was icky (especially as failure to engage in procreative sex means the “problem” goes away in a generation). But that is where the natural law interpretation of the Sodom story takes us–if only by doing some violence to the original scriptures. (Attempted rape no more invalidates same-sex activity than it does opposite-sex activity: and God had already decided to destroy the Cities before the apparent attempted rape of His messengers.) It makes entire sense if one’s role is to be gatekeepers of righteousness–for then the more bizarre and unexpected the demands of righteousness, the more you need said gatekeepers to tell you what they are.

Thus God “purifies” human society by killing the sexually divergent. As evidenced in the charming Jesus-the-genocidal story in the medieval bestseller compiled by a beatified Archbishop of Genoa, The Golden Legend. But we really should not be surprised by such a tale being part of Church literature; the terms sodomy and sodomitical explicitly invoke the notion that society is purified by the death of such persons, the “unnatural” committers of treason against the immanent purposes of God’s natural order.

Burning of Knight of Hohenburg & page for sodomy, Zurich, 1482

Norman Cohn famously labelled The Protocols of the Elders of Zion a Warrant for Genocide. Actually, the original warrant for genocide was the natural law interpretation of Genesis 19: the notion that society is purified by mass murder, by the slaughter of the different-and-vulnerable. Rather than, as in the original rabbinical interpretation, God particularly enjoining moral attention to, and protection of, the vulnerable. Natural law reasoning displaying its dark, and morally impoverishing, side.

Free-floating notions

Robert George and confreres base their arguments on claims about the nature of sex and the nature of marriage. As previously noted, the anthropologically defining aspect of marriage is that it creates in-laws: that is, it broadens kinship connections, it creates wider patterns of social support. That, along with the commitment to pooling effort and resources, is what makes marriage the preferred social mechanism for raising children–hence the anthropologically widespread practice of adoption. But to such natural law theorists, such anthropological evidence does not count. Much of the appeal of natural law theory is precisely the belief that one’s immediate apprehension of “the nature” of things is enough of a starting point.


There is also, in such writings, the perennial conflating, via the use of the terms procreation and procreative, of conception with child-raising (the bit marriage is actually useful for); a conflation which becomes bizarre in the significance given to acts of conceptive “form” even when actual conception is impossible. On the other hand, that such acts are in any way problematised–as George and Bradley implicitly admit in their 1995 paper–is itself a mark of how they make the human dance to a conception of the narrowly physical, so that structures of gonads become more important than the purposes of people.

That the approach problematises sexual activity so profoundly comes out when they write of, in said 1995 paper:

acts that might perform on each other’s bodies

A bizarre way to express giving another profound psychical and emotional joy. Pleasure, catharsis, bonding, expressing love: these profoundly human things are all imprisoned within the dictatorship of (the form) of conception. An impoverishing of erotic understanding which is also an impoverishing of biological understanding, since animals use sex in nature much more broadly than just conception–and the more cognitively complex the species, the more that tends to be true.


As for the notion that acts non-conceptional in form are an assault on the moral integrity of persons because they are mere instrumental use of oneself and another; that is just another manifestation of the aforementioned impoverished understanding. Not to mention one that would apparently make all soldiering (for example) inherently immoral, as generals regularly use soldiers in a quite instrumental fashion, to the point of expending their lives. The “preserving moral integrity” argument is just an attempt to make more palatable the underlying moral principle that gonads are more important than people. Successful only to the extent that, once again, human experience is ignored–particularly as queer people discover again and again, being open to themselves and others about their sexual nature is the path to psychological (and moral) integrity.

But, in such natural law reasoning, anything in human psychology, social arrangements or in animal behaviour that contradicts the assertion that the structure of gonads counts more than the purposes, aspirations and experiences of people does not count. It is a particularly striking example of the besetting sin of natural law reasoning–that the conclusion gets to set the ambit of its premises.

Useful for righteousness gatekeeping

Which makes such reasoning very attractive as a mechanism to buttress religious doctrine. As is fairly obvious in George et al talking of an “adequate reason” to have sex, thereby expressing monotheism’s perennial problematising of sex, which (in monotheism, but not animism or polytheism) separates us from, rather than connects us to, the divine–except via the creative function. Hence the rhetoric about the “unitive” nature of sex that is conceptive in form.


So, those who fall in love with members of their own sex are not entitled to have sex, except with someone they are not erotically engaged with, but never with someone they are. The structure of human psyches–millions upon millions of them–are subordinated to a pathetically narrow characterisation of a specific organ. Queer folk become just perverted mistakes, natural law theory says so: natural law theory which is allegedly based on the objective facts of human nature and existence, said objective facts excluding the existence of queer folk except as perverted mistakes. Their existence, aspirations, even experience, do not count as evidence.

But, apparently, evidence is not actually required. George et al are very big on the notion of intrinsic value, though they note that not everyone grasps such value:

people who fail to grasp the intrinsic value of such basic human goods ordinarily do not judge them to be valueless. …

If intrinsic value takes such special understanding to grasp, it seems a very unlikely basis for morality. But a very good basis for justifying the role of gatekeepers of righteousness. But we are not talking of something grounded in anything much:

Intrinsic value cannot, strictly speaking, be demonstrated. Qua basic, the value of intrinsic goods cannot be derived through a middle term. Hence, if the intrinsic value of marriage, knowledge, or any other basic human good is to be affirmed, it must be grasped in noninferential acts of understanding. Such acts require imaginative reflection on data provided by inclination and experience, as well as knowledge of empirical patterns, which underlie possibilities of action and achievement.

Except, as we have seen, great masses of experience and empirical patterns do not count. The conclusion gets to set the ambit of its premises, where experiences and aspirations contrary to those “imaginative reflections” are discounted. Which do not turn out to be very “imaginative” at all, but, in fact, profoundly impoverished.

So narrow as to be not reasonable

It is clear that for Justice O’Scannlain, and for Robert George and his collaborators, by reason is meant what I am aware of and pay attention to. Since what they say is based on “reason”, what they do not know does not count and they need not enquire into it. So the invocation of reason becomes a commitment to ignorance and to ignoring. Hence being highly selective about whose experience, and whose voices, count.


Understanding the past requires not imposing our own preconceptions on it. Human nature is that which encompasses all humans, not just a selected subsection thereof. Tradition has to be judged in its context, history is wider than what is congenial or convenient. As is experience. Social arrangements are adaptations to circumstances, not magically grounded in verities of human nature. Merely waving around the words history, experience, tradition, reason, does not mean that you actually understand the first three, or are properly using the last.

And if history is based on a “fixed” human nature, but only some history counts, then those whose history does not count do not get to be part of what defines human nature. They get to be defined as outside the “properly” human.

James Livingstone, Justice O’Scannlain and Robert George all mistake historical contingencies for verities of human nature; they all invoke the “eternal now” of conservatism. An invocation far more marked by willful ignorance than understanding.

Given the history and dynamics of monotheism–and natural law reasoning within monotheism–it is not surprising that matters of sex and gender should operate in such a way. (Especially for Catholic conservatives.) But what we want to see can be a very unreliable guide to what is. Pioneer sociologist Emile Durkheim‘s explanation of the sexual division of labour was remarkably patronising of women:

According to his theory, among the very primitive (both in the distant past and today) men and women are fairly similar in strength and intelligence. Under these circumstances the sexes are economically independent, and therefore “sexual relations [are] preeminently ephemeral”. With the “progress of morality,” women became weaker and their brains became smaller. Their dependence on men increased, and division of labor by sex cemented the conjugal bond. Indeed, Durkheim asserts that the Parisienne of his day probably had the smallest human brain on record. Presumably she was able to console herself with the stability of her marriage, which was the direct result of her underendowment and consequent dependence.

Apparently, it took a female anthropologist to put the pieces together.  Contrast the above with the key passage in anthropologist Judith Brown’s 1970 note on the division of labour by sex:

Women are most likely to make a substantial contribution when subsistence activities have the following characteristics: the participant is not obliged to be far from home; the tasks are relatively monotonous and do not require rapt attention; and the work is not dangerous, can be performed in spite of interruptions, and is easily resumed once interrupted.

There is no necessary connection between what is congenial and what is true. Hence this is what happens when previously ignored or excluded perspectives get to have their say. We learn things and our understanding is broadened. But not if we invoke history, tradition and reason to block doing so under the delusion that the resultant “eternal now” is clear-eyed justification for anything much, beyond a certain smug, ignorant, self-righteousness.

Broadening moral understanding

Jonathan Haidt has argued that conservatives tend to have a broader range of moral foundations than do progressives (pdf). George and his confreres clearly believe that they have a profounder moral grasp than do supporters of same-sex marriage. But one is much more struck by how impoverished their viewpoint is, not merely in the sense of being factually impoverished (though it is profoundly that) but also morally impoverished in the lack of awareness, or active disregard, for the wider human implications of what they argue for.


It is beyond the capacity of public policy to change human sexuality, but it can easily punish the vulnerable for being different. Treating people as being outside the “properly” human has dire consequences for family dynamics, for human relationships and human lives generally. Hardly surprising, as the point of morality is to permit us to live together in much richer lives than would otherwise be possible: so naturally, reducing the moral standing of, or excluding from moral standing, entire categories of people blights lives. But if said categories of people are outside the properly human, their lives and experience do not count; at least not enough to change moral understanding.

Which is precisely why the arguments of George et al are losing. Because, in the words of Ghanaian philosopher Kwame Anthony Appiah:

The increasing presence of “openly gay” people in social life and in the media has changed our habits. And over the last 30 years or so, instead of thinking about the private activity of gay sex, many Americans and Europeans started thinking about the public category of gay people.

Or, in other words, that people are more important than gonads. This ongoing shift in opinion may represent a narrowing of what acts are regarded as morally significant, but it represents a broadening of who is accepted as fully human, as a fully legitimate manifestation of the human, as enjoying therefore the full protection of morality and the law. And that is a profound moral advance: not a loss of moral understanding, but an expansion of it.


 [Cross-posted from Thinking Out Aloud.]

The eternal now of conservatism (2)

By Lorenzo

Catholic writer James Livingstone (see previous post) is hardly the only conservative writer who sees inherited social arrangements as based in verities of human nature rather than contingent historical circumstances.

Not counting as human

This notion of social arrangements as being rooted in verities of human nature, not the contingencies of history, can have a very dark aspect. At its worst, it can exclude the historically disenfranchised from being regarded as human, or at least, as “properly” human. This dark aspect lingers very close to the surface in a 2013 lecture by US Federal Judge, The Hon. Diarmuid F. O’Scannlain. 

In his discussion of the contemporary implications of Justice Joseph Story‘s (1779-1845) natural law jurisprudence, Justice O’Scannlain writes of:

the philosophical blindness of abstract theory detached from experience, tradition, and the very nature of man.

What if experience and tradition contradict each other? Why does history not include the experience of the excluded or repressed? The notion that history, experience and tradition form a mutually supporting triad can only be maintained at the cost of significant, highly selective, editing of both history and experience. The selection processes of history are very far from being morally pure, or morally reliable, hence tradition can be a very dubious guide, especially if circumstances–particularly technology and knowledge–change. If we are to give credence to the past, we need to give credence to all of it.


Justice O’Scannlain sees the natural law tradition as providing grounding for principles of justice and morality in human nature:

If there are universal principles of justice … then those universal principles must exist by virtue of what it means to be human, and if there is no such thing as a stable human nature, then there can be no such universal principles.

Of course there are principles of justice–the first of which is that people count as persons. The problem with the alleged “natural law” is precisely that it is typically conceived in a way such that various categories of people, their experience and history, are deemed not to count.

Justice O’Scannlain draws attention to, and critiques, the so-called “sweet mystery of life” passage in the US Supreme Court decision Planned Parenthood v Casey wherein the majority opined that: 

… at the heart of liberty is the right to define one’s own concept of existence, of meaning, of the universe, and of the mystery of human life. Beliefs about these matters could not define the attributes of personhood were they formed under compulsion of the State.

This reads as an attempt by the US Supreme Court to acknowledge that a wider range of voices and perspectives had entered into the public arena than had much purchase there in the past. Not so much new voices as the voices of the previously repressed. Justice O’Scannlain writes that:

The passage does not necessarily deny that there is an objective human nature, but it insists that the law cannot reflect a particular conception of human nature. As the Casey passage says, each of us is to decide for ourselves what defines our existence and the mystery of life.

He is much concerned with the loss of a fixed notion of human nature. But the question at issue is not about fixed notions of human nature, but preconceived ones–not the same thing at all. All humans are part of defining human nature, not just the previously socially advantaged. The real fight here is not over objective conceptions of human nature versus malleable ones, as Justice O’Scannlain claims, but between a narrow and a broad view of human nature. Justice O’Scannlain does not wish to acknowledge–nor have the law acknowledge–a diverse human nature, that sexual and gender diversity is also part of human nature: not as abstract theory, but as simple human reality.


Justice O’Scannlain writes of the US Supreme Court decision United States v. Windsor, which struck down parts of the Defense of Marriage Act (DOMA), that the decision:

affirmatively declares that there is no objective reality to marriage and that any contrary view is irrational. This goes a long way toward ultimately declaring that the objective view of human nature is itself devoid of reason.

Anthropologists have found only one feature of marriage common across all human societies–that it creates in-laws. Marriage is a social creation and exists in varied forms across human societies. Of course that leaves it open to particular societies to define it in particular ways. Western societies have never had exactly the same conception of marriage and have changed their conceptions of marriage in various, sometimes dramatic, ways across the centuries: notably the abolition of coverture marriage.

Just as does Kenneth Livingstone, Justice O’Scannlain provides an invocation of history and tradition wildly lacking in any sense of history and ignorant of its own tradition. In his use of a mythic, ahistorical (indeed, metaphysical) notion of marriage, “objective” is being used to support the pre-conceived, but in a way which dramatically floats free from actual history, rather than, perhaps somewhat more surreptitiously, editing it conveniently.

In discussing the US Supreme Court Decision Lawrence v Texas, which struck down a Texas sodomy law, Justice O’Scannlain writes:

Lawrence was content to minimize the importance of pre–sexual revolution history. Windsor, after acknowledging that the conjugal definition of marriage has existed literally “throughout the history of civilization,” minimizes this highly significant fact in order to discuss the “new perspective” of same-sex marriage.

The sexual revolution responded to changed circumstances (particularly female control over fertility due to the contraceptive pill) and voices previously repressed–often with considerable brutality–being able to be heard. 


Technology and knowledge had changed. And again, history does not tell us quite what the good Justice believes. If the US Supreme Court took the view that same-sex marriage has no history, then it was engaging in bad history. So much of the contemporary debate over marriage, but particularly conservative invocations of human nature, is about being highly selective about what history counts, and whose history counts. Or simply being ignorant of history.

And if history is based on a “fixed” human nature, but only some history counts, then those whose history does not count do not get to be part of what defines human nature. They get to be defined as outside the “properly” human; and the notion of “proper” and “improper” forms of the human never leads to good places. Hence natural law reasoning, when it treats history and social arrangements as reflecting eternal verities of human nature rather than the contingencies of history, can have very dark implications (as I will explore in my next, and final, post in this trilogy).

[Cross-posted from Thinking Out Aloud.]

The eternal now of conservatism (1)

By Lorenzo

In a paper on how to reliably measure political (i.e. economic and social) conservatism, psychologist Jim Everett makes a useful distinction:

authoritarianism and conservatism are distinct because authoritarianism focuses on aversion to difference across space (i.e. diversity of people and beliefs at the present time), while conservatism reflects aversion to difference over time (i.e. change). As such, there is no logical connection between the two, even if they often co-occur in practice.

They often co-occur in practice because societies have frequently used mechanisms to minimise diversity of people and beliefs, particularly in specific social roles. If those mechanisms weaken, or are explicitly challenged, that will lead to change across time (likely to be resisted by conservatives) as well as more diversity across space (likely to be resisted by authoritarians). So, conservatives will find themselves committed to blocking diversity and authoritarians to resisting change.

The more one is aware of the diversity of human arrangements, the less likely one is to be discomforted by diversity or change. Which is to say cosmopolitanism–awareness of the contingency of human arrangements–tends to be antithetical to both conservatism and authoritarianism. Especially as it generates a broader view of what can be made to work–making both diversity and change seem less threatening–and a broader view of what aspirations and capacities people have or aspire to.

Patrilineal plays

Not all human societies have, for example, been patrilineal (male-line descent). Plough-based farming societies and herding societies are, for entirely understandable reasons. As anthropologist Judith Brown’s 1970 note on the division of labour by sex explained, women have tended to do activities compatible with child-minding, men those that were not. More specifically:

Women are most likely to make a substantial contribution when subsistence activities have the following characteristics: the participant is not obliged to be far from home; the tasks are relatively monotonous and do not require rapt attention; and the work is not dangerous, can be performed in spite of interruptions, and is easily resumed once interrupted.

Neither herding nor plough-based farming were compatible with child-minding, so men did them. Which meant that men left the main economic asset (land or animals) to their sons (i.e. patrilineally), which meant that males controlled the dominant economic asset. So, women moved in with their husband’s family (i.e. such societies are overwhelmingly patrilocal).

Since maternity is certain but paternity is not, it became important to both control, and be seen to control, the fertility of women. The women of the husband’s family would help police the respectability of wives: the “damned whores or God’s police” phenomenon. (Some societies were bilateral–both male and female descent counted–but such societies were still mostly patrilocal, since patrilineal descent counted.) Families were expected to reliably deliver the controlled fertility of their daughters to any husband’s family: the dynamic which is the basis of (dis)honour killings. 

Needless to say, all this involved massive discounting of the decision-making rights and capacities of women, though the degree to which they did so varied considerably between societies.

Matrilineal alsos

But there have been matrilineal and matrilocal societies. Hoe-based farming is compatible with child-minding. So, in many such societies, women did the farming and left the land to their daughters. Husbands would move in with their wife’s family. As plough-based farming is more productive than hoe-based farming, and herding was the only significant alternative to farming for complex societies, patrilineal-and-patrilocal arrangements came to dominate human history. Making such arrangements seem “natural”. But they are simply historically contingent reactions to circumstances. As circumstances change, so will such arrangements.

For example, increased prosperity makes neolocal residence (husband and wife moving into their own home) much more common, which inhibits supervision of women by their husband’s relatives. Increased prosperity increases non-parental child-minding. Technology increases the demand for female labour. Most dramatically of all, the development of the contraceptive pill gives women safe (especially compared to pregnancy), cheap, reliable control over their own fertility.

All this dramatically weakens previously operating selection mechanisms directing women into particular social roles. It turns out–surprise!–that women have broader aspirations than those they were previously channelled into. Hence the development of the “women in all walks of life” phenomenon.

But if you think that those patrilineal-and-patrilocal mechanisms represented “natural” society, and operated for reasons which are inherent in the nature of the human, and not in contingent social circumstances–in other words, operated according to some proper “eternal now”–then all this can be a touch confronting.

As this piece on gender and warfare illustrates nicely. Catholic writer Kenneth Livingstone makes some perceptive points on why the Islamic State finds it so easy to recruit young Muslim men from all over the globe. (This piece by Danish psychologist Nikolai Sennels provides a supporting perspective.) It is when Livingstone comments on his own society that his ignorance of the contingency of human social arrangements, his commitment to the “eternal now” of conservatism, becomes most striking.

Utterly precedented

He finds the US Army’s acceptance of women and queer soldiers to be

unprecedented social experiments …

Which is precisely what they are not. Many societies had female warriors. About one fifth of Scythian and Sarmatian warrior graves are of women. (Possibly, the source for the Greek legends of Amazons.) Out in the steppes, restricting fertility had value, since it took a lot of grass to support animals sufficient to support one human and grasslands could not be made larger or more productive by human effort, given the technology of the time. Moreover, the distances were vast and the herding-men were often absent from camp: it made sense to teach women to fight, if only to defend the camp and the children. Once you did that, using young women as scouts and archers in raiding parties, and even armies, was a natural step, which was taken.

Low-population-density societies often made the same decisions: hence warrior women being a feature of both Celtic and Germanic societies. So, no, it is not “unprecedented” to have warrior women. Indeed, given that Celts, Germanics and Indo-Europeans generally are the source cultures for Britain, and those societies descending from them, it is part of the deeper history of Mr Livingstone‘s own cultural origins. Alfred the Great‘s daughter Æthelflæd, the Lady of the Mercians, was a noted war leader, for example. If, however, you live in an “eternal now”, it will not even occur to you that such history exists or, indeed, is possible (except as an “unprecedented social experiment”).

As for queer soldiers, since most human societies have not thought such matters to be of much significance, there have been plenty of queer soldiers in history. The Spanish, for example, recorded their startlement at finding “sodomite” warriors being both accepted and effective in Amerindian societies. (In accordance with their own understanding of higher morality, they burnt such warriors at the stake if they captured them.) The Sacred Band of Thebes provides a particularly striking example of queer soldiers. The Greeks thought same-sex love interest so entirely compatible with the heroic warrior that localities would boast of which particular local young man was one of Heracles‘s lovers. While same-sex mentoring, including sexual relations, was an entirely accepted part of Spartan culture: particularly apposite, given that both kingly families claimed descent from Heracles.

That Western armies, under the influence of monotheism (despite much rationalising to the contrary, the ban had no other basis), prohibited same-sex relations between soldiers indicates that queer men were certainly willing to be soldiers. As, clearly, women were and are, if given the opportunity. The question comes back to what selection/control mechanisms were operating (and why).

Livingstone is too sophisticated to simply go for “women and queers being soldiers is icky” and so makes a somewhat different claim:

Well, yes, it can be a bit confusing when the person in charge of manhood training is a woman. It’s not a question of competency, it’s a question of gender roles.

Actually, no it is not. Being an effective warrior has no necessary connection to gender roles, unless a society chooses to construct itself in that way. There is nothing in the human which requires that to be so.

Basically, Livingstone’s claim is that (straight) boys can’t cope if they have to share. Which is nonsense on stilts. The notion that Scythians, Sarmatians, Celts, Amerindians, Spartans etc were ineffective warriors because they fought with women and/or queers simply shows flagrant ignorance of history. But such ignorance is what the “eternal now” of conservatism so often proves to not only be based on, but to actively require.

Gang games

Middle Eastern oasis-herding–being far more localised–was different from steppe-herding in that it lacked incentives to arm women and was entirely compatible with their highly-controlled cloistering. The more women are blocked from participating in public life, or power-activities such as fighting, the more their standing in that society is weakened. It is no accident that women had much higher status in Celtic, Germanic and steppe societies than in the Middle East. Nor is it likely an accident that women had higher status in the steppe-origin monotheism (Zoroastrianism) than they did in the Middle-Eastern origin monotheisms (Judaism, Christianity and Islam).

Hercules and Iolaus, with Eros between them, C4thBC.

In fact, the ability to utilise talents across its societies is one of the West’s great advantages. Livingstone can see the male-gang aspect of Islam but, because he looks at gender and sexuality as part of an eternal now, as rooted in verities of human nature rather than the contingencies of history, he is blind to where the West’s advantages lie. A fragile masculinity which cannot share, and so must be pandered to at the cost of the restriction of others, is not the basis of a resilient society. On the contrary, acceptance of the values of self-expression and tolerance of diversity are strongly correlated with better governance (and higher incomes).  

Understanding the past requires not imposing our own preconceptions on it. Hence archaeology is finding more evidence of women buried with weapons now that archaeologists have stopped insisting that anyone buried with a weapon had to be a man. Lost in his “eternal now”, Livingstone cannot see the present because he does not see the contingency of history, and so of social arrangements.


[Cross-posted from Thinking Out Aloud.]

In a nutshell

By Lorenzo

I can summarise my previous long post on the Middle East very simply: Palestinians are currently Jim Crow-era Southern blacks for whom the only sufficiently acceptable peace outcome is that they become Jim Crow-era Southern whites.

Given that Israel lacks the power to change the Palestinian mindset–which predates the creation of Israel–and the Palestinians lack the power to achieve their consensus peace-through-overlordship, the conflict will just go on and on.

ADDENDA And this is an example of what I mean.

[Cross-posted from Thinking Out Aloud].

What a difference framing makes

By Lorenzo

The current turmoil in the Middle East is, amongst other things, one long vindication of Zionism. Given that, to many folk, Zionism has no meaning other than “Israelis being nasty to Palestinians”, some explanation is in order.

European Zionism
Zionism was founded by Viennese journalist Theodor Herzl (1860-1904). Observing the trial of French officer Alfred Dreyfus and the rampant Jew-hatred in the French Republic–the land of liberty, equality, fraternity–Herzl concluded that Jews were not safe in Europe, and so conceived the notion of a Jewish homeland as an active political project. A place of defence and refuge.

Theodor Herzl

Of all modern ideologies, none had its founding proposition more thoroughly vindicated than Zionism, as the Holocaust proved that, verily, Jews were not safe in Europe. The lead-up to the Holocaust, as countries such as the US and Australia declined to be the mass Jewish refuge required, helped to energise Zionism. The horror of the aftermath of the Holocaust, in a broken Europe, did so exponentially more.

Hence the post-war migration of European Jews to the British Mandate of Palestine and the UN-sponsored division of Palestine into a putative Jewish and Arab state. Then followed the refusal of Arab states to accept either a Jewish or a Palestinian state. Thus the Israeli War of Independence (Nov. 1947-July 1949) and the creation of the state of Israel. The war included a mass exodus of Palestinians, the most dramatic event of which was the Deir Yassin massacre of 107 Palestinian civilians by Zionist paramilitary groups, condemned at the time by official Israel, although it was clearly operationally useful in consolidating a Jewish-majority state.

This was followed by the 1956 Suez War, where Israel occupied and then returned territory. Then the 1967 Six Day War, which left Israel with the Golan Heights, the West Bank, Gaza and the Sinai. All territory it declined to return. Then the War of Attrition (1967-70). Then the 1973 Yom Kippur War. Then the 1979 Egypt-Israel Peace Agreement, whereby Sinai was handed back to Egypt. The one and only time that Israel relinquishing territory has resulted in a peaceful border.

Israel was also drawn into the Lebanese Civil War (1975-1990). This included the South Lebanon Conflict (1985-2000), which resulted in an Israeli withdrawal from Southern Lebanon. This strategic Hezbollah victory has since led to further conflicts across that border between Hezbollah and Israel, most notably the 2006 Lebanon War. In 2005, Israel unilaterally withdrew from Gaza and dismantled all Jewish settlements therein. Hamas subsequently took over Gaza, and used it as a base from which to attack Israel, leading to the 2008-9 Gaza War and the 2014 Gaza War.

So, Israeli experience over the last 15 years is that territory they withdraw from is used as a base to attack them. Not exactly an encouragement for further withdrawals.

This is the framing (minus the point that withdrawal means attacks) that people are generally somewhat familiar with. Israel as “alien” European settler state “imposing itself” on the Middle East. Israel as Jewish apartheid lebensraum state. It is the framing that the mainstream media overwhelmingly uses–Israeli politics get reported, Palestinian politics very little; Israel acts, Palestinians suffer. Happenings that conflict with the framing are ignored or glossed over.

Middle Eastern Zionism
Except that framing, and even the above history, leaves more than half the story out. The story of Middle Eastern Jews. The story of Palestinian politics.

Palestinians killing Jewish civilians is an enduring picture of Palestinian terrorism; something that used to do a great deal to harm the Palestinian cause in the West and still does something to maintain support for Israel. It is often presented as “asymmetrical” warfare–the weapon of the weak (Palestinians) against the strong (Israel).

The only problem is that massacring Jewish civilians was an Arab/Palestinian tactic back when Jews were a minority in Mandatory Palestine and there was no Jewish state. Such as the 1929 Safed riots (18-20 Jewish dead), the 1929 Hebron massacre (67 Jewish dead and the end of Jewish presence in the city) and the 1939 Tiberias massacre (19 Jewish dead).

Jewish militias also got into the killing game, though they often targeted the occupying British authority (police, soldiers): most famously in the King David Hotel bombing. The aforementioned Deir Yassin massacre was part of a series in both directions during the Israeli War of Independence. On the Jewish side, the killings hardened the determination to form Israel.

Muslim and Armenian quarters just after the 1909 Adana massacre.

There was also a wider history of riots, pogroms and massacres of Jewish civilians in Arab [Muslim Middle Eastern] lands well outside Mandatory Palestine, such as the 1934 Thrace pogroms which saw 15,000 Jews flee the area, the Farhud massacre in Baghdad in 1941 (over 180 Jewish dead), the 1945 anti-Jewish riots in Tripolitania (over 140 Jewish dead), the 1947 Aden riots (82 Jewish dead), the 1947 Aleppo riots in Syria (75 Jewish dead), the 1948 Cairo bombings (70 Jewish dead), the 1948 riots in Morocco (43 Jewish dead), the 1948 riots in Tripolitania (13-14 Jewish dead, 280 Jewish homes destroyed). All of which helped motivate Jews fleeing en masse from Arab lands to Israel and the West. About half of Israeli Jews are of Middle Eastern origin. Israel is very much a refuge for Middle Eastern Jews, not merely European ones.

Especially when the massacres are put in a wider context. For example, that Jews in Iran could not touch merchandise before purchasing it, as that made the merchandise “unclean” for believers. Or that massacre of non-Muslims had been very much a feature of recent Middle Eastern history.

The 1933 Simele massacre, which killed at least 600 Assyrians in Iraq, was a relatively mild instance. The Hamidian massacres of 1894-6 killed at least 80,000 Armenians, Assyrians and other Christians. The Adana massacre of 1909 killed another 20,000 or more. They were a prelude to the massacres during the Great War: the Armenian (1.5m dead), Assyrian (at least 275,000 dead) and Pontic-Greek (at least 750,000 dead) genocides.

The broader pattern was fairly clear–when some real or apparent enemy was giving Muslims a hard time, the local group allegedly connected to the same would be subject to massacre. Hence the upsurge in killing of Jews during the Israeli War of Independence. Mass homicide-as-protest hardly encouraged the targeted group to stick around if it had a place of refuge available.

The increasing intrusion into the Middle East of Western norms about equal protection of the law during the C19th made previous patterns of subordination seem increasingly burdensome and unacceptable to non-Muslim groups. One response was the development of Arab nationalism, which sought to provide a common identity for Arabs across religious persuasions; hence the importance of thinkers of non-Muslim or of heterodox Muslim origin in its development: such as Michel Aflaq, George Antonius, Butrus al-Bustani, George Habash, Ameen Rihani, Ibrahim al-Yaziji, Constantine Zureiq (all raised as Christians), Shakib Arslan (Druze) and Zaki al-Arsuzi (Alawite).

Jewish population in various Arab countries.

But there was a deeper pattern, which connects to the notion of the dhimmi. The non-Muslim who was permitted to reside in dar al-Islam, the land of Islam, but only if they accepted their subordinate status. Any agitation for equal rights broke the Pact of Umar and made them targets. The Armenian agitation for equal rights and equal protection of law was very much part of the lead-up to the Hamidian massacres.

But the creation of a Jewish state was seen as an offence against Arab nationalism as well, so Jews were not included in this common Arab identity. All of which gave a Zionism thoroughly grounded in the realities of the Middle East every bit as much emotional power and justification as European Zionism. Hence the flight from Arab lands to the West and to Israel as soon as it was established. It is quite false to see Zionism as merely an alien intrusion into the Middle East. As American Jonah Shepp nicely puts it:

Palestinians who say that “the Zionists” must go but “the Jews” can stay need to come to grips with the fact that Zionism, at its core, is about creating a space where Jews do not need someone else’s permission to live.

Death and rejectionism
What is striking about the Palestinian tactic of targeting Jewish civilians is that it has never worked. Not as an anti-Israel strategy. It helped energise the creation of the Jewish state, maintain support for it, and create deep scepticism about the possibility of peace with Palestinians, leading to overwhelming popular support within Israel for the recent Gaza war. As an anti-Israel strategy, it has been massively counter-productive.

Except that the tactic of targeting Jewish civilians has worked very well. As an anti-Israel strategy, it has been utterly hopeless. As a political strategy within Palestinian politics, it has worked again and again. Which is why it is persisted with. If Palestinians kill Jews, they are heroic fighters against Israel. If Israel retaliates and kills Palestinians, it demonstrates the murderous evil of Israel. In Palestinian politics, violent death is a win-win.

Leading to a Palestinian politics divided between corrupt ex-terrorists (Fatah) and explicitly genocidal current terrorists (Hamas). (An English translation of Hamas’s founding Covenant is here.) Noting that genocide is not an alien intrusion into Middle Eastern politics: on the contrary, the Muslim Middle East has been a pioneer of modern genocide.

As the Islamic State is currently reminding us. As did the Islamist regime in Sudan with Darfur.

Overlords or victims
Maronite-dominated Lebanon, Alawite-dominated Ba’athist Syria, Sunni-dominated Ba’athist Iraq and Israel were all manifestations of the same logic–if you are a minority group in the Middle East, you are either overlords or victims. So, better be overlords and control the state.

Except Israel is the great exception–Jews are very much a majority within Israel. It is also the only stable democracy. Because Jews are very much a majority within Israel. It has not collapsed into civil war. Because Jews are very much a majority within Israel. Israel is a majority Jewish state, and that makes a profound difference.

Amin al-Husseini

When the Jews first started migrating in serious numbers to Ottoman-controlled Palestine, they brought capital with them, raising the value of land and labour in the area. Under British rule, the trend accelerated. The rising wages and commercial opportunities encouraged movement of non-Jews to the area. A proportion of current Palestinians only have any connection to the land of Palestine at all because the Jews came. Which is why the definition of a Palestinian refugee is anyone who lived in Palestine for two years or more prior to May 1948, or is descended from someone who did. (Pausing here, the notion of multi-generation hereditary refugees is an interesting one in itself: but Israel accepted and integrated its Jewish refugees; the Arab world prefers to keep the Palestinians as permanent refugees, stateless sticks to beat the Zionist entity with–it is easier for a Palestinian to become a citizen of the US or Australia than of most Arab countries.)

There were those in the Palestinian community who thought cooperation with the newcomers who were bringing opportunities with them was worth exploring. But the influx of new workers and rising wages undermined the debt bondage which was the conventional form of subordination to the landlord class in the area. Jews-as-enemies suited both religious leaders–operating as gatekeepers of righteousness–and a landlord class threatened by modernity. Hence the role of Haj Mohammed Effendi Amin el-Husseini (c.1897-1974), the Grand Mufti of Jerusalem, in setting the dominant and enduring model of Palestinian politics of death and rejectionism.

One that proved to have enduring appeal in the wider Arab world–the Zionist entity became the perfect scapegoat, the way to redirect anger from the failings and oppressions of Arab regimes. Or just Arab failings generally.

Not on offer
To concerned Westerners, and the dwindling Israeli Left, it seems obvious that the only moral solution is for Israel to accept Palestinians as worthy of equal protection of the law. As Israeli journalist and writer Noam Sheizaf declares:

The two-state solution is not a progressive cause and neither is a single-state solution — they are just possible means to an end. The only possible goal for progressive politics in Israel/Palestine can be full human, civil and political rights for everyone living on this land. …

I support equal rights for all people living in this land, between the Mediterranean and the Jordan River.

Or as American Jonah Shepp writes:

The Israeli right remains convinced that the Palestinians must learn to accept Israel before the occupation can end. That is about as convincing as someone claiming in 1960s America that the end of segregation would have to wait until black people stopped resenting white people. Peace is nearly always made between leaders before it is made between peoples. Israel is no exception to this rule; claiming otherwise just avoids the issue. And Israel must take the lead on this, precisely because the balance of power is so lopsided.

In fact, the problem is much deeper than accepting Israel. It is accepting. Accepting Jews, Christians, Shia, Yezidis, Alawites, Kurds …  The Middle East has been a pioneer of, and continuing arena for, genocide not because Jews exist, nor because the Jewish state exists, but because the idea of different-but-equal has so little resonance. After all, Jews are not even close to being the majority victims of genocide in the region, which became a modern pioneer of genocide well before the Jewish state seemed a serious prospect. What is different about the Jews is that they are the only local minority that has found an effective solution (thanks crucially to the influx and skills of European Jews). Although the Kurds would like to copy them. With the Peshmerga having the role of the Israeli Defense Force (IDF). Aspirations that have official and popular support in Israel.

Israeli PM Ehud Barak made a two-state offer to Yassir Arafat at the 2000 Camp David summit. Arafat made no counter-offer. And so the entrenched pattern of Palestinian politics continued. It is all very well for folk to say Israel must take the lead, but Israelis have no reason to believe that any peace deal is possible. As Israeli journalist Matti Friedman points out,

The fact that Israelis quite recently elected moderate governments that sought reconciliation with the Palestinians, and which were undermined by the Palestinians, is considered unimportant and rarely mentioned.

Without death and rejectionism, what is there to Palestinian politics and Palestinian identity? Not enough, apparently. The demand to be overlords, without the power to make themselves such, bedevils Palestinian politics.

Which leaves Israel with what exactly? To relinquish territory it can expect to be attacked from? It cannot unilaterally declare peace. And retaliation may feed the dynamic of death, but so does not retaliating, while also reverting Jews to the role of powerless victims that Israel exists to deny. The murderous civil wars of Sudan, Lebanon, Syria and Iraq demonstrate that stopping being a Jewish-majority state is the road to disaster. The continuing existence of the IDF is not negotiable.

Or does Israel just decide to grab what it can when it can? Not an admirable position, but an understandable one.

“Correct” moral responses are “obvious” if you frame the issues narrowly enough. But Israel lives in the complete Middle East, not a conveniently packaged version of it.


[Cross-posted from Thinking Out Aloud.]

Ahistorical pomposity and gnostic sneering: why academics write deep crap about “neoliberalism”

By Lorenzo

Humanities and social science academics write a remarkable amount of nonsense about “neoliberalism”, typically understanding neither the reasons for the general shift in public policy nor the motivations and ideas behind it.

A nice example of such nonsense is provided in a post by philosopher Robin James:

neoliberals think everything in the universe works like a deregulated, competitive, financialized capitalist market.

No one believes this. For a start, no “neoliberal” believes the state works like that. Nor do they advocate abolition of the state–anarcho-capitalism is not a widely held position, particularly not among policy wonks, policy advisors or policy-setting politicians (what we might call policy makers). It is one thing to be struck by how remarkable it is that there is any economic order at all, it is quite another to think such is the template for all that there is.

I am aware of the ideas, constraints, and reasonings involved in the “neoliberal” policy shift because I was involved in “neoliberal” policy advocacy. I moved in such circles, read the literature, even wrote minor bits of it. So, I am familiar with it, including the policy context that policy makers have been wrestling with.

I do not much like the term neoliberal. It is usually used as a “boo” word, and boo words are always analytically suspect. Moreover, the term is often used in a very unclear way, in meaning and scope. Is it a period of history? Is it an ideology? Is it a policy program? If so, what specific policies?

To the extent the term has a useful meaning, neoliberalism is economic liberalisation in the context of an expansive state: either the welfare state (in the developed world) or the development state (in the developing world). Key underpinning ideas in the original “neoliberal turn” include Milton Friedman’s rehabilitation of monetary economics (pdf) and his critique of (pdf) policy reliance on a presumed trade-off between inflation and unemployment, Friedrich Hayek’s analysis of the uses of knowledge in society, the development of public choice theory feeding into (pdf) the analysis of rent-seeking (pdf), Ronald Coase’s development of (pdf) the concept of transaction costs and its application to property rights (pdf), and the development of supply-side ideas (pdf). The Lucas critique (plus rational expectations) and Fama’s efficient-market hypothesis came along a bit too late to have much influence on the original “neoliberal turn”.

What these key ideas have in common is that they cast strong doubt on belief in the omni-competent state, either directly or by comparison with market-based alternatives. Hence the “neoliberal trifecta” of corporatisation (restructuring of state institutions), privatisation (transfer or creation of property rights) and de-regulation (reduction of transaction costs). Plus the adoption of inflation-targeting by central banks, as a way of operationalising their responsibility for inflation as a monetary phenomenon. The critique of the widely assumed omni-competence of the state also encouraged taking gains from trade more seriously, while the policy premium for economic efficiency (see below) put the issue of opportunity costs in sharper policy focus.

The policy debates in which “neoliberals” have been engaged have been very much about boundaries between state and non-state action, but that is a very different matter from the sort of absolutist claim James claims as defining “neoliberal” belief. You only have debates about the proper boundaries between realms of social action if no particular realm is universal–whether as underlying reality, as created order or as ideal order. A nice example of such thinking, with included critique of overweening confidence in state action, is provided in a recent blog post by economist Scott Sumner:

Intellectuals on the left go through the following thought process. First they observe a “problem.” Then they declare a “market failure.” Then they consider what sort of government policy could remedy the problem. What they often overlook is that the problem is usually the side effect of other government policies. That doesn’t mean the free market solution is always best; there may be cases where those other government policies are needed, and hence further regulation is required to overcome the side effects. The real problem is that it’s much easier to dream up straightforward government policies to remedy a situation, than to envision how a problem is the side effect of other regulations. Or what further side effects will result from your proposed solution. That biases pundits toward supporting far too much government involvement in the economy.

“Neoliberals” do typically believe that spontaneous orders do and can exist. But markets are only a form of spontaneous order, and only in a rather specific sense. Nature, red in tooth and claw, is a spontaneous order, but it is not a market. (Though it is a realm of budget constraints and trade-offs, as can be seen in the distribution of ecological niches: but, then, the state is also such a realm, but is not a spontaneous order.)

So, in the above purported definition of “neoliberal belief”, we have a complete misunderstanding, both of notions of spontaneous order and of the ideas and motivations of “neoliberals”. And this from someone who has been teaching a graduate seminar on that very subject. (I wonder if someone with experience and knowledge from within the reform movement has ever been invited to address said seminar? Indeed, how often such folk are ever invited to talk to any such seminars or courses?)

There is no great mystery as to why academics write such nonsense about so-called “neoliberalism”: it is due to ignorance and irritation.

The irritation is straightforward: there was a comfortable sense in “progressive” circles that they knew where history was going. And then it wasn’t.

At the grand history level, there was an expectation that socialism was the direction of history. The dawning realisation from the 1950s onward that actually existing socialism was not the transformative manifestation of the logic of history that it had been assumed to be led to the rise of postmodernism, as philosopher Stephen Hicks explains in his lucid and revelatory Explaining Postmodernism: Skepticism and Socialism from Rousseau to Foucault (2004).

At a more mundane level, there was (and is) an expectation that (moral and social) “progress” means an ever larger role for the state. This notion that state=civilisation (and the more expansive the role and power of the state, the more civilised) is one of the oldest tropes in human history. It is often a highly misleading and self-serving one, as James C Scott explains in his lucid and revelatory The Art of Not Being Governed: An Anarchist History of Upland Southeast Asia (2010).

The state=civilisation trope does have the advantage of simplicity. Bigger state, good; smaller state, bad is an easy principle to rally one’s sense of moral and cognitive worthiness around. (It would also make North Korea the most civilised society on the planet–since the state has entirely taken over its society: but perhaps the bigger=better is only operating as an indicator at the margin.)

Then along came the “neoliberal” policy shift of corporatisation, privatisation and de-regulation. Suddenly, history got de-railed. This was not how things were supposed to be going. So, there is an underlying hostility to “neoliberalism” in much academic writing: more, there is a presumption that such hostility is the morally and cognitively correct posture to assume towards this “neoliberal” wrong turn.

Hence David Harvey can write (pdf) (via):

The corporatization, commodification, and privatization of hitherto public assets has been a signal feature of the neoliberal project. Its primary aim has been to open up new fields of capital accumulation in domains hitherto regarded off-limits to the calculus of profitability.

This is a wildly ahistorical reading, since much of what was done in privatisation was to put back into the commercial order things which the state had previously removed from that order. But the very ahistorical reading displays the underlying belief in history as having a “proper” direction and the expansion of the state as progress.

The resulting presumptive hostility towards the “neoliberal” de-railing of history’s proper course is nicely expressed in the aforementioned post by philosopher Robin James:

Generally, people use the term “neoliberal” to denote things they don’t like about our historical situation. It’s a kind of shorthand for “contemporary society” with a “which sucks” inflection. This shorthand sense is where all the looseness and imprecision comes in. As Hegel said, “now” can be narrowly particular because it can mean any particular point in time. So, “neoliberal” gets used to mean “now,” which means something specific because it can refer to any specific thing.

But things suck for a reason. This reason lies in the deeper sense of “neoliberalism,” …

A clear message of much academic writing on “neoliberalism” is that if you don’t understand that neoliberalism is bad, you are clearly morally and cognitively deficient: not exactly conducive to engaging with, and so understanding, the phenomenon being studied. Especially if signalling one’s distance therefrom is a key part of the exercise.

If the state=civilisation, that implies that the only social interactions truly worth having are state mediated or framed. There is a fairly clear underlying notion within this literature that commerce is not civilising (or, at least, is not morally uplifting), which connects this line of thinking into a trope that goes back at least 2,500 years–that commerce is morally and intellectually vulgar.

Leslie Kurke’s delightful Coins, Bodies, Games and Gold: The Politics of Meaning in Archaic Greece examines the tension in Archaic and Classical Greece between the gift-exchange essentialist order of aristocracy and the functionalist, coins-and-commercial political order of the demotic polis. There is considerable affinity between modern progressivism and the aristocratic disdain for vulgar, demotic commerce expressed in Plato’s Republic and in some of Aristotle‘s writings. But, then, progressivist academics in particular feel themselves to be a moral aristocracy, even though such a self-identity is typically cast as simply moral concern. Nevertheless, the moral elite pretensions are clearly displayed in the reflexive contempt for those with views outside the “progressive” magic circle.

This reflexive contempt performs a moral-status-and-opinion-conformity signalling function–it expresses one’s cultural placement. Hence it applies to non-”progressive” Westerners (against whom the status games are played), but rarely to non-Westerners, no matter how wildly their views diverge from progressivist norms, as the non-Westerners are much more likely to function as moral mascots–people for whom moral concern is signalled. Thus, contrast how the views of US evangelicals are treated as distinct from the views of more emphatic Muslims. The former are people against whom status is signalled, the latter people for whom moral concern is signalled, even though the actual views of the latter are likely to diverge far more from progressive norms on matters such as gender and sexuality than do the former. (To put it another way, the US evangelicals are to be culturally defeated, the Muslims culturally “respected”: though such “respect” often glosses over winners and losers from various conceptions of what precisely is to be “respected”.)

Westerners involved seriously in commerce are also a status-signalling target. There is little doubt that the irritation with, and antipathy to, “neoliberalism” is deeply connected to longstanding antipathy to commerce and to those who make their living by it. Hence, that “neoliberalism” expands the ambit of commerce is one of its defining sins.

An emphatic statement of such irritation comes when folk burble on about “the ruling class”. Few terms are so flattening of social and historical complexity as ruling class. To use an evocative example, that NSDAP funding was disproportionately from those traumatised by revolutionary socialism tells us much. Describing the Nazi Party as a “tool of the ruling class” tells us nothing: indeed, less than nothing, for it flattens and obscures reality rather than illuminating it.

In its manichean over-reaching, ruling class washes away the complexities of history and politics, and the rise of the bargaining state (known in its current dominant manifestation as liberal democracy)–where lines between private and public, between market and command, are part of said social bargaining. But, of course, using the phrase ruling class casts the user in a heroic role as fighter against oppression: it performs a status-signalling role. Such a pose by tenured members of the safest scholar class in history is beyond pathetic, but it fits in nicely with the moral aristocracy self-conceit.

The irritation is easy to understand, even if it is much more grounded in a sense of superior status than its participants are likely to openly admit. Especially as it is a reworking in modern guise of very old patterns of thought, so rather less worthy of pretensions of being “cutting edge” than is generally felt to be the case. (This is a recurring pattern: e.g. the post-modernist plaything indeterminacy of meaning was a hot topic for Socrates and the symposium boys, as well as medieval theologians.)

What is less forgivable than said irritation, though it is also thoroughly understandable, is the deep ignorance the aforementioned nonsense is based on. Such academic commentary remarkably often does not understand the role of economic models or of the rationality postulate in economics. (Hint, it’s a postulate.) Vernon Smith’s 2002 Nobel memorial lecture (written version here [pdf]) is an accessible presentation of the role of rationality postulate(s) in mainstream economics. It is also well worth noting that said postulate(s) are typically applied where they work best–to aggregate behaviour.

The role and nature of economic models is similarly misunderstood. Models are, indeed, “caricatures of nature” when applied to the biological realm, and caricatures of society (and people) when applied to the social realm. But, as this discussion of the use of game theory in biology demonstrates, they can be highly useful caricatures for aiding our understanding. Yes, they can be misused or overdone; yes, there are issues with formalism in mainstream economics. But the modelling map is not the cognitive territory, and the endless variants in models used within economics, the arguments and permutations over which “stylised facts” are most appropriate where, are part of an ongoing search for useful models, not for purported direct descriptions of reality in all its complexity. One cannot take a model, or even modelling, and say “neoliberals believe that …”.

So when Robin James writes:

if each individual is modeled as an algorithm (which is not too far-fetched a claim: big data and government model individual users’ behaviour in this way)

there is a very unfortunate cognitive shift. An individual in a model is not an actual individual and their entire behaviour is not being modelled. After all, online retailers use algorithms to connect your past purchases to what you are likely to be interested in purchasing from among their products. They use algorithms because they work, but only in a fairly narrow, statistical-tendency sort of way. That James ends up in a wave form metaphor for social harmony (so quite fundamentally misconstruing what “neoliberalism” is about, as we will see) just illustrates how unhelpful said cognitive shift is for understanding. (Lord help us, she is apparently writing a book on the “neoliberal” idea of social harmony.)

Underpinning this are more than a few hints of CP Snow’s “two cultures” (pdf). If your idea of social enquiry is based on the gnostic sneering of Marxism, then the utilitarian model play of economics will be mysterious. So mysterious that it will be recast in terms that make sense to you, while missing the phenomenon one is allegedly describing. (Marxism is gnostic in that it presumes knowledge of where history is going, and the “real” “underlying” forces driving said history; sneering in that it is full of terms – bourgeois, petty-bourgeois, lumpenproletariat – which are ways of sneering at entire categories of people under the pretense of analysis.)

This confusion about modes of social enquiry is made all the more problematic as, far too often, academics do not engage directly with “neoliberal” writings, but rely on hostile summaries of the same. Summaries which are often highly unreliable. Robin James cites Foucault on Gary Becker and “Chicago School” economics, for example: citing Foucault on history, or the history of ideas, is never a good sign. (Rictor Norton’s The Myth of the Modern Homosexual provides a witty take-down of Foucault on queer history.)

A particular difficulty is that the ideas behind “neoliberalism” flow from the Sceptical Enlightenment. French intellectuals generally do not “get” the Sceptical Enlightenment. The longstanding worship of French and German intellectualism among many Anglosphere academics has meant that many of them don’t “get it” either. One of the fundamental ways in which they don’t get it is that, steeped in Radical Enlightenment notions of a perfectible society, they presume a grand system when there is something much more like a conjunction of values, principles and ideas held to be true, or at least useful. If you like, a working acceptance of ambivalence. “Neoliberals” very much embrace Kant’s dictum that, from the crooked timber of humanity, nothing straight can be built. The classical liberal tradition, which is a key source for “neoliberalism”, famously views the state as a necessary evil: in what universe is that the basis for any strong expectation of social harmony?

It is true that some more “Austrian” folk can wax lyrical about the harmonising nature of unfettered markets. Nevertheless, as a matter of practical policy making, a rather more utilitarian presumption in favour of maximising gains from trade (and the employment gains and revenue flows therefrom) is a much more powerful underlying principle in the “neoliberal” policy turn.

Seeking dynamism
What “neoliberals” are typically about is recovering or achieving economic dynamism. This is very much not a social harmony goal, except in the sense of muting social conflict, as sharing out a growing pie is much easier than struggles over a static (or, worse, a shrinking) pie. A nice statement of the desire for (and conflict over) dynamism is provided by Virginia Postrel in her The Future And Its Enemies: The Growing Conflict Over Creativity, Enterprise, and Progress. (Note that all this leaves plenty of room for self-interested political manoeuvring among interest groups, whatever the policy framework.)

The larger policy context for the “neoliberal turn” is relatively straightforward. There was a major expansion of the welfare state across the Western world in the postwar period, particularly the 1960s. The postwar welfare state represented the imperial Western state colonising its own societies, instead of other people’s. Of course, unlike the territorially-expansive imperial state, the policy subjects of the domestically-expansive welfare state get to vote for the politicians in charge, which makes the operation somewhat different. Though not entirely different, as this would-be small businesswoman’s interaction with the domestically-colonising Greek state illustrates:

But as happens so often in Greece, the bureaucrats had other plans. In a country where you are viewed favorably when you spend money but are considered a criminal when you make it, starting a business is a nightmare. The demands are outrageous, and include a requirement that the business pay taxes in advance equal to 50 percent of estimated profit in the first two years. And the taxes are collected even if the business suffers a loss.

I needed only 20 square meters for my baking business, but inspectors told me they could not give me permission for less than 150 square meters. I was obliged to have a separate toilet for customers even though I would not have any customers visit. The fire department wanted a security exit in the same place where the municipality demanded a wall be built.

I, like thousands of others trying to start businesses, learned that I would be at the mercy of public employees who interpreted the laws so they could profit themselves.

And so in the winter of 2013, my business was finished before it had a chance to take off.

If the Greek state spent less effort in frustrating potential gains from trade amongst its citizens (i.e. “capitalist acts between consenting adults”), it would be less fiscally stressed. But then said state would be far less useful for those who benefit from its colonising of its own society. (One of the signs of how vulgar commerce is viewed to be is that folk who are very strong on freedom to engage in sexual acts between consenting adults so often take a very different view of commercial acts between consenting adults.)

Outside the Western world, the development state followed decolonisation. Except that said development states often replicated and expanded imperial colonisations of such societies, but this time by local elites colonising their own societies, with democratic constraints being less common. Again, a longstanding pattern. Thus, the revolt of the Spanish colonies in the 1820s seems to have been largely about constitutional shifts in Spain threatening to undermine local elite income extraction. (Not that income extraction via the maintenance of slavery and the wish to expropriate Amerindian land were entirely absent from the American Revolution, but the rebellious British colonies’ political institutions were already much broader-based than those of their metropole, as they were mass-settler colonies rather than narrow extraction ones.)

The celebration of the expanding welfare state (and the development state) generated new forms of the aforementioned state=civilisation trope. Where, as ever, resistance to the imperial state’s pretensions show a lack of moral understanding, show one to be lower on the ladder of civilisation, an enemy of progress and moral enlightenment.

There is more than a little sense that the small, pervasive decencies and comforts of bourgeois society lack the moral grandeur for such great minds. (Which, by the way, is by no means to cast any blemish on the wish to do better, including in a wider moral-order sense: much of the dynamism of Western society comes from precisely its openness to that.)

This frustrated moral grandeur is particularly clear when the deeper antipathy over conceptions of individual choices and preferences comes to the surface. If you don’t like the choices people actually make, it is a congenial move to attack the entire notion of such choices as legitimate, to claim that any ideology which celebrates them somehow misses out on deeper truths. Particularly if some profound social harmony is one’s goal, as the sheer messiness of individual choices and preferences does not sit well with such an aim. (De-legitimising the actual choices and revealed nature of actual people in favour of some conception of what they ought to choose, and what they ought to be like, is where the Radical Enlightenment takes its turn to tyranny.)

This sort of discounting-choices move underlies philosopher Leigh Johnson’s comment, in a post praising and responding to Robin James’ piece, that:

Of course, the great irony evident in neoliberals’ ubiquitous efforts at data-collection– their constant, relentless and mostly covert encroachment into our “private” lives– is that such efforts are justified on the basis of safeguarding our individual freedom to engage in the market according to our own interests, as those interests are freely determined by us.

Never mind that what an uncritical surrender to algorithmic analyses actually does– little by little, Google search by Google search, Facebook like by Facebook like, Amazon purchase by Amazon purchase– is eventually come to determine not only our interests, but also our “freely, intentionally rational” selections among them.

Never mind the conflation of lots of different things into one great “neoliberal” wickedness (hence her recurring use of the drone metaphor). The obvious truth that if one changes the context people choose differently is being used to discount choices folk actually make. This in societies of unprecedented freedom and prosperity.

Back in policy reality, starting in 1973, productivity growth, which had helped pay for the postwar expansion of the welfare state, stagnated (for reasons we still do not entirely understand). This put the expanding welfare (and development) states under rising fiscal, and broader economic, pressure. This gave economic efficiency a persistent policy premium. The “neoliberal turn” in policy is a response to this pressure and policy premium. (Which, of course, leaves plenty of room for debate about specific measures.)

Been here, done that
We (the Western world) have been here before. Responses to fiscal stress by states that involved restructuring of state institutions (corporatisation), sale, creation or re-allocation of property rights (privatisation) and reduction of transaction costs (deregulation) extend back into history, into the medieval period. For instance, an appropriate modern term for the medieval borough or chartered town would be enterprise zone, just as the modern corporation has medieval origins. Henry II’s “creation” of common law was a classic transaction-cost-reducing regulatory simplification (while also extending the reach of royal authority).

A more recent historical precedent would be postwar Germany, where Ludwig Erhard’s “bonfire of controls” was the original “big bang” deregulation. This was in the context of Ordo-liberalism developing a concept of a social market economy: in many ways a direct forerunner of what later became known as “neoliberalism”. Hardly surprising, as postwar Germany confronted the legacy of Nazism, massive wartime destruction, and Western occupation. So how to rebuild an economy to sustain a welfare state was a central policy question, leading to the postwar West German “economic miracle”.

In the 1970s and 1980s, there developed, in Andrew Norton‘s nice description, a policy coalition to tackle these economic stagnation and fiscal stress problems in the wider Western world. The overlapping aim within the policy coalition was to create a sustainable welfare state. While Thatcher and Reagan get a lot of attention, in fact centre-left governments were prominent in the reform process–such as the Hawke-Keating Government in Australia and the Lange-Douglas Government in New Zealand. (I touch on this in my essay on postmodern conservatism.)

Deregulation actually started under the Carter Administration with Senator Edward Kennedy’s airline deregulation. The first “economic rationalist” (aka “neoliberal”) reforms in Australia were tariff reductions under the Whitlam Government (1972-5). If the policy goal with the broadest support was a sustainable welfare state, then it was natural for centre-left governments to be prominent in the “neoliberal turn”. It is by the same principle that Scandinavian states rank high for economic freedom: a large welfare state requires a high level of economic efficiency, and an expanding welfare state requires a high level of revenue, and thus economic, growth.

At the core of the policy turn are boundary issues for state activity arising from knowledge limits and incentive problems. More gains from trade means more tax revenue and employment prospects. A better targeted welfare state is a more sustainable welfare state. “Neoliberalism” (or “economic rationalism”) was mostly the application of fairly mainstream economic analysis to public policy problems, shorn of the presumption that government action was inherently virtuous or otherwise superior.

For the expansion of the welfare (and development) state was done as the state does most things–clumsily. (A point that also applies to the reform processes.) The clumsiness of state action is discussed in Peter Schuck’s recent book Why Government Fails So Often: And How It Can Do Better, nicely reviewed by economist David Henderson (link here). The wider issues of the limits to state action are brilliantly analysed by James C Scott in his splendid Seeing Like A State: How Certain Schemes to Improve the Human Condition Have Failed (1999). (A September 2010 discussion between Scott and various liberal or libertarian economists and political scientists is here.)

That the anti-”neoliberalism” literature presents us with Western intellectuals and academics more hostile to expansion of private commerce, markets and private property than the Central Committee of the Chinese Communist Party is somewhat striking, but said Central Committee has to struggle with genuine policy problems. (Just as dependency theorist Fernando Cardoso engaged in economic liberalisation and privatisation as President Cardoso.) All the academics hostilely pontificating on “neoliberalism” have to worry about is their own glowing moral soundness. Where the failure of command economies also goes down the memory hole, creating no problem they have to wrestle with. These are people who are so tied up in a proper conception of History that they cannot see history right in front of them. (And folk who habitually analyse the motives of others in the most hostile and dismissive terms will, of course, be outraged at their own obvious moral nobility being treated somewhat sceptically.)

The loss of harmony
Which is where we come to the central sin of “neoliberalism”. A basic principle of the direction of policy analysis labelled “neoliberalism” is that there are things that the state does relatively poorly and so should not do. This means abandoning belief in an omni-competent state. Which means–in the absence of any rival implementing mechanism to the state–the end of belief in the perfectible society; the end of belief in a society of perfect social harmony; of a society which completely embodies particular virtues; a society without alienation. What was, to its practitioners, and in the aims that the policy coalition could coalesce around, simply a way of making the economy work better, of having better targeted and affordable welfare policy, of better allowing people to go about their lives, was a massive assault on every Radical Enlightenment ambition.

(As an aside, radical Left politics and radical Islam both embody the appeal of this notion of virtuous harmony. Hence the various cross-overs between the two.)

The term “contradictions” being regularly applied to clashing social interests nicely expresses the underlying assumption that the proper society has harmony, rather than seeing conflicting interests as both normal and inevitable.

We are dealing with very different responses to the paradox of politics (or the paradox of rulership): that the state is the most dangerous social predator but we need it to protect us against other predators (in order to permit a certain level of social amenity). A paradox sharpened by the constant temptation to use the state for one’s own predatory schemes–from rent-seeking corporations blocking competition or seeking other special benefits to projects of social outcasting and exclusion. The Sceptical Enlightenment accepts that the paradox can never be resolved, just managed more or less well. The Radical Enlightenment lives in the false hope of final solution. The tension between the two has created much tragedy, but has also helped fuel the Emancipation Sequence. (Though, even there, the practical Sceptical Enlightenment goal of inclusion and normalisation has constantly triumphed over the grander Radical Enlightenment goal of subversion and transformation.)

Trying to put the Radical Enlightenment project into effect via Leninism created tyranny, (state) slavery, industrial serfdom, hereditary elites and even hereditary God-Kings. (Kim III succeeds his father, an Eternal Secretary-General who succeeded his father, an Eternal President; what else would you call Kim II and Kim III other than God-Kings?) There is also plenty of what James C. Scott calls “cosmological bluster” and Xavier Marquez’s hyperbolic loyalty signalling. This strikingly atavistic array extends to fetishising the mummified corpses of leaders. Said atavism is a result of Utopia in Power‘s vanguardism, which generates both an enormous cognitive and power gulf between those to be harmonised-and-equalised and the elite which does the harmonising and “equalising”–those who will prune and straighten the crooked timber of humanity–magnifying dramatically the project’s inherent reliance on, and celebration of, (state) command-and-control. Hence the project’s reversions to past patterns of command-and-control and grand cognitive posturing.

Those who are most under the delusion that they represent some escape from the constraints of history are those who most end up in thrall to its recurring patterns. (Something else the radical Left and Islam have in common.) For they fail to interrogate the patterns of history with appropriate humility, confident in the notion that they have the key to break out of the same, so remain caught in its patterns. Such as recycling the millennia-old tropes that state equals civilisation and that commerce is the morally polluting activity of lesser minds. Or the “end of history” Soviet regime going through the entire ibn Khaldun cycle of rise, decay and fall in a single life-time. Or the aforementioned consequences of the Leninist dismissal of the “bourgeois” Sceptical Enlightenment wrestlings with how to restrain power; wrestlings which presume that ultimate social harmony is not ours to achieve, being made, as we are, of the crooked timber of humanity, and that a free (and mass-prosperity) society can only be based on accepting people as their nature is, not as you would like them to be.

Which itself can lead into the “eternal now” of conservatism–taking current circumstances as simply “reflecting” human nature rather than contingent historical processes, not all of whose constructing victories are morally worthy. Hence the productive tension between Sceptical and Radical Enlightenment visions–which can run within as well as between people–has been important in, for example, the Emancipation Sequence.

Ahistorical pomposity
One sign that the “progressive” academic critics of “neoliberalism” are so wrapped up in the proper conception of History that they cannot see the history in front of them is the ahistorical pomposity, the redolent rhetorical overdrive, of much of the language used.

Consider, for example, this statement from Stuart Hall and others (pdf):

With the banking crisis and the credit crunch of 2007-8, and their economic repercussions around the globe, the system of neoliberalism, or global free-market capitalism, that has come to dominate the world in the three decades since 1980, has imploded.

No, there has been an economic downturn and financial crisis. These happen. The latter happen recurrently; indeed, for centuries now. Hence the recent examination of the same, This Time Is Different: Eight Centuries of Financial Folly.

They go on to describe them as events:

whose catastrophic consequences are still unfolding.

Catastrophic compared to what, the Great Depression? The market order survived the Great Depression; it will survive this. (Though hopefully not without some monetary and financial policy changes.) That your humble author lives in a country (Australia) which has been a strong adopter of “neoliberal” policies but avoided both the Global Financial Crisis and the Great Recession (indeed, is the country where the Great Moderation has not ended) makes the not-remotely-apocalyptic reality a little more obvious, but only a little more.

They also talk of:

the redistribution from poor to rich.

Actually, the poor have not got poorer; what has happened is that top income shares have expanded faster–not the same thing. It is useful to keep the perspective that mass impoverishment is the historical norm; it is mass prosperity, and the hope of mass prosperity, which is historically extraordinary.

A theme in the anti-”neoliberalism” literature is to be highly critical of the expansion of the financial sector. Yes, if you subsidise financial activity through IMF “welfare for Wall St” and “too big to fail” (what economists call injections of moral hazard), that will inflate your financial sector and the returns thereto while making subsequent financial crises more intense. Yet many a “neoliberal” has been a critic of such policies–Milton Friedman was a noted critic of the IMF on precisely those grounds.

In Australia, financial liberalisation has incorporated balancing prudential regulation: in the US, it did not, for political reasons which reflect recurring patterns in US politics that go back over a century (see Fragile by Design: the Political Origins of Banking Crises and Scarce Credit). A political failure not at all inherent in “neoliberalism”.

It could be objected that, in quoting from Hall et al, I am citing a manifesto as scholarship. I am merely following Robin James’s original citation as such; and its rhetoric is intended to be persuasive to its audience, so a reasonable indicator of a world view or mentality.

Let us consider a more directly scholarly piece by David Harvey, also cited by Robin James, which refers to (pdf):

State interventions in markets (once created)

One way to elevate the political and the state is to treat markets as inherently “created” by states–as if stateless peoples never had markets and black markets never existed. And yes, of course markets usually rely on law, mediation and other state-supplied services (and black markets have all sorts of quality and violence problems precisely because they are blocked from using the same)–but then we are back to the issue being the proper boundaries between state and non-state action.

Harvey at least understands the tension between falling revenues and rising social expenditure, even if he has no serious grasp of why revenues fell. He tells us:

The restoration of fiscal discipline was essential. This empowered those financial institutions that controlled the lines of credit to the state. In 1975 they refused to roll-over the debt of New York City and forced the city close to the edge of bankruptcy. A powerful cabal of bankers joined together with state power to discipline the city.

Yes, the people the money was borrowed from stuck to the novel idea that things have to be paid for. (Especially if you ever want to borrow again at non-punitive terms.) But we are dealing with “neoliberalism”, so something evil and malign:

This amounted to a coup by the financial institutions against the democratically elected government of New York City

Which, of course, has to be paralleled to Pinochet’s coup against the Allende Government. Because we are dealing with a manichean world view, and it is all one vast evil really.  And a failed one, as Harvey cites (quite inaccurate) global economic growth figures:

Aggregate growth rates stood at 3.5% or so in the 1960s and even during the troubled 1970s fell to only 2.4%. But the subsequent global growth rates of 1.4% and 1.1% for the 1980s and 1990s (and a rate that barely touches 1% since 2000) indicate that neoliberalism has broadly failed to stimulate worldwide growth (World Commision, 2004).

They are so evil, they persist in failure. How wicked and stupid policy makers must be! Actually, the answer we are given is that they have not failed, because it is all about making the rich richer. (As if all consequences are intended.) Which makes voters pretty stupid, then. The answer is rather simpler–while global economic growth has not recovered to 1960s levels, global per capita economic growth has been trending up again. It is also fairly normal in the anti-”neoliberalism” literature that the massive global exits from poverty are either ignored or glossed over.

On the matter of intention, the anti-”neoliberalism” literature is full of the post hoc ergo propter hoc fallacy, as if consequences were always intended. But such knowing intention is, of course, another sign of “neoliberalism’s” manichean power. This is history without happenstance: thus surges in corporate profits cannot be an unintended consequence of the interaction between inflation-targeting and positive productivity shocks. There is no policy discovery process, no trial and error, because how things work is, of course, already known.

Harvey sees the “neoliberal” turn as being:

to restore or, as in China and Russia, to construct, an overwhelming class power.

What class power is relevant to the Chinese policy shift? Really, you are the elite in charge of a police state command economy which utterly dominates society and its resources in a way no market economy can come close to replicating, and somehow economic liberalisation is going to increase your class power? On which planet is someone living when swapping the immense class power of a totalitarian state for the dynamic instabilities of free commerce is seen as “creating class power”? But to say that the policy turn might be about escaping from mass impoverishment would imply the “neoliberal” turn had a positive point, and we can’t have that, can we?

The comparative (and later absolute) failure of command economies in various “natural experiments” has no serious resonance in the anti “neoliberal” literature. Yet it was deeply influential in encouraging “neoliberal” policy advocacy.

Harvey does at least notice that “neoliberalism” accepts the state as having a role. One can only understand the “neoliberal” turn in policy if one grasps that it is really about what the state should, and should not, do because it is about what the state can, and cannot, do effectively. If you cannot enter seriously into the intentions, motivations, ideas and contexts of historical actors, you cannot produce anything beyond congenial rhetoric parading (falsely) as substantive analysis.

Rampant conflation
A common feature of this literature is that neoliberalism is conflated with globalisation. More openness to trade obviously fosters globalisation, but contemporary globalisation itself is mainly driven by falling communication costs (although falling transport costs also matter), just as C19th globalisation was mainly driven by falling transport costs (although falling communication costs also mattered). And there was no globalisation worth the name before the dramatic drop in said costs.

There are also often somewhat sneering references to “capital accumulation”–also known as mass prosperity, since mass prosperity is only possible with capital accumulation: indeed, the level of capital per person in a society is a basic indicator of its level of prosperity.

Watching the “neoliberal” Hawke-Keating Government (1983-96) labour mightily to create a sustainable welfare state makes the “threatened ruling class” spouting seem like the self-indulgent crap it is. And exactly how “threatened” was the Chinese “ruling class”? Once again, we are back in the realm of manichean analysis–never explain by the reality that things have to be paid for and economic stagnation is politically problematic when malign conspiracy can be invoked.

So much rests on how things are framed. So Harvey frames it thus:

In whose particular interests is it that the state takes a neoliberal stance and in what ways have these particular interests used neoliberalism to benefit themselves rather than, as is claimed, everyone, everywhere?

Serious, broad-based policy making, struggling with genuine issues, is excluded so that things can be framed in (malign) self interest.

Spontaneous, at the margin, revolt by developing world peasants, workers and students for property rights and commercial freedom also does not suit such narratives, where ordinary folk figure only as victims or dupes. That there might be good reasons why “neoliberal” governments were elected (and, worse, re-elected) passes them by, such good reasons being excluded by their framing. By excluding even the possibility of the “neoliberal” turn being a reasonable policy response, one is left with malign self-interest (even malign conspiracy) to “explain” it.

Just as so much of said “analysis” conflates disparate phenomena according to what reflects their own hopes, fears and frustrations, not what is actually in any analytically useful sense a common phenomenon.

This conflating operates at various levels. For harmonising is also homogenising; abandon the notion of final social harmony and the play of diverse identities becomes enchanting and natural rather than inconvenient and threatening.

Again and again, this literature presents us with ahistorical grandiosity. The term “capitalism” helps promote such overweening systematisation. The term becomes so easily an ahistorical abstraction. One that is rhetorically powerful, yet analytically fraught. Retreat to terms such as “state capitalism” shows how analytically empty the term is. It leads to treating the state as some epiphenomenon (typically of class), which it is not. One cannot understand the Euro project, for example, without understanding that it is a deeply political, even (super)state building, project.

So much of the anti-”neoliberal” literature is deep crap: intellectual sophistication is well in evidence even while rhetoric overwhelms and blocks understanding. It is something I find distressing, because I enjoy reading scholarly articles, made so delightfully accessible by the internet. To have such persistent nonsense written about something I am personally familiar with generates worries about the scholarship I appreciate.

Though there is scholarship that can help us make sense of it. A paper by Dan Kahan and Donald Braman, Cultural Cognition and Public Policy, examines how people interpret evidence on the basis of their “cultural placement”. The paper alludes to the intensifying effects of cognitive conformity–Cass Sunstein’s Why Societies Need Dissent is an excellent presentation of the social science evidence on the powerful, and deleterious, effects of cognitive conformity. So much of the academic literature on “neoliberalism” is a case study of precisely that.

In the post praising Robin James’s original post, philosopher Leigh Johnson writes:

And, not to put too fine a point on it, but the “Invisible Hand” drone is a deadly effective weapon that basically works like this: defund or deregulate, make sure things don’t work, wait for people to get angry, then privatize.

Back to the manichean conception. (Manichean both in the sense of evil but also in the sense that it is a grand conjoined corporate-finance-Middle Eastern policy-managerialist evil.) The notion that politics is about contending interests just disappears behind some notion of evil, coherent manipulation of the commercial and political realm because history is not going the way it is supposed to. What is Johnson’s comment that:

the neoliberal imperative, shouted into the panopticon of our modern world and echoed off every wall by banks, political parties, corporations, families, nation-states, social groups and social media

but a wail of cognitive pain about history going wrong? Absent in this manichean wail is the sense that there might have been any legitimate difficulty to be wrestled with, that the welfare and development states suffering fiscal and economic stress might have produced serious policy responses.

The “neoliberal” phenomenon–being about a broad trend in public policy–sits at the intersection of the classical liberal tradition (including those elements that reach into the social democratic tradition), developments in economics and practical politics. People who do not understand these three realms of action and thought are not going to usefully write about the intersection between them.

What is so often portrayed as merely a fight over who controls the lever of the state, and which way it is pulled, is something much more fundamental than that. It is and was about wrestling with what is practicable to do with that lever, and at what cost to whom. The much more fundamental nature of what has been going on in the “neoliberal turn” is not faced because it is too confronting. Remaining within their dogmatic slumbers is much more comforting.

Hegemonic comfort
Hence we get the products of humanities and social science academics who rarely, if ever, meet, still less directly engage with, those involved in the above wrestling. SF author Orson Scott Card observed in a podcast interview with Glenn Reynolds (aka Instapundit) and Dr. Helen Smith (DrHelen) in November 2006 that military officers are generally more intellectually open and flexible than academics, as they have to deal with people with a wide range of views; academics just vote against tenure for those they disagree with. It is much more congenial that way: particularly if you wrap your sense of personal identity up in a common notion of being morally and cognitively “sound”. So narrowness of range of views and perspectives becomes a feature, not a bug. It is much more congenial to be part of the progressivist hegemony of academe, rather than a trouble-making outlier. Particularly if you can parade as a morally heroic “exposer” of a malign, society-dominating, “neoliberal” hegemony.

(Pausing here, obviously I have lots of disagreements with Mr Card; that does not mean he doesn’t have a point. And that I have to engage in such hand-waving just illustrates how entrenched judging-people-by-their-opinions is. Note also that nothing I have written in this post implies that the actual policies chosen were optimal, could not be reasonably disagreed with, etc.)

I would say that said academics need to go and do the reading for themselves, but the barrier of their assumptions may well be invincible. The barrier of being seen to be “sound”, part of the comfortable progressivist hegemony, even more so.

It is not about reality, it is about feeling intellectually and morally comfortable. Or trying to recover some sense of such comfort by creating a set of intellectual fictions which both deny there is any serious point to the “neoliberal” policy turn and allow them to believe their ideological comforts can be unproblematically resurrected.

Just don’t mistake that search for cognitive comfort, and what it produces, for anything resembling serious scholarship. It utterly lacks the genuine engagement with its subject matter that such scholarship requires.


[Cross-posted from Thinking Out Aloud.]