28 September, 2007

Exhaustive enumeration

Recently Aaron Haspel linked to a review of Thomas Pynchon's Against the Day from the Virginia Quarterly Review. I wrote a little piece on Pynchon here—but never got around to reading Against the Day. The reviewer, William Logan, quotes this passage:
The rooms seemed to run on for blocks, stuffed with automata human and animal assembled and in pieces, disappearing-cabinets, tables that would float in midair and other trick furniture, Davenport figures with dark-rimmed eyes in sinister faces, lengths of perfect black velvet and multicolored silk brocade a-riot with Oriental scenes, mirrors, crystals, pneumatic pumps and valves, electromagnets, speaking-trumpets, bottles that never ran empty and candles that lighted themselves, player pianos, Zoetropic projectors . . .
Logan comments, 'The list may be the manifest sign of research the novelist can’t bear to throw away', adding that 'the matter of matter is almost always farcical in accumulation, from Dickens’s dust heaps in Our Mutual Friend to Imelda Marcos’s shoes'. He continues:
The meaning of the title, should Against the Day mean anything, lies in shoring up the present against those ruins of the future—and to that end, the list stockpiles odds and ends, like boxes of Civil Defense crackers, as a specific against destruction.
I'm not going to write a history of lists. (The canonical figures of literary listmaking are Rabelais and Joyce; many accounts see a precursor in the catalogue of ships from Iliad 2. I think that textual listmaking also has two important extra-literary sources—religious ritual and anatomical catalogues.) But this list, with its last two members, and with its mention of 'perfect black velvet', tips its hat in subtle reference to another striking literary list—this, from Vonnegut's 1952 Player Piano:
In the early light, the town seemed an enormous jewel box, lined with the black and gray velvet of fly-ash, and filled with millions of twinkling treasures: bits of air conditioners, amplidynes, analyzers, arc welders, batteries, belts, billers, bookkeeping machines, bottlers, canners, capacitors, circuitbreakers, clocks, coin boxes, calorimeters, colorimeters, computers, condensers, conduits, controls, converters, conveyers, cryostats, counters, cutouts, densitometers, detectors, dust precipitators, dishwashers, dispensers, dynamometers, dynamotors, electrodes, electronic tubes, exciters, fans, filers, filters, frequency changers, furnaces, fuses, gages, garbage disposers, gears, generators, heat exchangers, insulators, lamps, loudspeakers, magnets, mass spectrometers, motor generators, motors, noisemeters, oscillographs, panelboards, personnel machines, photoelectric cells, potentiometers, pushbuttons, radios, radiation detectors, reactors, recorders, rectifiers, reducers, regulators, relays, remote controls, resistors, rheostats, selsyns, servos, solenoids, sorters, spectrophotometers, spectroscopes, springs, starters, strain-gages, switchboards, switches, tape recorders, tachometers, telemeters, television sets, television cameras, testers, thermocouples, thermostats, timers, toasters, torquemeters, traffic controls, transistors, transducers, transformers, turbines, vacuum cleaners, vacuum gages, vacuum tubes, venders, vibration meters, viscosimeters, water heaters, wheels, X-ray spectrogoniometers, zymometers. . .
Player Piano is not one of Vonnegut's great books—it lacks the perfect irony and beauty of Slapstick, and the linguistic invention of Cat's Cradle. It is clearly an early work. But this passage is a masterstroke. It comes toward the end of the novel: the protagonist is admiring the wreckage left by a group of Luddites (the 'Ghost Shirts') in a dystopian techno-future. The list is thus not so much 'shoring up the present against those ruins of the future', but rather shoring up the past against the ruins of the present.

There are two brilliant things about Vonnegut's list: firstly, its unremitting alphabetization (only 'radios, radiation' and 'television sets, television cameras' seem to break the law), and secondly, the local effects created by the total pattern. It's terrific that we have 'dynamometers, dynamotors' and 'motor generators, motors', because the standard logic of prose rhythm would give the reverse order in each case. 'Fans, filers, filters' is a more traditional tricolon, and there is a delightful jangle on 'calorimeters, colorimeters'. Another nice touch is 'wheels, X-ray spectrogoniometers', which moves from the simplest machine to the most abstruse, purely by alphabetic place.

These local effects serve by contrast to make the listening reader more aware of the flatness of standard prose structures. The list defamiliarises both the language and the mechanical world now reduced to rubble: that world is now to be described not by a bystander, casually, but as if by an archaeologist, in systematic order. The world is dead, unrevivable; fully objectified.

Vonnegut's list thus takes the reader out of the lull of the text—which, in a pretty standard bit of science fiction, is a bold writerly move. But Vonnegut makes two mistakes. Firstly, he ends the passage with an ellipsis (. . .), and secondly, he then continues the novel, with some rather weak dialogue and narrative dribble, for another few pages, adding nothing. It would have been better to put a full stop after 'zymometers' and leave the novel at that.

23 September, 2007

Academic humanism: a diagnosis

We are to address the role of the university in providing and sustaining an intellectual life; more specifically, a life in the humanities—a humanist life. So let us begin at the beginning, with an account of the earliest universities, which evolved from earlier schools some time around 1200. There is a fair amount of history and quotation in this post, gentle reader, but I hope you will not let that put you off the body of my actual arguments.


The mediaeval liberal arts curriculum was oriented towards careers at court (eg. arithmetic for accounting) or in the Church (eg. astronomy for the computus). The best liberal arts students graduated to theology, the most prestigious subject of study; others to law or medicine. The Church stood by, with blessing or condemnation, ready to snap up the cream for its own. In this respect it resembled the rich multinationals that finance university programmes today (eg. Hewlett Packard at MIT), only its authority was paramount. It was the Church and its minister Bernard that condemned Abelard in 1121 and again in 1140; it was the Church that condemned Aquinas in 1277, along with the heretical Averroist Siger of Brabant; it was the Church that fostered the growth of the friars at the Sorbonne in the mid-thirteenth century. So it would seem a little strange, would it not, if someone were to claim that the mediaeval universities were in any way autonomous?
The university corporations of the Middle Ages at the height of their power were not responsible to anybody, in the sense that they could not be brought to book by any authority. They claimed, and succeeded in making their claim good, complete independence of all secular and religious control.
Thus Robert Hutchins, in The University of Utopia (1953). The American reformers of the mid-century were, in fact, full of mediaeval dreamings. In a recent book, William Haarlow quotes Stringfellow Barr, co-author, with Scott Buchanan, of the massively influential 1935 Virginia Plan, which outlined a university curriculum based on the Great Books:
The [reading] list reflected the title of [Buchanan’s] Poetry and Mathematics and the connection between these two modes of thought and expression; between the medieval trivium and quadrivium that he and Adler and McKeon had argued about in the days of his seminars for the People’s Institute.
And indeed, in Poetry and Mathematics (1929), Buchanan has more to say about Richard McKeon, the great mediaevalist scholar (and student of the Neo-Thomist Etienne Gilson), whose work on the liberal arts curriculum would prove so influential to the educational theories of his contemporaries. Only a year after the drafting of the Virginia Plan, Hutchins could already write, in The Higher Learning in America:
The medieval university had a principle of unity. It was theology. . . The medieval university was rationally ordered, and, for its time, it was practically ordered, too. But these are other times; and we are trying to discover a rational and practical order for the higher learning of today.
He goes on to deny any specific system, but he insists that his ideal university would consist of three main faculties—natural sciences, social sciences, and metaphysics. Metaphysics, in other words, is for Hutchins the queen of the humanities. And he had good precedent: Newman (The Idea of a University, 1852) had written that ‘all branches of knowledge are connected together, because the subject-matter of knowledge is intimately united in itself, as being the acts and the work of the Creator’. Before that, Coleridge and Bentham had drawn up schemata showing the interrelation of different fields of knowledge, and their ultimate Platonic unity—a project going all the way back, through Peter Ramus and John Dee, to classical antiquity.

How strange this seems to us! Some of us are positivists; others postmodernists—all of us firmly against metaphysics. If you go to a second-hand bookstore, at least in America, ‘metaphysics’ now means ‘New Age gibberish’. Wittgenstein, we suspect, is chuckling maniacally from the clouds. But Hutchins is squirming: for as he has it, ‘consciously or unconsciously we are always trying to get’ a metaphysical system. The intellectual trends of the next sixty-odd years would prove him quite wrong: we have drifted further and further away from wanting any sort of system. We have drifted towards Toulmin and Wittgenstein, towards the postmoderns and Feyerabend, towards an anthropological sort of relativism. Nobody, it seems, except perhaps a handful of desperate Catholics, still wants a metaphysics at all, let alone a metaphysics at the hierarchical summit of the humanities. Even before the surge of relativism, J. A. Rice had complained that St. John’s College, still famous for its Great Books programme, ‘trains its students not for the church, as Oxford did then, and not for any office in or under an oligarchy, but for something pleasantly vague: to be artists in the art of thinking, Neo-Thomist dialectitions [sic], lawyers, without law’.

And yet the historical and prescriptive beliefs of Hutchins and his ilk stand as the vague foundation for the way a large proportion of our humanities departments are today structured. We still, most of us, believe in the ‘liberal arts’ and the ‘Great Books’: and the liberal arts and the Great Books are, for better or worse, essentially a metaphysics.


In 1408, the humanist Guarino of Verona returned from his studies in Constantinople to open a school in Florence. Later, the school moved to Venice, and then to d’Este Ferrara, where it eventually became the arts faculty of the new university. Guarino’s course began with an ‘elementary’, which taught students to read and pronounce Classical Latin; it continued with the teaching of grammar, and then of history, geography and mythology. Large amounts of memorisation and repetition were not only required—they were, naturally, the very basis of the education. Students would proceed to learn tropes from the pseudo-Ciceronian Rhetoric, and to a lesser extent from the orations of the real Cicero, and finally build up an insider’s knowledge of Latin by reading authors like Pliny and Augustine.

When Guarino’s students left his school, they could read and speak Latin extempore, compose formal letters and orations, imitate poems and recite ‘facts’ of etymology and classical history. More to the point, their attitude to social authority had been irrevocably shaped by an awe and deference towards classical texts. They had become good, contributing members of society, and would go on to get well-paid jobs at the court or papal chancery. Their parents, of course, lapped it up. In Guarino’s school, as in the mediaeval universities, education was in the service of civil obedience—of the State. (It was, Mencius, much more a 'pretext for Party training' than it is now.) The same sort of conditions held in late Victorian and Edwardian England, and the last, scant vestiges of it can be seen in the sort of education I received at public school—Latin and Greek from age 10, lucky me!

The practices of Italian humanism, unlike those of thirteenth-century Paris, do not depend on a hierarchical metaphysics or theology. But they still depend on a Platonic understanding of human character, an unfailing faith in the moral value of classical literature, and a social structure still essentially aristocratic and authoritarian. Both paedagogies trained for uniformity and diligence; neither for creativity nor freethinking. Both, amazingly, managed to produce young men of astounding creativity and free thought.


If our modern educational ideals can be described, with any accuracy, as a confluence of the mediaeval liberal arts programmes (as interpreted by mid-century American theorists) and the revival of text-based humanism, how can we make sense of a system whose social and metaphysical foundations have been almost eradicated?

One of the most common complaints about traditional liberal arts paedagogy is that it is inherently élitist—‘dead white males’ and all the rest of it. An education, it is claimed, should be for all. Anti-élitism, curiously, was one of the motivations of the liberal arts theorists as well—although the arguments are, invariably, rather confused. In The University of Utopia Hutchins writes that ‘everybody can and should learn’—and adds the rather sinister corollary, ‘I should welcome any method by which people are seduced into forming the habit of learning’. In his 1953 critique of Hutchins (‘one of the best, wittiest and most unanswerable things I’ve ever done’), F. R. Leavis accuses him of naïve democratism. Hutchins is quoted—
If leisure and political power are a reason for liberal education, then everybody in America now has this reason, and everybody where democracy and industrialization penetrate will ultimately have it. If leisure and political power require this education, everybody in America now requires it, and everybody where democracy and industrialization penetrate will ultimately require it.
Mortimer Adler, Hutchins’ co-conspirator on the Chicago Great Books project, had similar ideals in mind when he extolled the 'learning which belongs to everybody and should be the common culture in which everybody participates’. The liberal arts programmes are proposed to create a generalist educated democracy alongside the specialist departments of academia, with the Great Books providing society with a common currency of historical thought. Democracy, in fact, cannot reasonably function in a society lacking a liberal education—as Hutchins writes: 'If the people are not capable of acquiring this education, they should be deprived of political power and probably of leisure. Their uneducated political power is dangerous, and their uneducated leisure is degrading and will be dangerous.'

Leavis thinks this ridiculous: élitism, of a sort, after all, is necessary for any stable society: 'It is disastrous to let a country’s educational arrangements be determined, or even affected, by the assumption that a high intellectual standard can be attained by more than a small minority'. Or again:
The attempt to establish a democratic educational system in Great Britain has gone on the assumption that far from everybody has the capacity to justify his or her presence at a university—if ‘university’ is to mean anything—and that there must consequently be a severe sifting.
Leavis, no less than Hutchins, wanted an 'educated public'. But although Leavis approved of Alexander Meiklejohn's liberal arts programme at Wisconsin (1927-32), he was sceptical of 'unrooted global eclecticism', of the mish-mash of classics of all disciplines proposed by Hutchins and Adler. Like many critics, he advocated the expansion of higher education, but wanted to restrict the growth of the university, which he identified with Oxbridge and put in the role of a dominant centre; as R. P. Bilan puts it in his 1979 book about Leavis, 'if the university is to help create the new educated public it must itself be a real centre and thus attempt to counter the increasing specialization that has, ultimately, led to the loss of a centre in society'. And at the centre of the university would be not metaphysics or the sciences, but a humanities structured around literature: and then not a canon of Great Books, but a 'living tradition'.

Despite his criticisms, Leavis was much like Hutchins. Both, for instance, deplored the positivists like Snow, who equated life with 'mortality tables' or 'standards of living'. But Leavis' arguments, unlike Hutchins', were entirely ignored. Universities have multiplied and fragmented—the 1963 Robbins Report, which Leavis denounced, promoted science over the humanities, and advocated more universities in response to growing demand. The Further and Higher Education Act (1992) allowed polytechnics to start calling themselves 'universities', leading to an even more crowded marketplace.

Even more damagingly, for a Leavisite—the university has grown drastically farther away from the common man. Academics are perceived as an irrelevance at best; at worst, as a threat. Professional writers hardly talk to English professors, just as, if Mencius is to be believed, programmers hardly talk to computer science departments. There has developed a professional—not an intellectual—élitism, or rather a ghettoisation. And the 'school of resentment', as Harold Bloom has termed it, has sprung up to level charges of social élitism wherever it can. The feminists, post-colonialists, Marxists and so on—all reject the Hutchinsian notion that the Great Books represent a common culture: they see in it only the vested interest of the all-encompassing Patriarchy.


So the academic humanities have lost their metaphysical justification, their wider place in a social or political hierarchy, and their noble humanist purpose. Is there anything else, Doctor?
The causes of the media’s sniping at the University are not individual resentments but a more general uncertainty as to the role of the University and the very nature of the standards by which it should be judged as an institution.
I quote Bill Readings’ The University in Ruins (1996). Readings diagnoses yet another problem with the humanities: their unhappy transition from being the ‘custodian of national culture’, and thus the Koh-i-Noor in the nation-state's ideological crown—whether Humboldt’s Germany or Wilson’s America—to being the useless appendage of a ‘transnational bureaucratic corporation’. And Readings is quite right to note the trend—he discusses, among other things, the rise of the university’s administrative sector and the development of corporatisation and branding. Like me, he sees the university as having outlived its own purpose, only he has a different purpose in mind:
I would prefer to call the contemporary University “posthistorical” rather than “postmodern” in order to insist upon the sense that the institution has outlived itself, is now a survivor of the era in which it defined itself in terms of the project of the historical development, affirmation, and inculcation of national culture.
Despite his misgivings, he does not propose a radical answer:
So what is the point of the University, if we realize that we are no longer to strive to realize a national identity, be it an ethnic essence or a republican will? In asking such a question I am not suggesting that I want to blow up the University, or even to resign from my job.
His prognosis, in fact, is pretty weak, and amounts to a subaudible mumbling about 'dissensual dialogism' and 'thought beside itself'. Perhaps he would have been able to flesh out a more worthwhile response one day—but he died before his book's publication. Ah well. Still, Mencius is on hand with the solution turned down by Readings: 'the Henry VIII treatment—unconditional abolition and confiscation' of the universities. I wrote a short rhetorical reply here, which was flippant but nevertheless contained the germ of my actual views expressed in this post. The problem with Mencius' post is that its proposals rest on a wild and totally unsubstantiated claim that the 'universities are directly responsible for almost all the violence in the world today', on a few charming anecdotes about 1970s Hegelianism and 1990s computer science, told at considerable length, and. . . well, that's about it. But the cages rattle and the Menciophilical mob roars with delight, as it so often does. (He really is a very convincing demagogue, especially if, like me, your judgements are easily swayed.) The other problem with his post is that he doesn't address the humanities at all. For some reason he seems to prefer them to science departments. So his post is of no use to us here.


But why preserve the academic humanities? What would we lose if they went? What could we imagine in their stead? Could they be replaced, as our commenter Michael has suggested, with a sort of unofficial academy or society of study? It's a tempting thought. We can think of the standard examples: Ficino's Florentine Academy, Gresham College, the Royal Society, the circles of Erasmus and later the various Republics of Letters—even the Académie Française. Why couldn't we restore groups like this, and restrict our own humanist activities to them? We wouldn't need any metaphysics, any political hierarchies, any justification to the taxpayer, any Noble Social Purpose. The problem of bureaucratization would be irrelevant. We would be amatores.

The problem with this is a question of leisure. The members of the above societies all had proper jobs—and jobs, moreover, that fed and sustained (and thus legitimated) their interests and talents. But they also had considerable free time in which to study. As Mencius notes, in a recent comment here, 'a humanist was either an aristocrat, or a professional serving a largely aristocratic market'. Because they were operating in an essentially aristocratic milieu, they could afford the leisure time required for amateur study. Court and Church alike are conducive to educated leisure.

But we don't have the leisure any more. We can't afford it. And the work that most of us do—whether in finance, media or the service sector—hardly sustains our interests and talents. Those who fiddle numbers by day, and read old books by night—those few—are forced to lead a divided life hardly conducive to serious reflection or intellectual progress. Some of us might be patronised by loved ones, or subsidised by earlier financial success—but such situations are scarce. This is the situation of a society oriented around the middle classes.

And while it is nice to be Chris Miller—a non-academic lover of the arts—if we were all Millers, the sorts of books that Millers like to read wouldn't get written. We would never make any new discoveries; our thoughts would tread the old paths again and again. In the modern world, if not in Ficino's or Newton's, academic study makes possible non-academic study.

So while we may, like Michael, wish that universities 'as officially sanctioned diploma mills were eliminated', we are forced to conclude that in order to satisfy large numbers of humanist-minded individuals—or even to satisfy those intelligent and talented enough to produce worthwhile results—the existence of professional institutions is necessary.


The problem, then, is how to establish a professional environment for humanism, without being prey to all those factors that might render the institutional humanities retrograde, obsolete, and lacking in social function? How do we come to terms with a world increasingly inimical to 'purposeless' intellectual enquiry? How can we do what we like to do, as a paid community, and do it with dignity?

I'm not sure I can answer this yet. But there's the problem at least.

18 September, 2007

Koto ba

Heidegger's late book, On the Way to Language (1959), opens with a philosophical dialogue on the nature of language, between Heidegger himself, identified only as 'Inquirer' (though explicit reference is made to Heidegger's academic life and previous works), and an unnamed Japanese interlocutor, apparently based on Tomio Tezuka, who met Heidegger in 1954. Their conversation overtly centres on one of Heidegger's old students, Count Kuki Shuzo, who died in 1941, and his analysis of iki. In the middle of this dialogue comes the following exchange:
I: What is the Japanese word for “language”?

J: (after further hesitation) It is “Koto ba.”

I: And what does that say?

J: ba means leaves, including and especially the leaves of a blossom-petals [sic]. Think of cherry-blossoms or plum blossoms.

I: And what does Koto say?

J: This is the question most difficult to answer. But it is easier now to attempt an answer because we have ventured to explain Iki: the pure delight of the beckoning stillness. The breath of stillness that makes this beckoning delight come into its own is the reign under which that delight is made to come. But Koto always also names that which in the event gives delight, itself, that which uniquely in each unrepeatable moment comes to radiance in the fullness of its grace.

I: Koto, then, would be the appropriating occurrence of the lightening message of grace [das Ereignis der lichtenden Botschaft der Anmut].

J: Beautifully said!
My bullshit-detectors were, at this point, raging out of control. (I concede the possibility—certainly not the likelihood—of this passage being less stercorine in the original German.) So I pulled out the resources of my address-book and asked Gawain and Steve Languagehat (who in turn asked his friend Matt, of the excellent Japanese-studies blog No-Sword) if there was any validity to the claims Heidegger here makes about koto ba. I was relieved to discover, first of all, that koto ba (or kotoba) is in fact one of the Japanese expressions for language. It's a start! Kotoba seems to be the everyday word, with a semantic range from 'word' and 'speech' through to 'language' itself; it contrasts with the more technical term gengo, used in linguistics. Steve and Gawain were unanimous on this point.

But what is the origin and analysis of kotoba? Matt writes that 'in an earlier period, there were two phrases: "koto-no-ha" (言の葉,"leaves of words/speech") and "koto-ba" (事端, "tips of speech")'. The former referred to 'refined, artistic things like poetry', the latter to 'regular speech'. However, the two words were so similar that they 'proceeded to merge into one'. For this theory, he helpfully cites a number of sources: an 1835 book called Meigentsu, Ōtsuki Fumihiko's Daigenkai (1932-37), and Ōno Susumu's dictionary of Old Japanese (2004 edition). (Can you imagine Heidegger being this pragmatic and concrete?) Matt notes the older theory, that kotoba itself (and not koto-no-ha) means 'word-leaves', but observes that the latest source for this is the 18th-century Wakun no Shiori, and that it has since been discredited. Despite this, Matt thinks it remains a popular etymology, given the superficial identity of ba ('leaves') and the ending of kotoba: 'If you asked the average Japanese speaker (native or otherwise), they would probably give the "word-leaves" definition'. By contrast, Chris Drake, on this 1998 thread, writes that 'even if you personally interviewed 130 million Japanese, very few of them. . . would give 'word leaves' as the primary meaning or consider kotoba vegetative, although more people may have made the association during the period nationalism of the 30s and 40s'.

Gawain gives us some different information. He writes that 'in oldest Japanese (pre-Nara times), as far as we know, koto meaning "business" or "affairs" or "Things" was used interchangeably with the koto meaning speech or words'. [Incidentally, it fascinates me that koto should exhibit the same semantic development of affair to thing as English thing (cf. Old Norse Thing, German Ding) and Latin causa (case, affair) > French chose, Italian cosa.] So we have a dichotomy between koto (thing) and koto (word). In fact, Tomio Tezuka himself recalls part of his conversation with Heidegger about the word kotoba, thus (cited by Drake):
I think that the koto is connected with koto [meaning "matter"] of kotogara [meaning "event" or "affair"]. . . the koto of "language" and the koto of "matter" are two sides of the same coin: things happen and become language (kotoba).
Drake adds that this connection of the two kotos 'isn't accepted by any historical linguist I know of; it resembles more of a pun'. Gawain, likewise, remarks that
there is no reason to think that "koto" meaning things and "koto" meaning words are actually the same word; they are written with the same ideogram, but ideograms have sometimes been used purely phonetically.
Perhaps Heidegger would praise Tezuka's remarks, in his own words, as 'playful thinking that is more compelling than the rigor of science'. Gawain also draws a similar distinction to Matt's, between koto ['meaning somehow important speech (sayings, teachings, possibly also magical formulas?)'] and kotoba ['meaning "stuff you said"']. Gawain, like Chris Drake, admits that he doesn't know when kotoba came to mean 'language' in general, although Drake speculates:
Up until 1868, a variety of characters were used, although kotoba was often simply written in hiragana or katakana phonetic scripts. A wild guess would be that the 'word-leaves' combination was chosen to become the official standard because of its elegant courtly heritage by Japanese modernizers and supporters of the imperial system sometime between 1868, when Japan definitively began to "modernize," and the end of the 1880s, when mass literacy (and universal conscription) and a "rationalization" of the language were in full swing.
So this is where Heidegger got his account of ba as 'leaves'. But where does this leave us with regard to the 'appropriating occurrence of the lightening message of grace'? Not very far. Gawain was sceptical from the outset: 'if it's Heidegger, then it is almost certainly mumbo-jumbo'. When I gave him the koto definition, he quipped that it 'sounds like a Chinese menu in Phoenix, AZ'. (He is referring to this sort of thing.) Steve, meanwhile, called the koto definition 'what you call a load of bollocks over there on your side of the Channel'. Matt, finally, admits:
it sounds like nonsense to me. I suppose it depends on if he's already set up definitions for "appropriating", "lightening", "message of grace", etc. If they're all in place, then it might make sense on its own terms, but then it would mostly be about his definitions rather than the Japanese itself.
Heidegger's project, in this book, and this dialogue, is to come to terms with (or at least address) the alterity of Japanese thinking, and consequently of its language. This was a hot topic in the mid-century, when Whorf was still all the rage, and not yet dismissed as a charlatan. It is still a hot topic, and psychological experiments are still being performed, as you can read in Nisbett's The Geography of Thought (2003). But Heidegger's real project is to make strange even Western thinking and language. He has not set up, in any serious way, definitions for the terms listed by Matt. And so we are left with the result that Heidegger's gloss on a supposedly arcane Japanese word is far more arcane, or, less charitably, far less coherent and meaningful, than the word itself. When it comes to Japanese words that few of his readers are likely to know, and still fewer likely to know the history of, Heidegger is apparently quite happy with any old 'playful thinking'.

Heidegger here defines iki as 'the pure delight of the beckoning stillness'. Wikipedia articulates the word's meanings with the adjectives 'simple, improvised, straight, restrained, temporary, romantic, ephemeral, original, refined, inconspicuous'. One can only conclude that there are few minds less iki than that of Martin Heidegger.

(With thanks to my contributors, Gawain, Steve and Matt.)

Update: Gawain has more to say about koto ba. Steve links, requesting more information on the mysterious Sei Shonagon and her putative rivalries. I admit to being much delighted by Steve's description of the Varieties as a 'philosophitorium'—first Google hit! I like to think that he reserves this on-blog approach to words for references to me, but that is probably unsupported by the data. The Laughing Bone links, and tells a cannibal joke.

Update 26/07/08: Peony links, refuting me. To others, my use of the word stercorine seems to be causing problems. One commenter on del.icio.us wonders what it means; another glosses it with '[sic: better, stercorous]'. I like this! That 'better' is the language of the editor or lexicographer, for instance the OED, which glosses acheilous with 'better achilous'. As for stercorine, the OED doesn't have it, though it does have stercorous, stercoreous, and stercoraceous. For me, the suffix of stercorine has the pejorative connotation of a saccharine, a bovine, or even an anodyne. So I invented it.

14 September, 2007

Humanism and the virtue of anxiety

[17.01.08: This post could be taken as a response to Kugelmass' riposte to Stanley Fish's reply to Anthony Kronman's expression of defeat in the face of nearly everything—a riposte, it might be added, that will not be written for almost four months. Fish says the same again. Kugelmass says the same again. Mark Liberman weighs in. 02.11.08: Kugelmass again, on a related subject. nihil sub sole novum, etc.]

Last year, at dinner with a spitzer of art-history graduates, I suggested—perhaps that is too polite a word—that art-history, and in fact the rest of the humanities, were useless disciplines. (I was bored!) Sure enough, it was a rock broke in a byke, and the graduates swarmed up angrily in defence of their occupation—they attacked a sterile ‘world without art’, said they were learning about ‘humanity’—one even used the word ‘redemptive’. The problem with these responses was that I was not proposing a world without art, not even a world without art-history. I was merely observing that art-history served no particular function, unlike, for instance, cancer research. Whether that uselessness is grounds for termination is another question. I posed the problem out loud to a tableful of graduates—but really, as you no doubt already suspect, I was posing it to myself. I wanted a good answer to the question—I hoped they might have some idea. And so I was disappointed with the whobub of inarticulate responses.

Now it is obvious that, when one attacks, or seems to be attacking, another’s professional work, and moreover the basis of her values, that person will get angry. But I heard too the note of anxiety. Frankly I can’t imagine the same heated outquarks from a party of scientists. The point of their work would be transparently obvious to them, and would require only a patronising lecture, like C. P. Snow’s ‘Two Cultures’, to communicate. There might be contempt in their voices, but not likely anxiety.

This episode seemed to me a microcosm of the last fifty years in humanist academia—and more broadly of the entire history of humanism, from the Greeks onward. What is the point of studying literature and history? As I began to think again about the issue, other questions crowded in on me, each clamouring for attention. They may be answered, or may to an extent have already been answered, on these pages. But the most general question of all was: For what should we be fundamentally striving?


The humanist’s traditional defence of his discipline is that it cultivates virtue. This is what the proto-humanist Gorgias says to Socrates, although he later comes a-cropper when he admits that his students can go on to use their rhetorical training for evil.

The more subtle modern reflex of this idea is that studying the humanities cultivates self-awareness and self-knowledge. Distinction is made between a kind of learning that tells us facts, and a kind of learning that changes who we are—a study that informs and thus transforms us. The terms differ—one speaks of useful as opposed to liberal knowledge, another of information and understanding—but the core idea is common. This distinction was first clearly formulated by the Victorians, although its roots are in the German notion of Bildung, ‘cultivation of the self’. As Bill Readings observes, Bildung was at the heart of the German university-project in the early nineteenth century; and so it is only natural that Cardinal Newman, who provided the first major modern formulation of an ideal university in England, should put Liberal Knowledge—that is to say, humanist learning—at the heart of the university enterprise. But where the Germans explicitly associated Bildung with classical arete (virtue), Newman explicitly dissociates them:
To open the mind, to correct it, to refine it, to enable it to know, and to digest, master, rule, and use its knowledge, to give it power over its own faculties, application, flexibility, method, critical exactness, sagacity, resource, address, eloquent expression, is an object as intelligible. . . as the cultivation of virtue, while, at the same time, it is absolutely distinct from it.
On the other hand, Matthew Arnold, that apostle of late Victorian paedagogy, reassociates humanist learning with virtue, which he calls our ‘sense for conduct’. He describes ‘specialist’ scientific knowledge, rather clumsily, as:
knowledge not put for us into relation with our sense for conduct, our sense for beauty, and touched with emotion by being so put; not thus put for us, and therefore, to the majority of mankind, after a certain while, unsatisfying, wearying.
He goes on to argue that if faced with the choice between natural science and ‘humane letters’, the ‘great majority of mankind’ would be better to choose the latter for study: ‘Letters will call out their being at more points, will make them live more’. This core distinction remained very common in the twentieth century. Robert Hutchins, the controversial president of the University of Chicago in the 1940s, asserts that ‘the aim of education is not to gain more and more detailed knowledge of the world but to understand the world and ourselves in it’. Hutchins again, ten years earlier: ‘if the aim of education is the communication of useful information, we may as well abandon the enterprise at once’—and ‘the primary object of institutions with this aim will be the cultivation of the intellectual virtues’—and ‘Education is the deliberate attempt to form human character in terms of an ideal’. Allan Bloom: ‘the impression that our general populace is better educated depends on. . . a fudging of the distinction between liberal and technical education’. Mortimer Adler calls for a ‘philosophy as everybody’s business’—a ‘humanistic and philosophical learning of the generalist, learning which belongs to everybody and should be the common culture in which everybody participates’. So, too, Frank Leavis, insistent that the ‘creation of the human world’ is prior to scientific knowledge, and keen to make the university ‘a centre of human consciousness: perception, knowledge, judgment and responsibility’. And so on, through the whole canon of modern cultural conservatism. R. S. Crane, who pushes this line back to Vives and Philip Sidney, observes:
The great defect of all these attempts to define the humanities in terms of the lofty ends they may be made to serve is their persistent vagueness about the means by which these ends are to be accomplished in the everyday affairs of education.
This is true—though I go further. It is not just that these texts are vague about the way humanist learning leads to ‘cultivation’ or ‘consciousness’: the problem is with ‘cultivation’, under all its many names, itself. It is intangible, immensurable, indefinable—a secular sort of tao or holy spirit. It is just what ‘we all know’ to be important—for who would want to lack, or be thought to lack, understanding, or judgement, or sagacity, or taste, or responsibility, or flexibility, or a sense for conduct and for beauty?

One cannot help but feel that these are words made up by humanists to defend their discipline, catchphrases that sound significant but in fact conceal only an emptiness, and an anxiety. In modern academia, another such word is ‘excellence’. Newman writes that the object of Liberal Education is ‘nothing more or less than intellectual Excellence’. R. S. Crane, after criticizing woolly accounts of the humanist project, concludes that the humanities are simply those studies having as their objects ‘those human achievements, like Newtonian or modern physics, the American Constitution, or Shakespearean tragedy, to which we agree in attributing that kind of unprecedented excellence that calls forth wonder as well as admiration’. Excellence becomes a catch-all word denoting superiority in any discipline; and Readings devotes a hilarious chapter—probably his best—to the use of this word in today's academy, as an empty currency of general value, equivalent to 'Total Quality Management' in the business world.

A witty young skolastikos sold his books when short of money. He then wrote to his father, 'Congratulate me, father, I am already making money from my studies!' — Philogelos, fourth century AD.
Imagine three brothers going off to university—Alfred to read Business and Finance, Bertrand to read Biochemistry, and Gerald to read Literature. After eight years they reunite in the family home over sherry, and compare notes. Alfred drives up in his Ferrari, fresh from a holiday in the Bahamas. Bertrand waxes lyrical about the tremendous progress his department is making towards the manipulation of enzymes, with important repercussions for various problems in contemporary medicine. And Gerald. . . well, Gerald says proudly that, despite his own penury and the lack of professional and/or governmental interest in his research, he is at least more human than his two brothers—or perhaps he will say, more modestly, that he has a better grasp of the ‘human condition’—or else simply that his ‘intellectual virtues’ have been thoroughly ‘cultivated’. Alfred and Bertrand will just laugh in his face! And they’ll both have whiter teeth in those broad grins, to boot.


Is it possible to rescue some aspect of character as a defence of humanism? As a defence I think not. Still, we must talk about character. When I was at university, I met students in all disciplines, and I noticed a pronounced—though not a precise—divide of character, between those in the sciences and those in the humanities. My library experiences aside, I found I could talk comfortably with the latter, but not with the former. This was not merely because I knew more about Shakespeare than about the Casimir effect: it was moreover because the humanists wanted to talk about their work. They did not see their studies as a 'job', whereas the scientists did. For the scientists, what they learnt was merely the means to an end, and after five o'clock they preferred to wind down over a pint and talk about the football results. When, therefore, I tried to talk to one about zero-point energy, the discussion went nowhere. But the humanists—well, more of the humanists—continued thinking about their studies after hours, and were happy to discuss them. Some even believed that what they were reading had relevance to their life.

Whether or not humanist studies do have relevance and importance outside of humanist studies is a significant question; but it is also significant whether or not a person has that attitude. Sometimes it is the existence of a belief, not its accuracy, that we want to evaluate. And so if humanists do not have a better character than non-humanists—if, in other words, they are no more virtuous, no more 'cultivated'—they do at least have a different character. That has been the case in my experience; and I hasten to add that it is a statistical, not a categorical, generalisation.

Humanists may be less brilliant, and certainly will be less rich, than their non-humanist brothers at university, but I warm more to their character. I don't think that their character is shaped by their choice of subject, but rather the reverse—they have chosen the humanities because of who they are. One of the best things about humanists—about the best humanists—is their anxiety. Upon reflection I began to admire the inarticulacy of the art-history graduates in their attempt to justify themselves, or rather, I admired what it revealed about their character. Those who are not anxious, reason; those who are anxious, attempt to persuade. The attempt to persuade—rhetoric—is at the heart of the humanist enterprise, and thus, so is anxiety. You can read the texts of the Snow-Leavis controversy and conclude that Snow had better arguments than Leavis—I think it is a difficult matter—but powerfully clear to me is Leavis's superior character. Snow is smug and self-assured—a 'smiling, jovial face' in Roger Kimball's words—where Leavis is impassioned and righteous (if not self-righteous), but also rude, unsettled, even anxious behind his rhetorical thunder.

Now, while some will regard anxiety as a failure of character, I regard it as a strength. It is, I would say, a more authentic, a more emotionally-realised form of the erotetic questioning, and especially self-questioning, that has been at the heart of philosophy since Socrates. The humanist who is anxious is better than the humanist who (like the scientist, or anyway the ideal scientist) questions dispassionately, because his emotions are better integrated with his reason, his intellectual quest. He is, we might say, more whole. Unashamedly, it is an aesthetic criterion. To me, dissatisfaction and unease are better signs than any other that I will like a person.

What I would like to have done with the art-history graduates is make their anxiety more explicit. One of the problems with academia, present and past, is that it tries to bury anxieties in a nest of imposed truths, for instance that of the Great Books, on which I intend to write a post soon. Such an attempt is, naturally, unavoidable in any institutional setting. Nevertheless, there is at least an inherent level of resistance, however small, among those characters drawn to the humanities; less present, I think, in those who pursue science or business. And character is so much more important than brains.


This does not yet offer us a reason to study the humanities, nor a reason to support the existence of humanist disciplines. After all, my preference for intellectually anxious characters cannot be an objective criterion upon which to build an argument! I am certainly not arguing that the humanities are superior to the sciences because the character of those who pursue them is a better one. I am not even saying that humanist characters are better at all.

But I do think that one argument for the support of the humanities, and for the pursuit of humanistic disciplines by those who are instinctively drawn towards them, is simply that their existence allows likeminded persons—specifically, those of a similarly anxious character—to meet and interact at a common level. I support liberal arts programs for the same reason that I support Alcoholics Anonymous, or BDSM clubs, or the Esperanto Society. It goes back to the problem of camaraderie, dismissed so lightly. And it is one reason I blog: not purely for the pleasure of writing, for the pleasure of the text—though there is that, undoubtedly—not even to put my thoughts in order; but for the opportunity to mingle with the likeminded, to feel not alone, and best of all, to be challenged where my convictions are strongest. They are not strongest on this matter, but then—that is the nature of the problem.

This is not my only response to that problem. But to my earlier question—For what should we be fundamentally striving?—the answer 'camaraderie' is not a bad one. Not an answer, certainly, to be ashamed of.

Update: Caressing the lovely face of the humanities, with The Nonist.

10 September, 2007


Shadows we are, and like shadows depart.
Pump Court sundial
It's almost five o'clock in the morning, and I've just woken up.

I've been sleeping for the best part of an hour, fitfully, on a hard leather sofa overlooking the ballroom floor at the Royal Festival Hall. The gamelan is still playing against the wayang kulit (shadow-puppetry), as it has been since eleven pm, and will be until seven. The star singer, Sukesi, seems to be chanting my name, over and over. Even if not, the sound of her voice is horrendously beautiful. Six hours of droning bells, gongs and voices are not natural on a man's ears—but all the better. Something turns in me, a switch. I pull myself up and wander about—it is as if I am floating above the noise—an intense feeling of elation and freedom. I can't even hear the gongs any more.

This is not, in some sense, an authentic performance. Ness and Prawirohardjo, in their 1980 introduction to Wayang Kulit, paint a vivid batik of a traditional Javanese recital:
More often than not, the air is festive and in a word, pleasantly chaotic. Many of the onlookers will come and go from the performance area as they wish, participating in those points of the play which they find interesting. Many will quietly engage in conversation over sweet Javanese tea, often gossiping about the characters in the play as if they were real. Children will usually position themselves as near the dalang [puppeteer] as possible, imitating his play and voices with paper versions of the wayang kulit puppets. Through their play they are learning as generations before them have learned of the great traditional stories of wayang kulit. There are always some who choose to doze off after the first hour or so, to be awakened with a jolt by the raucous clanging of the fight scenes. The air is laden with the piquant aroma of kretek, Indonesian spiced cigarettes, succulent sweets and snacks of food sellers around the performance area.
At the RFH Wayang Kulit, where I spent the tail-end of Saturday, and the entirety of predawn Sunday, the performance took place in a rather clinical interior, with industrial inflatable cushions lining the floor, stout white columns, a nondescript bar offering pints, coffee and sandwiches, and a hundred-odd punters in various states of waking. But despite its dull setting, the show did feature the cream of Javanese and British gamelan talent: the dhalang Ki Purbo Asmoro, the drummer and director Rahayu Supanggah, two first-rate pesindhens (singers), and a small host of percussionists, all in natty costumes. The group performed 'The Building of the Kingdom of Amarta', a story very loosely adapted from early chapters of the Mahabharata. Ki Purbo burbled the dialogue incomprehensibly—in Kawi, I presume—using his feet to clatter and jangle metal plates, cueing the gamelan-players, and with his hands manipulating leather puppets against a screen. Here he is from the orchestra-side, centre:

This, meanwhile, is a tiny fragment of performance from the shadow side, where about half of tonight's audience are sitting, a much greater proportion than is traditional in Java:

Here the epic's prime hero Arjuna, on the left, sizes up against the arch-baddie, the Goliathesque Cakil. I wish I'd got a more exciting part on film. Possibly the biggest laughs were drawn by a motorcycle-puppet, which Arjuna's brother Bhima 'mounts' in lieu of a chariot. The wayang was studded with these moments of incongruous modernity: a snatch of some jazz standard starts up out of nowhere during an interlude, someone mentions Harry Potter, and cigarettes, and Ki Purbo jokes about erectile dysfunction. The whole thing is translated, in real time, on video-screens, by an American in the thick of the orchestra. Nobody cares about her grammar, although a small titter goes up when she renders 'blind' as 'bling'. This occurs only a few minutes into the proceedings—five hours later, no one would have batted an eyelid.


The story is not deep. When Peer Gynt goes into the wilderness to confront demons, he encounters the riddle of existence itself—the Great Boyg. (PEER: Who are you? VOICE: Myself. Can you say the same?) But that is urbane 1860s proto-existentialist Europe masquerading as a fairy-tale. The real fairy-tale does things differently. When Bhima, second son of Pandhu, and for tonight the chief protagonist, goes into the forest to confront demons, he encounters only ogres and spirits, the antithesis of alus (cultivated) Bhima, and overcomes them with ease. On a motorbike. The commentator tells us that the battles are an allegory for Bhima's inner conflicts (naturally), a parallel that will become more explicit, towards the end of the Mahabharata, in Arjuna's dialogue with Krishna about metempsychosis and the strife of the soul. Likewise, Clifford Geertz, in his classic 1957 essay 'Ethos, World View, and the Analysis of Sacred Symbols', quotes a Javanese interlocutor to the effect that:
Well, in the wajang the various plagues, wishes, etc.—the godas—are represented by the hundred Korawas, and the ability to control oneself is represented by their cousins, the five Pendawas [including Arjuna and Bhima] and by Krisna. . . the wajang is full of war and this war, which occurs and reoccurs, is readily supposed to represent the inner war which goes on continually in every person’s subjective life between his base and his refined impulses.
Finally Bhima encounters his demon-double—whose name, damn it, I forget—and with whom he is equally matched. They struggle awkwardly together, each puppet spinning back from the other, or somersaulting in ritualised agony, before the demon casts a 'poisonous fog' (or 'magic net') over Bhima—Ki Purbo achieves this effect by superimposing a tree-shaped puppet (you can see this in his left hand, above) over Bhima, in such a way that its diffuse penumbra envelops the smaller, sharper shadow of the hero. It's a neat effect. At some point Bhima is rescued by an ogress, Arimbi; but I was asleep during that. At least, I think I was.

The story is not deep, and at any rate, you're supposed to know it beforehand. As Ness and Prawirohardjo point out, these legends have been cultural touchstones for generations in Java. And that was one of the problems with Sunday's performance: few of the punters knew a damn thing about the Mahabharata, let alone its wayang kulit adaptation. I asked one of my acquaintances there why they kept referring to Bhima as 'Bratasena'. (It turns out that the latter is just his name as a youth.) His response was, 'Which one is Bhima?' The material is so alien, and the names so long and hard to remember, that appreciation of narrative goes right out the window. And it's not just that—we effete Londoners are not accustomed to rambling gamelans going on for eight hours, nor to the pure texture of the song, nor to sleeping through a performance—we're used to paying close attention to a subtle plot for two or three hours, not to wandering in and out of focus over the course of a night. We are, I suppose, not alus enough: we will not take our time. This is why our appreciation of the wayang kulit—and appreciate it we did—could only be superficial and second-hand. For it was not so much participatory as downright mystifying, and of course, extremely tiring. And at best, magical, in the undiluted sense of that word: theurgical. The applause, when it comes, lasts.


At six the mallets are berserking marvellously, and I stumble outside for some air. In morning twilight the Thames is a sight for sore ears: calm, cold, untroubled, and so silent. I get up on the Golden Jubilee Bridge with my camera; there's another person, way down the bridge over the wide water, and we take pictures of each other inadvertently, tiny and distant against the vastness of the steel. Perhaps he does not even see me.

It is as if I have emerged from a cave. One wonders, in fact, if Plato had ever seen a wayang kulit performance when he composed the seventh book of the Republic. 'Above and behind them a fire is blazing at a distance, and between the fire and the prisoners there is a raised way; and you will see, if you look, a low wall built along the way, like the screen which marionette players have in front of them, over which they show the puppets.' Plato knew the power of music on a man's soul, which is why he was so determined to control the musicians in his projected city.

Geertz writes that, in the wayang kulit, 'the shadows are identified with the outward behavior of man, the puppets themselves with his inward self, so that in him as in them the visible pattern of conduct is a direct outcome of an underlying psychological reality'. In the dim morning there are no shadows yet. Except, perhaps, myself. Part of me is still floating, still elated and free. Lily is sleeping at home; I will wake her gently when I return, soon, but for now I'm alone. . . yes, for the moment, not quite part of the world.

08 September, 2007

Gray Cloth

At the Library the other day I was sitting next to a young chap of Asian descent. He was reading Paul Scheerbart’s novel The Gray Cloth; on his desk was Michael Wigginton’s Glass in Architecture, and Mallgrave and Ikonomou, eds. Empathy, Form and Space: Problems in German Aesthetics, 1873-1893. The girl on my right was reading Ezra Pound’s Contributions to Periodicals and Paul Kristeller’s Renaissance Thought and its Sources. (True enough—I am nosey.) I was bored, so I decided to strike up conversation with one of my neighbours. Now Renaissance Thought is not Kristeller’s best book, and the Pound looked pretty dull too. So I chose the Asian chap. Excuse me, I said quietly, pointing to the white volume in his hand—Are you working on Scheerbart?

I was hoping to surprise him. I mean, when you’re working on someone like Scheerbart, you can’t expect to meet too many kindred spirits, can you? I was ready for some awkward debate, or at least some shared secrets. Scheerbart was a damned interesting fellow—poet, theorist, mystic, Expressionist associate. In another life I might have learnt German and written a thesis on him. (There’s still time.) I was even ready to talk about Empathy, Form and Space, a collection of antique essays on the psychology of art, which I read earlier this year.

No, I don't think so, he says, with a refined accent matching his preppy neatness. I’m working on glass in architecture. This (he gestures to The Gray Cloth) is a. . . strange book—very episodic.

It is an odd response. What's strange about the episodic? He'd chuckled a couple of times while reading the novel, and it does have a certain light charm. And it is, undoubtedly, a strange book—the vessel for a colour-mysticism along Kandinskian lines, a volkish take on philosophical science-fiction—but not, I would have thought, for its narrative structure. Perhaps I misinterpreted him. Perhaps he was only caught off-guard.

Still, I was caught off-guard too, and I ended up mumbling—I found it a bit disappointing, actually. This was true, but I was also trying to provoke more of a response from this suave young character. My gambit did not succeed. He merely smiled indulgently and said I think I'd best be getting back to my work now. And so he did.

For what it's worth I don't believe I'd have had any better luck with the girl—she had a rather stern face. But what is it about young dudes in the British Library? They're not a friendly bunch. Well-dressed, too well-dressed, especially the women, and full of a terrible self-seriousness. They're in the library, doing their work, and are not to be interrupted. I lament! Can't they tell they've got a live one here? (How many have even heard of Scheerbart—let alone read his novel? Shouldn't that alone put me in the 'intriguing' category for anyone interested in glass architecture? Maybe I ought to have mentioned the epic poem I once wrote on the subject.) Don't they long to share and be challenged—and, even better, outside the formal structures of academia? Do they not find, as I do, that soupçon of camaraderie—even possible camaraderie—the most exciting of all things? Are they so flush with fascinable and inquiring acquaintances, that the appearance of another fills them only with boredom and distaste?

Christ, who are these people poring over obscure books, so satisfied?

06 September, 2007


Thamus, chief of the gods, via Socrates, via Plato: And in this instance, you who are the father of letters, from a paternal love of your own children have been led to attribute to them a quality which they cannot have; for this discovery of yours will create forgetfulness in the learners' souls, because they will not use their memories; they will trust to the external written characters and not remember of themselves. The specific which you have discovered is an aid not to memory, but to reminiscence, and you give your disciples not truth, but only the semblance of truth; they will be hearers of many things and will have learned nothing; they will appear to be omniscient and will generally know nothing; they will be tiresome company, having the show of wisdom without the reality.

Rousseau: The body of a savage man being the only instrument he understands, he uses it for various purposes, of which ours, for want of practice, are incapable: for our industry deprives us of that force and agility, which necessity obliges him to acquire. If he had had an axe, would he have been able with his naked arm to break so large a branch from a tree? If he had had a sling, would he have been able to throw a stone with so great velocity? If he had had a ladder, would he have been so nimble in climbing a tree? If he had had a horse, would he have been himself so swift of foot? Give civilised man time to gather all his machines about him, and he will no doubt easily beat the savage; but if you would see a still more unequal contest, set them together naked and unarmed, and you will soon see the advantage of having all our forces constantly at our disposal, of being always prepared for every event, and of carrying one's self, as it were, perpetually whole and entire about one.

Roth: Mark my words, the internet is going to make bloody idiots of us all, soon enough. It won't be Facebook or Myspace—although they will help—it'll be EEBO, Wikipedia and Google Books.

[Update 20/07/08: Nicholas Carr says the same thing at more length. He even makes the Plato connection. So it's not just me. Thanks to Peony for the link. Meanwhile, Bryan Appleyard fills his Times readership with moral panic on the subject. I couldn't be bothered to finish the article. Q. E. D.]

04 September, 2007


I returned, recently, to one of my favourites: the choreographies of Footlight Parade. When I first watched these, the number that most caught me was 'Shanghai Lil', with its arch stereotypes, opium-den tap and fantastical patriotism. And Jimmy Cagney. This time round it was undoubtedly 'By a Waterfall', with the more classical Busby Berkeley pairing of Dick Powell and the beautiful Ruby Keeler. 'By a Waterfall' is the very epitome of the Berkeley routine, and his most famous image comes at its glorious climax:

I watched the number again and again. There is so much to look at, and it is so strange. Many of us have some idea of what we're going to see when we watch one of his films. And so it comes as little surprise to see massed girls twirling geometrically. But— it should be a surprise. Shouldn't it? Dick Powell opens the scene, as he usually does. His voice does not have the right timbre for the material; it is too hard. And he sings with a sort of jolly smugness that we now find uncomfortable to watch, if we do watch. There's a magic melodeee / Mother Nature sings to meee / Beside a waterfall / With you. He completes his part with a knowing nod, almost imperceptible, to Keeler. The nod is an act of perfection: it says, This is the order of things: it is correct. Then he sets his head all snug on her barm, and she starts to sing, before a chorus of bathing demoiselles takes over. Choirs on film had a very distinct sound at that time—it had something to do with recording techniques—a sharp, keening, ghostly coo.

It is at this point that the number becomes really odd. For about ten minutes we watch these slim lovelies—let us pretend they are lovely—cavorting on the side of an artificial cataract, and then in an artificial pool, and then in a stylised Art Deco palace, and then in more pools, of indeterminate size, depth and shape, and finally in the palace again, arranged in that legendary ziggurat, spraying jets of water out on all sides.

The costumes represent, albeit stiffly, a soft and osculatory flow of preraphaelitesque hair from the head to the neck, shoulders, around the breasts, between the legs, against a palette of foamwhite flesh, thinly suited. (Hair as tentacle; how Japanese.) There is a mood of social gaiety and innocent frolics. And this is one of the most bizarre things to us—we who have grown up with a pop culture at once morbidly ironic and hypersexual, paranoid—this conflation of the sweet and jolly with the titillation of (apparent) ladyflesh. It is striking to see a beauty so unencumbered with sexuality.

In this shot Keeler smiles blissfully as she dissolves in the visual noise of the torrent. Keeler, just one of the dozens of seagirls involved in the sequence, is happily fulfilling orders. They all participate in the spectacle, just as when the bugle is blown in 'Shanghai Lil' the sailors all march out to drill their rifles and parade with their flags (and portrait of FDR). This, ladies and gents, is what America is made of. The aesthetic deepens as the natatory movements become mechanical. Here the girls become a human zipper of shapely legs:

And here a rather peristaltic boa-constrictor:

These images are consumptive, digestive. And faces are lost in the patterning; the players might as well be droids. For those who let ethics get in the way of aesthetics, this should be disturbing—a stripping-away of human particulars to create a harmonious whole—a reduction of the human to the functional—and thus an essentially anti-humanist choreography, the opposite of character-centred Astaire routines. For me it's fine. But how far is this from the furniture of faceless nude slavegirls in De Sade's castle? Interlocking limbs and all.

This is what I mean when I insist that 'By a Waterfall' is extremely strange, and beautiful because strange. It is not just camp or corny; it is irreducibly foreign. Even after the synchronised swimming, it is still strange. Keeler wakes Powell up from his dream by splashing water on his shoes, and the final shot shows us three baby whippoorwills in a nest, chirping one-two-three at the final notes of the music. The camera cuts to the curtain falling and the audience clapping ecstatically, and we remember that this entire fantasie is supposed to have happened onstage before a flock of theatregoers. We look for some acknowledgement of the surrealism—we look because we are accustomed to the ironic wink—but there is none. 'By a Waterfall' is a mesmeric reminder of that enormous gulf that separates our age, wholly subsumed in irony, from that which came before, a past almost lost to us.

02 September, 2007


Why does one, traditionally, raise one's pinky while drinking tea?

The question has been bothering me a little of late, one of those petty enquiries that keeps the faded clichés and rituals of quotidian existence from being utterly assimilated. I could think of two answers.

1. The raised little finger helps one to balance the tea-cup, like the funambulist extending his arms—

or more likely, 2. The little finger, unraised, would touch the lowest part of the cup, below the other three fingers; this lowest part is likely to be the hottest, and at first probably too hot to handle with ease. The raising of the finger thus avoids (either actually or symbolically) a burnt pinky. It is a gesture of cultured delicacy—the soft untough delicacy of social refinement.

Any other ideas?