Read Adam Kirsch’s review of Robert Alter’s translation of The Wisdom Books (Job, Proverbs, and Kohelet).  That they serve as a kind of counterpoint to the world’s narrative as established elsewhere in the Bible is not a new observation, but Kirsch does well to make it in the context of language: that of Alter’s translation and that of the literary merit of the texts themselves.  On a surface-level, line-by-line reading, they are perhaps contradictory.  They are even so in their broader sense, yet they are (and, more importantly, were, millennia ago) accepted as revelatory, just as were, say, the Psalms or the Torah itself.

All of these are, of course, different kinds of revelation: Torah as, in a strict sense, a legal-historical revelation; Psalms as a personal-poetic one; then the wisdom books, concerned with something more akin to man-in-general, but still a sort of personal-poetic revelation.  The question, however, remains: can you have revelation that contradicts itself?  Does this revelation, as it may initially appear, contradict itself?  And then what — especially for one who, like me, is inclined to believe that truth is inherent and inhering in these books?

The seeming contradiction, and its acceptance by the earlier (now ancient?) generations and their leaders, points toward how we can and should (and must, at times?) read the Bible: not as a singular unswerving narrative, but as a mixture of voices, all trying to understand man and God from their position; all of which, the tradition holds, experienced revelation of some kind.  The literary truth of Job or Kohelet may seem to contradict the narrative truth of the Creation, as Kirsch notes of Alter’s translations: but this does not mean that neither is true, or that we must choose one or none.

Biblical truth is closely related to the truth we find in art and literature.  It is various and multiform but exists nevertheless.  What strengthens this truth to something beyond that which is found in art or literature is the idea of revealed truth, if one accepts it.  But the composition of the Bible — that it seems to contradict itself; that it leaves gaps and jagged edges; that the truths of the various books shout, at times, against each other and then with their opponent of a moment ago — is a kind of instruction from those who lived before us — from, in essence, the founders of the religion as a religion of a book — as to how we, millennia later, should read it.  Let the gaps and rough edges stand and try to understand them, and the whole, as they are, rather than try to sand them over into some immaculate unified whole that in the end becomes wholly uninteresting.

After all — even though Kohelet’s men and beasts are equals, unlike the man who is given dominion over beasts in Genesis, the “mere breath” that is all is merely the air which the same earlier book claims God breathed into Adam’s nostrils.  In the end, perhaps, the more interesting truths are those found in the gaps.

Iran and the Long(er) War

August 26, 2010

Jennifer Rubin commenting on the final paragraph of Bret Stephens’ recent Wall Street Journal piece on twenty years of US-Iraq military-something-or-other:

“Well, this would seem equally apt for the Thirty-One Years War that Iran has waged against the U.S. and the West more generally. Multiple administrations have done nothing as it waged a proxy war through terrorists groups against the West. Neither the Bush administration or the current one has responded to the deaths of hundreds of U.S. soldiers (and Iraqi allies as well) killed by Iran’s weapons and operatives in Iraq. Iran too has committed human-rights atrocities against its own people and defied UN resolutions.

So now we are faced with the threat of a nuclear-armed Iran that would, if it possesses nuclear weapons, certainly be emboldened to continue and step up its war on the West. The question for the Obama administration is whether to finally engage the enemy, thwart Iran’s nuclear ambitions, and commit ourselves to regime change. The chances are slim indeed that this president would rise to the occasion. But perhaps, if Israel buys the world sufficient time (yes, we are down to whether the Jewish state will pick up the slack for the sleeping superpower), the next president will.”

She’s running with the same two fallacies I noted in Stephens’ article: that there is no difference in kind between US-Iran relations from 1979 to 2010 and the relations the two states would have after the outbreak of outright war; and that war was, from the regime’s beginning, immediately and transparently inevitable.

As with Iraq, these pre-war years have not been true “peace”: Iran has taken Americans hostage, funded terror cells that have killed Americans, pursued WMD, and threatened genocidal war against Israel.  The United States has sparred with Iran from time to time; has enacted an embargo; has condemned the regime as evil.  Perhaps this is war.  If it is, it’s a cold war.

And therein lies my problem with the rhetoric she and Stephens employ.  For 45 years, the United States took great pains to keep the cold war with the USSR from turning hot.  Why?  Because there was an inherent, fundamental difference between the two: economically, practically, morally, and in simple terms of human life.  The adjective “cold” is there for a reason: a cold war is something other than outright war.

This “cold war” between Iran and the US does not, of course, operate under the shadow of mutually assured destruction.  But an invasion of Iran would, let us say, be at least as bloody, at least as costly, at least as long, and at least as likely not to succeed (not to fail, mind you—simply not to succeed, to land in some weird grey area) as the war in Iraq.  By my best amateur’s guess, it would as likely as not be significantly more so in most if not all categories.

Claiming that we have been at “war” with Iran for 31 years, eliding the difference between cold war and hot war, is an attempt to make irrelevant the questions: “Are the costs of entering into war with Iran too high?  What will we gain by doing so?  Is it truly necessary?”  If we’re already at war with Iran, those questions are irrelevant: they are questions to ask before a war.  The rhetoric strives to get us into war by pretending that we’re already in the same war that would occur were we to attack Iran.  After all, if we’ve been at war since 1979, then the debate over whether to start a war is moot.  We might as well just end the damn thing; it’s taken long enough, no?

*          *          *

I should make one comment: I’m interested in this point not because I want to go around shouting that Rubin and Stephens are being disingenuous, or because I’m particularly concerned with what either of them thinks.  It’s the language that I’m interested in, and that I find so striking.  As far as I know, Stephens’ article — appearing on the not-quite-obscure WSJ Opinion page — was the first to push this linguistic version of events in Iraq (keeping one eye on Tehran); Rubin, in addition to showing up conveniently in my Google Reader feed, makes the implicit explicit.

I’m concerned, that is, with what has concerned others before me: the hollowing of language by war.  It is still perhaps the most striking concern of Thucydides’ great work:

“Words had to change their ordinary meaning and to take that which was now given them.” (3.82.4)

The Long War

August 25, 2010

According to Bret Stephens in the Wall Street Journal, the United States has been at war, essentially, for my entire life; according to Jennifer Rubin at Commentary, we’ve been at war since the Iranian hostage crisis (if not slightly earlier).  While I’m not at all displeased to see even the supporters of The Long War inching towards acknowledging it for what it really is, there’s a problem with this line of conversation.

First, let’s look at Stephens’ definition of “war by another name”:

“In that box, he killed tens of thousands of Iraqi Shiites, caused a humanitarian crisis among the Kurds, attempted to assassinate George H.W. Bush, profited from a sanctions regime that otherwise starved his own people, compelled a ‘no-fly zone’ that cost the U.S. $1 billion a year to police, defied more than a dozen U.N. sanctions, corrupted the U.N. Secretariat, evicted U.N. weapons inspectors and gave cash prizes to the families of Palestinian suicide bombers.”

The worst of these are crimes against humanity, and shouldn’t be trivialized.  The no-fly zone, yes, was an example of the growing role of the United States as the world’s police during the 1990s.  On the other hand, a billion a year compared to the $2 trillion price tag of invasion, occupation, and security seems like pocket change.

That price tag is indicative of something important running through the piece: a refusal to acknowledge a difference in kind by re-labeling what occurred during the 1990s.  This is revisionism.  Stephens pretends that there is no difference—in terms of human and capital cost, in terms of social change, in terms of government—between a “military effort designed to contain Saddam Hussein and a military effort designed to replace him.”  Enforcing no-fly zones and an invasion-turned-occupation that is in its eighth year are essentially different.  Perhaps we have been at war with and in Iraq for essentially my lifespan; but the “war” that ran through my elementary school years was nothing like the war that began shortly after I entered high school.

(That I feel it necessary to use scare-quotes around one use of the word “war” in the previous paragraph points toward something particularly sinister about, among other things, The Long War: its corruption of language.  How do we distinguish between the War in Iraq and the semi-militarized 1990s — which saw American troops in Iraq, the Arabian peninsula, the Balkans, Somalia, etc.?  It was “peacetime,” I suppose, but with something wholly other lurking at the horizon.)

Stephens commits himself to another such assumption in the piece: that there was, really, no choice in the matter when it came to the 2003 invasion.  This is already implied by the idea that there is no difference between 1992-2003 and 2003-present.  The way that it had to end was with a full-on invasion and replacement of Saddam Hussein.  This is patently false.  Consider Cuba and Fidel Castro—admittedly not a Saddam Hussein, but he has starved his own people and attempted to acquire WMD, and was, at one point, subjected to what was essentially a “no-sail zone” around his island.  Our policy since the Bay of Pigs, for better or worse, and in varying forms, has been one of containment, content to wait on the natural regime change of human mortality.  (Is that the best policy?  That’s not the question at the moment.)  But there is a choice – and that makes all the difference in the world when one attempts to assess the landscape and create future policy.  Only in a world where there was no true choice in 2003 could this paragraph be written, with one eye on Tehran:

“One thing is clear: The Twenty Years’ War lasted as long as it did because the first Bush administration failed to finish it when it could, and because the Clinton administration pretended it wasn’t happening.  Should we now draw the lesson that hesitation and delay are the best policy?  Or that wars are best fought swiftly to their necessary conclusion?  The former conclusion did not ultimately spare us the war.  The latter would have spared us one of 20 years.”

Stephens’ history, Rubin’s post, and their implications concerning Iran will be the subject of a near-future post.

Two articles in today’s Courier-Journal about Wendell Berry that might interest people.  First, Actors Theatre of Louisville is set to begin performing a play, Wild Blessings, based on his work.  It’s apparently “composed of 36 poems” — and hopefully will work out better than that Billy Joel musical someone slapped together a few years back.  I don’t have much to say about it, though if I weren’t leaving for snowy Chicago today I wouldn’t mind going to see it.

The second article, “Poet will step off farm to hear works read on opening night,” is a brief profile of Berry which includes these entertaining sentences:

“In addition to poetry, Berry writes essays and novels by hand and his wife assists with the typing of them. He has made a slight concession to computers. His manuscripts are copied onto disks when it’s time to send them for publication by a main press.”

That brings to mind a Berry-esque rant about handwriting that I read the other day in Bringhurst’s book (and had been trying to figure out a way to make relevant to a post):

“Many people now cannot form legible letterforms at all except by tapping on a keyboard.  For those people, writing and the alphabet have, quite literally, ceased to be human.  How do you expect to be able to cook good food or make good love when you write with prefabricated letters?  How do you expect to have good music if you live on a typographic diet of bad Helvetica and even worse Times New Roman — never mind the parodies of letters that flash across your cellphone screens and the parodies of numbers marching over the screens of your pocket calculators and cash-dispensing machines?  How can things so ill-formed have a meaning?” (“The Typographic Mind” in Everywhere Being Is Dancing pp.217-8)

My handwriting, as described by one classmate, “is either the worst-best or the best-worst handwriting I’ve ever seen” and, in the words of another, “Looks really distinguished and pretty until you actually try to figure out what it says.”  (Or, as a teacher once put it, “It’s not illegible.  It’s just difficult.”)  So I’m not quite there yet — and I certainly don’t have the hand-stamina to do what Berry does.  I don’t know: I like to think that a little bit of one’s personality comes through in handwriting, which is part of why I don’t like reading handwriting that’s blandly sloppy — when it comes across as though the writer was irritated at having to take the time to write something out.

The idealized image of what handwriting should be, in my mind, will always be my grandfather’s (though it has suffered a little as he’s aged, it’s still more elegant than mine, and probably than mine ever will be).  He’s a retired elementary school principal, and among the many laments he has about things that are no longer taught in schools is penmanship.  (And good posture.)

“Skill Is Seductive”

March 26, 2009

Despite my differences with him on certain issues (religion, particularly), I think that Robert Bringhurst is one of the most fascinating writers and thinkers out there — his analysis of the meaning of mythology is, if you ask me, second to none.  And his voice is strident on the nature of art and artifice:

“[Robert McNamara’s] example has taught me, nonetheless, that positions of power must not be occupied by people who are happy to take refuge in the craft of administration or the skill of systems design, nor by people whose sense of respect for the physical world is subservient to their sense of political loyalty.  There must be some point too at which even typographers, meteorologists, knifesmiths, philosophers, and shovelmakers raise their heads from the workbench and ask how what they make is being used.  There is no sane person to whom napalm or mustard gas is saintly.

[…]

“Morality is part of language itself, and language is part of morality.  Not all sentences are good to speak on all occasions even though the language can construct them.  And not all things the designer can design are desirable just because he can design them.  I think this truth applies, in its small way, even to Peter Schoffer’s title page — though in Schoffer’s case the witnesses are dead, the statute of limitations has long run out, and the page is inarguably beautiful.”  (Robert Bringhurst, “Boats is Saintlier than Captains” in Everywhere Being Is Dancing pp. 197-9)

Or, to see it framed differently, read the parable of “Father Smith’s Confession” and “Father Smith’s Footnote” in Walker Percy’s The Thanatos Syndrome.  It doesn’t excerpt well in this medium.  The point, of course, is that the beauty of the artifice alone isn’t enough to make something truly, nobly beautiful (for it to be kalos, let’s say).  Because,

“If we divorce truth from beauty, we’re engaging in a sophomorically lazy reading of Keats’ dictum, forgetting that beauty alone does not make something truth: we must have truth to have beauty.”

Without the truth that makes it kalos, the beauty of the artifice can be deceptive.  In the realm of art, it leads to debates over obscenity and appropriateness and eventually at least one side calls the other bourgeois; removed from that realm, however, the deception can become dangerous: elegance does not necessarily make something good.

________________________________________________________________________________________

(The title of the post is Bringhurst, from the same essay.)

At Jewcy, Ben Cohen writes:

“Our view of history — more precisely, the way in which we remember the recent past in the public domain – generally tends to be cluttered by the political imperatives of the present.”

What he’s talking about, more specifically, is genocide, which has developed different definitions depending on the situation in which it’s used. In the abstract, legal sense, it

“means any of the following acts committed with intent to destroy, in whole or in part, a national, ethnical, racial or religious group, as such:

(a) Killing members of the group;
(b) Causing serious bodily or mental harm to members of the group;
(c) Deliberately inflicting on the group conditions of life calculated to bring about its physical destruction in whole or in part;
(d) Imposing measures intended to prevent births within the group;
(e) Forcibly transferring children of the group to another group.”

That’s very roughly what’s meant when the word is used in relation to the Holocaust – and that would make sense, as this definition was formulated as a result of the atrocities of World War Two. Contemporary political conversation, however, has seen “genocide”

“recast as a ‘civil war in which all sides are committing atrocities’ and, equally, ‘a nasty regional conflict in which culpability can be distributed among several parties’.”

Again, roughly speaking: he’s talking about Darfur, which is commonly called a genocide despite its failure to meet the legal definition; and he’s talking about those who refer to Israel as genocidal.

What’s particularly interesting here is the interplay between past and present: altering the definition of a word alters a past action, if that action was inextricably linked to that word (as is the case, I feel safe in claiming, with the Holocaust and genocide). But using that word in a contextually inappropriate present situation applies the older/original definition to the present: the perceived reality of the present is also altered.

That may seem contradictory: present redefining past and past redefining present, the word itself taking on two different yet simultaneous meanings. But imagine it as something more akin to the word, used in two different contexts, flattening the differences between them. It isn’t that one becomes the other, but that both are shifted toward a mean, and neither remains what it objectively is/was.

Or, to bring in the requisite line of Orwell:

“He who controls the past controls the future. He who controls the present controls the past.”

See Also: “Notes on Meaning and Language”

I’m going to try to put together some more descriptive/cohesive thoughts on conservatism, writing, and reporting over the weekend, but for now I’ll point you toward William Beutler’s take on it from way back in May, or at least this paragraph, which is the point I particularly want to channel:

“The reaction is usually to set up an alternative forum which is defined as being explicitly conservative. The problem is that these alternative organizations often operate inside a bubble which their “liberal” counterparts do not. This can be the case beyond journalism as well. On the web we can see this very clearly: The non-partisan but in some ways “liberal” Wikipedia has been answered by the conservative-minded, low-quality Conservapedia.”

If nothing else, it seems that C11’s shuttering has reopened discussion of conservatives and reporting — not merely about politics, but about culture.  So there’s your silver lining for the day.
***
In his C11 post-mortem, Joe Carter answers the question I spent five months pondering: What’s up with that name?  And why is Google unable to help me figure it out?

“In case you hadn’t heard, LibertyWire was our original name. But it didn’t fit. Even if we were to be a political site, LibertyWire didn’t convey the type of site we wanted to become. As William Beutler said, the name “sounds like an Associated Press for Ron Paul voters.” So we searched for a new name. And searched. And searched.

Choosing a domain name is a tough task; choosing a domain name that suits a crew of hardheaded and opinionated writers is nearly impossible. The suggestions ranged from the horrible (Voxtale) to the bland (MainStreetScene) to the what-were-they-thinking (The Confabulum(!)). I had been kicking around the idea that we should be focused on 11 key areas of culture (which became our 11 categories), so I suggested “Culture11.” We didn’t hate it, which was consensus enough. It was short, easy to remember, and – most importantly – the URL was available. Culture11 we became.”

Updike

January 27, 2009

I’ve always been more a partisan of Philip Roth than of John Updike; my experience with Updike’s work is limited.  I suppose Roth’s post-war Jews grabbed my attention more than Updike’s “Protestant small-town middle-class,” and if I needed the latter, I always had Faulkner.  But I still felt something drop in my stomach when I read the headline.

The piece of his writing that made the strongest impression on me (and I’ve only read a handful of stories, essays, and poems; never a novel) was this story, from The Atlantic‘s 2007 fiction issue.  And what I remember isn’t so much the story itself (I remember much more the act of reading it, out back in the late-summer humidity) as the single, fragmentary line I copied into my notebook:

“… her mother tongue, the language of her heart…”

I wrote it down then because “mother tongue” made me think of Yiddish, the mame loshn now departed.  But today, looking at it for the first time in over a year and with much more experience now in and out of English and other languages, I can say it explains my love-bordering-on-obsession with English — why I come back to her after every dalliance with classical Greek, even though that Hellenic glossa is the more beautiful — better than Roth’s brief apologia in The Counterlife ever could, no matter how fond I am of quoting it.

Paul Dean, reviewing Geoffrey Hill’s critical writings in December’s TNC:

“If language is fallen, yet can be God-bearing, has it been redeemed, and if so, how? Was language, too, saved on Calvary? (That is not a flippant question.)”

Though I’m hardly qualified to hazard a response to that question in its particular form, I’ll do it anyway: If language is fallen, I wouldn’t place the fall in connection with Original Sin (as Hill, apparently, does) or, more specifically, with “the serpent’s use of specious argument to win Eve over” (as Dean does). Babel, rather, seems the proper setting for its (literal) fall: the Fall involved punishment, but language was not punished until after Babel, when it was made imperfect and scattered out of a unity.

Of course, I have trouble with what I’ve just been saying, mostly because the language of “the Fall” and “fallenness” isn’t something I’m perfectly comfortable with. They are, to my ears, inextricably linked with the idea of Original Sin – and therefore, like it, not Jewish terms. I understand them, of course, and have developed an aesthetic appreciation of the concept – I have to if I intend to live within the Western literary tradition (and have to if I intend to appreciate so many of its works on any meaningful scale). But to truly believe the language, one needs (I think) a Christian sensibility.

My preference is to couch discussion of post-Edenic existence in terms of loss, not fall. Between that and a (more Jewish) belief in an inherent imperfection in man (a state caused by not being divine, or The Divine, rather than resulting from a Fall), there’s enough common ground that I can read (for example, since his book is on my desk as I’m writing) Peter Lawler and sense that we agree on the present state of man’s fallibility and imperfection, while disagreeing on how he got there and where he’s going from there/how he’s getting out of it.

So I would say that language is less Fallen than humanly imperfect; that its fall from the peaks of Babel represents not a Fall but a brokenness — a loss, if you will, of wholeness.

And if we’re going to talk about the merits of the term “Judeo-Christian tradition,” or, more specifically, of a Judeo-Christian political tradition, it bears pointing out that the two traditions define the origins (and therefore the particular nature) of man’s imperfection differently. Such differing opinions regarding the meaning of the expulsion from Eden and the validity of Original Sin (whether we are specifically fallen) are not negligible, and any common conservative politics (as opposed to worldview or disposition), or (more aptly?) any dialogue of conservative politics in and for a shared arena, can’t be established without some sort of contingent superstructure built (precariously?) above this difference – though that structure may merely be an acknowledgement of it.

(I suppose you could argue that a similar endeavor is required for non-religious conservatives; though I wonder whether one’s background in the Christian/Jewish/other tradition wouldn’t play an important role here – possibly so much so that merely being a non-religious conservative from the Christian tradition would provide more common ground – on this single matter – than being religious but coming from the Jewish tradition.)

Have I mentioned how much I enjoy reading Alan Jacobs’ new blog at Culture11, Text Patterns?  Anyway, today he’s got a post up talking about “The Age of Correspondence.”  Though that’s not his title for whichever age it is that text messages and e-mail are ushering us into, my immediate reaction was to draw a distinction between the term “correspondence” and the average e-mail (at least in my inbox), to say nothing of the average text message.  I think the term denotes something more substantive, and not necessarily by means of a fountain pen or typewriter.

Letter-writing isn’t dead by any means, nor do I think it’s going to go that way.  But what remains will be increasingly intentional: writing a letter with the aim of engaging in correspondence rather than merely keeping in touch.  After all, we have e-mail and The Facebooks for that now, no?  My only real engagement with letter-writing (excepting those obligatory notes home from summer camp) has been very deliberate: between a friend and myself, in part because her camp-counsellor job one summer was going to severely limit her internet access, but also because we both wanted to try that type of writing as a particular form.

I’ve got to admit: I find it much more pleasurable than e-correspondence; there’s something inimitable about the feel of a pen on paper — whether it’s one of my nicer “writing” pens or the cheap Bic ballpoint I was using today to take notes in class.  Writing by hand requires a more deliberate mind and prose than typing on a computer: when each mistake and correction still leaves some trace on the page (unless you scrap it entirely), you become more cautious about the type of mistake you’re willing to make, if not mistakes altogether.  The result is a style at once more finished and with more traces of having been hewn from something — of having been written?

That, and the “small pleasures, small moments of imaginative vision,” are not limited to the archivist and the academic: the form itself has a certain character otherwise lacking (compare a vinyl LP to a CD or mp3, except that the stimulus here is visual, and probably more real), and I, at least, find great pleasure in reading handwriting that — even though it may win no awards for penmanship (mine certainly wouldn’t) — has character to it.