Posts Tagged ‘writing’

One of the projects I need to finish some day is a translation of the Old English poem “The Seafarer.” It started as a directed studies program when I was an undergrad, and I’ve puttered away at it now and again ever since.

Why, I’m not sure. I have no sympathy for its Christian moral. But the descriptions of sailing have a beauty of their own in the original, and I admire the cleverness with which the vivid description gives way to a moral. As in many of the most moving Old English poems, the description of an outcast’s life in “The Seafarer” has a vividness that comes through even when you have limited understanding of the language.

By contrast, I’m exasperated by the stolid translations others have done, particularly Ezra Pound’s ham-fisted one, which reads like exactly what it is – the work of someone completely ignorant of Old English cribbing from a dictionary and guessing at the grammatical structure. At best, most existing translations seem too literal, hiding some of the complex associations in the poem and giving rise to false issues (such as whether the poem is two or more fragments clumsily welded together) that disappear when you consider the original language.

Translation, of course, is by definition an exercise in the impossible. No matter how hard the translator tries, the best they can ever manage is to recreate an approximation of the original as they conceive it.

Still, someone translating Old English has some advantages that other translators don’t. As with many source languages, Old English offers false cognates; “dream,” for example, means something like “joy” rather than the modern “dream.” It also contains what I think of as half-cognates, or words whose meanings overlap with a modern word but aren’t completely synonymous: for instance, “graedig” means “eager” rather than “greedy,” while “lustig” means “longingly” rather than “lustingly.”

However, Old English does have many words that have exact equivalents in modern English. Sometimes, these words may be mildly old-fashioned, but often that works well, since Old English poetry does appear to have had some vocabulary that wasn’t used in everyday speech. When no equivalent exists, modern English, with its much broader vocabulary, can often provide several alternatives that don’t seem too jarringly out of place.

Often, translators can even offer a reasonable facsimile of the Old English poetic line. This line usually consists of four accented syllables, of which the first, third, and sometimes the second alliterate. A translator can almost always keep to the four accented syllables per line, and, over three-quarters of the time, to the alliteration pattern as well. When the alliteration pattern can’t be sustained, an alternative such as having the second and fourth accented syllables alliterate gives an acceptable approximation. In general, this meter is far easier to keep up than, say, classical Greek hexameters.
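The pattern is regular enough to be checked mechanically. Here is a minimal Python sketch, under two assumptions: that the four stressed syllables of a line have already been identified by hand, and that, as in Old English scansion, any vowel alliterates with any other vowel. The function names and the sample syllables are my own inventions for illustration, not drawn from any published scansion tool.

```python
# Toy checker for the classic Old English alliterative pattern:
# the first and third stressed syllables must alliterate
# (the second may join in; the fourth traditionally does not).

def onset(syllable: str) -> str:
    """Return the alliterating sound of a stressed syllable.

    In Old English scansion any vowel alliterates with any other
    vowel, so all vowel-initial syllables are grouped together.
    """
    first = syllable[0].lower()
    return "VOWEL" if first in "aeiou" else first


def alliterates(stressed: list[str]) -> bool:
    """True if a line of four stressed syllables follows the
    classic pattern, i.e. the first and third stresses alliterate."""
    if len(stressed) != 4:
        return False
    return onset(stressed[0]) == onset(stressed[2])


# Hypothetical stressed syllables, invented for illustration:
print(alliterates(["dark", "deep", "dread", "waves"]))  # True: "d" on 1st and 3rd
```

A translator’s use of this would be purely diagnostic: run a draft line through it and see whether the key stresses still carry the alliteration after the English word choices are made.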

Even so, a completely satisfying translation takes constant effort. Sometimes, despite modern English’s larger vocabulary, no word exists that fits all the connotations of the Old English original while fitting into the meter.

An especially troubling example is “dryhten,” a synonym for “lord” that implies a leader of warriors. “Lord of hosts” would carry the same sense, especially since the word is used at one point to contrast an earthly lord with the Christian god, but adds an extra syllable to the line, while a coining like “host-lord” looks jarring.

In addition, to make life simple, I want a word that starts with “d,” so that I can easily translate a couple of key passages. Yet nothing really fits. “Director” is too modern-sounding, and “doyen” has the wrong connotations. “Dominus” doesn’t suggest a war-leader. “Dux” wouldn’t be bad, except that it has specific historical connections with the last days of the Roman Empire, and wasn’t used by the Old English so far as I know. Nor could I use “dux”’s modern equivalent “duke,” because, like so many other choices, it lacks the martial implications.

In desperation, I’ve even considered “ordainer” long and hard. Since it contains an accented syllable starting with “d,” it fits the meter, but, unfortunately, is utterly unfitting for an earthly lord. Consequently, I’m starting to look further afield, but, once I get away from the original alliteration, I open myself up to constant problems, because a dodge that works in one place usually doesn’t work in another place where the same word occurs.

And so the translation goes, word after word, trying to balance meaning, sense, and poetry and usually failing to meet at least one of these goals. In retrospect, it’s no wonder that I keep putting the translation away in despair. I often think that I’m trying to do the impossible, especially when I’m driven to considering what I can leave out when I would prefer not to omit anything.

Still, the effort lurches forward. I don’t know that I’ll ever manage the fully annotated version of the poem that I once hoped for, but I hope that before much longer, I will at least have a complete modern English version of the poem that does at least some justice to the complexities of the original.

In 1984, I fell into conversation with David Brin while poring over the books in a science fiction convention’s dealers room. He suggested one book, reminding me that it had won the Nebula Award, and I said, “Oh? Was that back when the Nebula meant something?”

Abruptly, I remembered that Brin had won the Nebula Award a few months earlier for Startide Rising. I made some strangling noises of embarrassment and stammered out an apology, and he was gracious enough to say, “That’s OK. I used to feel the same way.”

I still flinch at the memory, but not the sentiment. The truth is, with all respect to Brin and many other deserving winners, I’ve never cared for literary awards of any kind, even though once or twice I’ve served on awards committees myself. The recent news that Tolkien’s prose was dismissed by the Nobel Committee in 1961 as having “not in any way measured up to storytelling of the highest quality” only reinforces my dislike.

For one thing, technical merit is only one consideration in a literary prize. If nothing else, few awards have any provision for nominating nothing in a given year, and those that do are under heavy pressure from publishers and booksellers to avoid using it.

Moreover, while an award can sometimes be made entirely on technical merit, especially in its early days, or when its jury is hidden, too often nationalism, friendship, and professional interest interfere. For instance, the Nobel Committee has been under pressure for years to see that non-European writers are better represented among the winners. At times, deserving writers have been passed over as too old.

In such circumstances, the criterion of excellence threatens to become compromised. Yet this simple fact can never be admitted. Instead the pretense that the award is completely for excellence is kept up, and the selection process becomes an exercise in hypocrisy. The most that a conscientious member of the jury can hope for is that the eventual award winner isn’t entirely unsuitable.

Fortunately (for the jury members, if not the reading public), such winners aren’t hard to find. Beyond a certain standard of quality, it is frequently impossible to claim in any meaningful way that one writer is more skilled than another, because they have such different goals artistically.

For instance, if you look at well-known 19th Century writers (whom I’m choosing because the canon is much more established for the writers of over a century ago than for living writers), you might just be able to meaningfully compare George Eliot and Thomas Hardy, because they share an interest in psychology.

But how do you compare either one to Charles Dickens? To Charlotte Bronte? Jane Austen? Mark Twain? Henry James? As soon as you ask yourself by what criteria one of these is considered a better writer than another, the whole exercise becomes absurd. The best you can do is point out what one of these writers tries to do that another doesn’t. But these differences don’t mean that one is better than another, any more than differences in physiology prove a cat a superior animal to a dog.

And when you’re dealing with modern writers, the task is even more difficult. With most modern writers, no consensus has yet formed about what they are good at. Instead, jury members are thrown back on their own powers of observation, or – more likely – upon the received wisdom of their generation’s academics and critics.

That, I suspect, was what happened to Tolkien in 1961 (to say nothing of Graham Greene, Karen Blixen, and Lawrence Durrell, all of whom are now recognized as major literary figures). If you take Tolkien on his own terms – as a writer whose important influences are a mixture of Old English and Medieval traditions, popular ballads, and oral storytelling, and as a writer of epics rather than novels – then he is an excellent writer of his sort.

However, in 1961 (and still, to an extent today), none of Tolkien’s influences or intentions were recognized by academics and critics as worthwhile. Tolkien’s tradition is plot-based, and its social observations are metaphorical where they exist at all. He has little psychological perception, even less social realism, and, generally speaking, none of the virtues prized in a modern serious novel.

Under these circumstances, how could the Nobel Committee possibly appreciate Tolkien? Its members would have been like people who are color blind trying to appreciate a painting in which subtle changes of hue are a major element. With the best will in the world, they couldn’t appreciate Tolkien – and, backed by the official opinions of their times, they probably didn’t see any reason to try very hard, either.

Literary awards may be popular with publishers and booksellers, because they can mean increased sales. Yet I can’t help noticing that very few writers of any stature take them very seriously. In fact, in my experience, the more acclaimed a writer is, the less seriously they take any award. They know that the only real competition they are up against is themselves.

The turn of the year always leaves me relieved for two reasons. First, every time I go out, I don’t have to watch women exhausting themselves to feed dozens, and refusing all help. Second, most of the year end summaries are over and done with.

I admit that I’ve done my share of year end summaries. As a journalist, I find these stories are almost unavoidable, and I imagine I will do more of them in the future. If nothing else, they are relatively easy – if tedious – to write, which can be a relief at a time of year when I’m distracted by various social demands and low on ideas.

But although I try to find a theme or two for my summaries, so that they are not just a random collection of facts, I’m still uncomfortable about writing them. For me, the only thing worse than writing these stories is reading them.

What I object to is that these stories are the first effort at official history. I have no problem with reporting or reading individual stories as they happen. But selecting which stories are important – that I have a problem with.

I realize, of course, that proposing stories or taking suggestions from editors is itself a form of deciding what is memorable. Sometimes, when I think of the stories that I haven’t told, either because I was too busy or because an editor thinks them too dull or too hot to handle, I feel like Midas’ barber. You know – the one who, seeing his master cursed with donkey ears, dug a hole to whisper his secret to, because he had to tell someone.

However, I can more or less live with this daily selectivity as a necessity. After all, there just isn’t time to cover every story.

But year end summaries go one step further: starting from the already selected daily news, they make a second round of selections. That makes them two removes from reality, an abstraction that is even more remote from what’s happening than an everyday story.

Even worse, year end summaries are the beginnings of official history. And, as anyone who has read George Orwell’s “Looking Back at the Spanish Civil War” is aware, official history slips easily into distortions and outright lies.

For example, when Expo 86 was held in Vancouver, there were mass evictions of the poor, and pointed questions asked about government spending priorities. But, after it was all over, the year end summaries proclaimed it a success that silenced all opposition. Never mind that those of us who objected continued for years to boycott many people and companies who took part: drawing on those year end summaries, people now talk about the whole affair as a golden moment, unsullied by any complaints.

The same transformation has occurred already with the 2010 Winter Olympics. I remember several surveys that showed that, even after the fact, fifty percent of those living in the greater Vancouver area continued to object to the spectacle. Yet, now, less than two years later, mainstream journalists continue to tell how, despite some initial misgivings, residents were won over by the excitement and the Olympic spirit. Inconvenient facts like riots and protests are dealt with simply by not mentioning them.

Fortunately, writing about free and open source software for a variety of websites and magazines, I face almost no pressure to write either feel-good or controversial stories. But, being aware of how easily year end summaries can become part of a corrupt process, I shy away from writing them. I don’t falsify my reactions, but I worry that, by writing such stories at all, I could be lending myself to a process of which I disapprove.

Just as I rarely read such stories, I can barely bring myself to write them. That’s why I’m always glad to see their season depart. Come January 1, for another eleven months, I don’t have to face the dilemma that they represent. Like the spectacle of women being expected (and expecting themselves) to prepare a huge meal while everyone else socializes, for me year end summaries seriously diminish what would otherwise be an enjoyable occasion.

A friend of mine insists that everything happens for a reason. If a relative dies, or he receives a disconnection notice from the power company, or he fails to make his rent on time, he consoles himself by repeating this idea over and over. But for all my familiarity with his refrain, it only occurred to me today that he might be right – although not in any way that he would suspect.

I’ve never told him, but the implied resignation in his philosophy always leaves me faintly irritated. If you’ve ever been with someone whose hearing is much better than yours, you’ll understand my irritation, because if there’s a pattern to the events around me, it appears to be beyond my perception. I’m tempted to dismiss the idea out of hand, then I wonder: what if he sees something I can’t?

Even more seriously, if you believe that everything happens for a reason, you quickly confront the idea of inevitability, or of a deity who controls events. In turn, either of these ideas quickly comes up against the so-called problem of pain – in other words, what’s the role of suffering within that all-explanatory reason? And if hurt or loss is either inescapable or ordained by a deity, then the universe is hostile, and any ruling god is, as Mark Twain suggested, “a malign thug,” worthy neither of worship nor of the resignation implied by the belief that everything happens for a reason.

Such possibilities seem a needless complication compared to the idea that there is no innate reason, and that much of what happens is simply the result of chance and beyond our control. My friend creates a sense of meaning by believing in hidden causation, just as I create meaning by opposing and trying to limit the painful when it randomly occurs.

So far, so existential. But I suddenly realized today that I hadn’t taken my outlook far enough. If no innate reason exists, there is no reason why you can’t settle on your own purpose for everything happening, and take comfort from that, even if that purpose is only finding satisfaction in the fact that what happens is in accord with your perception of how existence works. That may be a distant and unsatisfying outlook, but it would make my friend’s mantra truer than I imagined possible. But what purpose would reconcile me to stoicism in the face of disaster?

Immediately, I thought of a story by Jorge Luis Borges, which mentions in passing that the entire purpose of a greedy Sixteenth Century merchant’s existence was to give Shakespeare the model for Shylock in The Merchant of Venice. Having recently spent a couple of days with fantasist Fritz Leiber’s immediate descendants, that in turn reminded me of Leiber’s comment that seeing everything he experienced as potential story material was part of his adaptation to life.

Was there any reason, I wondered, why I couldn’t assume the same perspective? If what happens is story material, then that would be a reason that I could accept for everything that happens. Some events might be horrific, but, softened by being put in a story, maybe even they might instruct, amuse, or distract both me and any audience. The idea requires no belief in destiny or a deity, and morally involves nothing worse than salvaging something from whatever happens.

The only problem is that, so far, I’m not much of a fiction writer. While I’ve sold close to twelve hundred pieces of non-fiction, I’ve only published two pieces of fiction, both so short and so slight that calling them minor is being kind. The potential is there, but time is starting to run out.

Still, there is no reason why I can’t try out the perspective. Cultivating it might even encourage me to use the material I’ve collected, to settle down and make more of an effort at fiction.

That may be expecting too much. But, effective immediately, I’m resolved to see how the perspective works out. Since I’ve been mentally circling the idea since I first thought of it this morning, it seems worth exploring. I’ve rarely had an idea that intrigued me so much.

Who knows? Perhaps in a year or two, when my friend tells me everything happens for a reason, I will nod and tell him that he doesn’t know how true his belief actually is.

(Note: The following is a handout I used to give in composition courses to first year university students. You are welcome to reformat and distribute it under the terms of the Creative Commons Attribution-ShareAlike License. Basically, that means you can use it in any way you like, so long as you give me credit and let others use it under the same conditions.)

A. “Chunk” (Paragraphs arranged by subject):

In the co-op, Judy is the practical one. Of the four people who share the house, she is the only one who is not visibly eccentric. She keeps regular hours, and sees that the bills are paid. If food is bought, or laundry is done, either she has done the work or bullied someone else into doing it. Periodically, she musters everyone else for a massive cleaning of the house. It is only her profession–writer–and her New Age interests that suggest how unusual she is.

By contrast, Saul, the household’s other original resident, is so eccentric that his friends think that he looks abnormal in ordinary clothes. His usual wear is either a faded red caftan or Scottish formal wear, complete with a sporran and skean dhu. Because of his light-sensitive eyes, he is usually awake while other people are asleep. He rarely leaves the house, and the ordinary business of living holds little interest for him. He never considers bills, and, although he will eat if food is available, will live for several days at a time on nothing more than nerves and coffee. Only the area around his computer is clean; the rest of his living space has mounds that archaeologists would love to excavate. Even his hobbies are unusual: sword meditation and writing poetry in obscure languages like Gaelic and Iroquois. Unlike Judy, Saul seems incapable of functioning normally; he does not meet visitors in their world so much as invite them into his.

B. “Slice” (Paragraph arranged by Points of Comparison):

Although both Judy and Saul are old friends, they have little else in common. Saul is visibly eccentric; Judy is so ordinary that she is no more noticeable on the street than a lamp post. She is awake and starts work when their neighbors do, and she can handle such things as bills, laundry, shopping and cleaning–matters that are mysteries to Saul. She even knows how to organize the other household members. Saul, on the other hand, can barely organize himself. Except for his work station, he is surrounded by clutter. If his routine is more organized than Judy’s, the reason is only that he organizes only himself–and then only so that he can work, which is the most important thing in his life. His life is arranged to give him as much time to work as possible, so he pays no attention to ordinary matters like food. A night person, he may go for days at a time seeing nobody, never leaving the house, and surviving on coffee with the odd bit of leftovers. Even his hobbies, sword meditation and writing poetry in obscure languages like Gaelic and Iroquois, are private. He is so different from Judy that many people are surprised to learn that they have shared a house for over twenty years.

C. Analogy:

If the difference between eccentrics and ordinary people is the difference between night and day, then Judy is twilight and Saul is midnight.

Whenever I come across a use of language that makes me cringe, I tell myself that the English language is robust and evolving, and can survive any number of hopeful monsters. I outlasted “not” being added to the end of every sentence, I keep reminding myself, and I can survive whatever other grotesque usage slouches my way. But there are limits, and mine is “not appropriate” and its near-relative “inappropriate.”

You know what I’m referring to: flat, prim statements that something done, said – or even thought – is “not appropriate.” Typically, I regret to say, it is said by someone such as a unionist, feminist, or environmentalist, with whose basic ideas I agree but whose tactics (obviously) I find reprehensible. In fact, I can think of few phrases I disdain more.

What’s my problem with “not appropriate”? For one thing, it’s a euphemism. When someone says that something is “not appropriate,” what they really mean is that they dislike or disapprove of what they are condemning. But instead of saying what they mean, they hide behind a vague phrase. So, right away, they’re being dishonest.

However, unlike many euphemisms, “not appropriate” isn’t used to be discreet or to spare someone’s feelings. Instead, it’s one of the most basic invalid arguments imaginable: an appeal to an authority – in this case, the alleged standards of the community and the unspoken rules by which we live. The implication is that the person who has done something “not appropriate” has transgressed in a way that no decent adult ever should.

I say “alleged standards” because, almost always, the transgression is not against existing community norms, but against what the speaker would like to be the community norms. My impression is that the speaker is hoping that, by assuming that these norms are already generally accepted, they can enforce their ethics as though everybody shared them.

I would find such tactics hard to tolerate in anyone, but, when standards I support are used in this way, I worry about the harm they can cause. I suspect that many people who might otherwise be persuaded to those standards will reject them simply because they resent the clumsy efforts at manipulation.

After all, when accused of being “not appropriate” or “inappropriate,” you are not supposed to stop and consider the merits of the standards being implied, or discuss what is happening. You are supposed to act on reflex, and shut up.

Those who go around condemning things as “not appropriate” are setting themselves up – almost always, completely unasked — as authorities about what is socially acceptable. The implication is that they know what is right and wrong, and that those they address do not.

Basically, they are offering themselves as the guardians of ethics and morality, demanding that others obey without any discussion. They are taking on the role of teachers and casting everyone else as dull students, playing parents to unsatisfactory children, or cops to the mob. They are usurping an authority to which they have no right – and, when they stoop to condemning even thoughts (or their interpretations of them) as “not appropriate,” they become downright creepy.

No matter how you parse the phrase, “not appropriate” is a fundamentally dishonest and authoritarian expression. The sole virtue of “not appropriate” (or “inappropriate”) is that its use signals that the speaker is so committed to intellectual fraud and authoritarianism that you can save yourself endless time and effort by walking away from them.

At one LinuxWorld Expo, Maximum Linux and Linux Magazine had booths next to each other. Near the end of the third day, as journalists lounged around the booths talking, a free software celebrity approached. “Look,” he said, singling out a magazine from a display. “There’s my issue!”

He wasn’t talking about an issue featuring an interview with him. He was talking about an issue in which he had published a small and rather minor article.

The assembled journalists looked around, as embarrassed at this flash of ego as though a puppy had relieved itself on the carpet. No one wanted to point out the obvious: that most of the journalists had dozens of articles in the displayed magazines, and that bragging about your work just wasn’t done.

I was new to writing at the time, but I quickly absorbed this lesson about the difference between professionals and amateurs. Professionals take the work seriously, but not themselves. Amateurs and semi-professionals reverse the priority, so for them writing and publication is all about ego.

What other differences are there? Considering the number of people who describe themselves as writers in their Facebook and Twitter profiles, the question is worth answering.

Here are some of the other differences I’ve observed:

  • Professionals work when they need to, amateurs when they feel like it: If you can write well enough to graduate from university, publishing isn’t that difficult, especially if your editor is willing to guide you. Amateurs, though, are likely to sit back and bask in the glow of accomplishment when they sell an article, and not write another one until the mood strikes them. Professionals may celebrate sending off an article by taking the rest of the afternoon off, but usually they’ve no sooner finished one article than they need to start thinking of the next.
  • Professionals are pragmatists, amateurs perfectionists: Professionals take pride in their work, but they also have a perspective on their work. They know they aren’t writing deathless prose most of the time, but something that will be forgotten in a few weeks. By contrast, amateurs will labor far past the point where improvements are worth the time and effort. Professionals don’t have time for endless tinkering.
  • Professionals judge their work by results, amateurs by efforts: Like high school students, amateurs assume that their work should be judged by how hard they try. Professionals recognize that their work is judged by results – and that an article that is long or takes hours to write can fail as easily as a short, quick piece.
  • Professionals make deadlines or explain why: To amateurs, submitting an article on time is less important than their perfectionism. Professionals know that when they miss deadlines, they are letting their editors and other people down. I’ve heard amateurs laugh about missing deadlines, but rarely a professional. If a professional does joke about deadlines, they sound distinctly guilty.
  • Professionals accept criticism, amateurs are hostile to it: Sometimes professionals complain about editing, but when they do, they are usually either sharp-tempered because of other matters or genuinely in the right. But, having invested so much more in their work, amateurs have trouble accepting that their work could be improved. In fact, many amateurs become angry if their work receives anything except praise.
  • Professionals edit to improve the work, amateurs to make it sound more like them: To a degree, all editing is affected by the editor’s own habits of writing (in fact, I can predict fairly accurately what changes an editor is likely to ask for in a piece once I’ve worked with them a few times). The difference is that professionals try not to rewrite a piece by someone else as though they had written it. Amateurs don’t make this distinction, imagining through inexperience or ego that the way they write is the only possible way to write.

There are probably other differences, but these are the most common ones. Professionals, I suspect, will nod in agreement. Amateurs will probably either be angered by what I said, or else guiltily recognize their own faults.

Needless to say, it’s those who recognize themselves in my comments who have the best chance of making the transition to professionals. Other amateurs might also make the transition, but their progression is likely to be rockier, and include longer and more frequent detours.

If I talk or write long enough, I’ve concluded, I’m going to accidentally say something that I didn’t mean to say. I don’t mean that my words will suggest a double entendre, which nine times out of ten only causes everyone to laugh. I mean that what I say will have implications that I didn’t intend, or will be interpreted in a way that I never meant. No matter how hard I try, sooner or later I’m going to slip and embarrass myself.

The first of these situations that I can recall happened shortly before Trish and I married. We were sitting with a mixture of friends and strangers in the pub at Simon Fraser University. Naturally, the talk turned to the possibility of children. I said that one of the reasons that I wanted to work from home after I graduated was that I thought that toddlers would benefit from having a parent at home.

I got up to get another round of drinks, and, when I returned, a woman who had arrived when I was speaking was blasting me for being a sexist. With Trish’s help, I managed to convince the woman that I was not talking about female social roles, but my own.

However, for the rest of the evening, my cheeks could have served as neon lights at the thought that anyone – even a stranger – could have imagined that I was expressing views so foreign to my actual ones.

Another cringe-worthy moment happened when I was teaching first year composition to a class with a large proportion of foreign students. I got on well with the class, and I often bantered with the students.

Just before the start of a class, I heard one Asian student complaining about staying up late the previous night to finish the assignment that was due that day. “Oh, you people have it easy,” I said.

By “you people,” I meant “students.” But as an awful silence fell and students started to stare at me, I realized that what people were hearing was “Asians.”

With nightmare visions of official accusations of racism scrabbling around in my head, I quickly added, “You students don’t know when you’re well off.” To my relief, everybody immediately relaxed, and the moment passed without ever being mentioned again. But after that, I was considerably more careful about what I said in class.

In fact, for a couple of decades I was successful enough in watching what I said that I managed to forget such incidents were possible. There was the time that I remarked, “small world,” to a dwarf I kept meeting at the elevators of the SkyTrain, but I only realized what I had said afterward, and he didn’t seem to have taken my words as a joke at his expense.

Then, a few days ago, another one happened.

I had just finished a biography of Ada Lovelace, the first computer programmer and colleague of Charles Babbage. As you may know, she was the daughter of Lord Byron and Annabella Milbanke. She was raised entirely by her mother, who fled Lord Byron’s sexual abuse and mental cruelties and divorced him.

I have always had a great deal of sympathy for Milbanke. Any of Byron’s victims might be pitied, but as an infatuated innocent, Milbanke must have suffered more than anyone else would have under his treatment.

However, after reading how Milbanke continually tried to control her daughter, right to her dying day, I developed a strong distaste for her as well. Tweeting comments about the biography, I said, “I wonder if there were two sides to the divorce of Lord & Lady Byron.”

Shortly afterward, a colleague objected to the comment. My first reply was, “Not saying that Lady B. wasn’t right to divorce. Lord B. was impossible. But her treatment of Ada suggests she was a control freak.”

Another protest followed. I realized that, in attempting to express sympathy for Lovelace, I was minimizing Byron’s cruelties by suggesting that aggravating but commonplace behavior was just as bad.

I’ve done it again, I thought, and admitted that I’d been too flippant. Perhaps Milbanke’s self-righteous evangelism would have been irksome to Byron and to many other men as well, but that can’t possibly justify his rapes and sadism – and, put that way, I had to agree that I had implied something I had never intended to say.

Later, I told the colleague that they were quite right to call me on the comment, and I believe the apology was accepted.

However, the embarrassment – the utter chagrin – lingers. I suppose three or four mistakes of this kind in as many decades isn’t the worst possible record. Some political leaders make as many slips in a single ten-minute speech.

But the memories aren’t comfortable ones, all the same. Remembering them, I almost don’t want to write or speak at all. Like many writers, I’m overly fond of sarcasm and flippancy, and, if I’m not careful, being pithy sometimes matters more to me than being accurate or thinking of implications. As a result, the possibility of another episode is always there.

But not speaking would be cowardly (to say nothing of impossible for someone like me). Anyway, I can imagine situations where silence could be as damning as speaking.

As a result, I’ve decided that, while I plan to watch what I say, some misunderstandings are inevitable. In fact, my determination to avoid them just might make me nervous enough that they become more common.

However, I have decided that, the next time I find myself in such a situation, I will explain what is happening as soon as possible. An apology is embarrassing in itself – but not nearly as embarrassing as being wrong, or branded in someone else’s mind as sexist or racist because of a few poorly chosen words.

Even with an apology, I suspect that some people will say that my original words are a Freudian slip that reveals what I really think. But I can only deal with my imperfections as best I can.

Read Full Post »

Should you – or can you – appreciate works by an artist whose morals or actions you find objectionable? Today, the question returned to haunt me when a colleague rightly pointed out that a public statement I made about a writer minimized his cruelty and immorality by equating it with shortcomings well within the human norm. That wasn’t the first time the issue of art and morality had come up, and I’m sure that it won’t be the last.

If you consider yourself a person of conscience, the question has no easy answer. In many cases, evoking the cultural relativism of past times just doesn’t provide an excuse. By the standards of any time, Samuel Pepys was a sexual predator. In all likelihood, Byron was, too, although the removal of evidence by his friends allows some people to believe otherwise. Mozart was a brutal egomaniac, Dali a sadist, and Ezra Pound a Fascist sympathizer. Even so seemingly amiable an eccentric as William Blake subjected his wife to poverty and kept her subjugated to his art, insisting that she color in his prints and waking her in the middle of the night to keep him company. The truth is, artists are so far outside the social norms in general that, once you start reading their biographies, many will be found morally lacking.

At times, the exceptions stand out all the more because of their rarity. For example, William Morris was true enough to his ideals of equality that he never divorced his wife, even though he knew she was carrying on an affair with Dante Gabriel Rossetti. John Keats also appears to have been a thoroughly decent man, although cynics might question how much poverty and illness simply deprived him of opportunities to offend.

I suspect that how you answer such questions depends on your priorities. If you are only concerned with artistic achievement, then everything else that an artist does is irrelevant. What matters is the art, and the fact that Leni Riefenstahl’s films were propaganda for the Third Reich is irrelevant compared to her cinematic technique.

The trouble with this position is that, if you admire someone for one reason, you often want to admire them in other ways. Unless you are very careful, sooner or later you find yourself making excuses for their behavior, simply because you like their art.

Yet holding artists to the strictest ethics and morality is no easier. For one thing, the artists of whom you approve will make a very short list. For another, the question seems a slippery slope. Do you reject Charles Dickens because of his utter inability to portray women as human? Raymond Chandler or Brendan Behan for their alcoholism? Where do you draw the line for the minor offenders against morality? When, if ever, do you make an exception?

Just as importantly, there is something crass and insensitive about insisting that art meet other standards as well, perhaps because that is a common practice of totalitarianism. The problem is not so much that at least some arts – especially writing – can have a moral content, as the difficulty of imposing morality upon art without reducing it to the triteness of modern Catholic Holy Cards.

In theory, as George Orwell suggests, it should be possible to hold two separate beliefs — first, that someone is a skilled artist, and, second, that they were a reprehensible human being – but the practice is more difficult. It seems to involve endlessly jumping back and forth between the two extremes, and therefore is likely to satisfy no one. Instead of offering clarity, Orwell’s solution actually invites us to practice doublethink – that is, thinking two contradictory thoughts at the same time, a habit that Orwell pointed out is a handicap to clear thinking.

I suspect, however, that is exactly what the majority of us do. We get swept away by the perspective or the choreography, only to start guiltily at enjoying the efforts of someone we disapprove. At other times, we start out disapproving and find ourselves tapping our fingers to the music despite ourselves, or having a memorable phrase lodge in our minds against our sternest judgments. For most of us, the answers don’t come easily or offer much satisfaction when we face the complexity of the situations in which we try to apply them.

Read Full Post »

I spent most of Grade Six drawing maps. The result is a knowledge of geography that serves me well to this day, except for a few newer states in Central Europe and Asia. Another result is that I fully agree with Diana Wynne Jones’s The Tough Guide to Fantasyland that most fantasy maps lack any sense of geography, history, or economics.

Recently, I’ve been spending my free time refining the map that will be the background for my efforts at fiction. As I work, I’ve developed some basic rules for ensuring that, whatever else I might do wrongly, at least the geography in my fantasy world will be plausible:

  • Remember continental drift. Your land masses should look as though they would roughly fit into each other. If you have trouble coming up with realistic land masses, try sketching the outlines of clouds or stains; you’ll be surprised how realistic the result will be.
  • Rivers and streams don’t start and end just anywhere. They arise in the mountains or hills, and usually empty into a larger body of water, such as a lake or an ocean. A few may go underground instead. Almost all grow wider as they move away from their source.
  • Mountain ranges are generally the result of the collision of tectonic plates. This means that they will rarely meet at convenient right angles to each other, the way that the mountains of Mordor do in Tolkien.
  • Ecosystems follow set patterns. You don’t have a rain forest next to tundra or desert. Instead, you have prairie and scrubland in between. A half hour’s research on climate zones should be enough for you to get the idea.
  • Cities, towns, and farms don’t appear just anywhere. They are separated by however much land is needed for them to be self-sustaining, the only exception being large towns supported by a surrounding circle of smaller towns and farms. In a primarily rural culture, they will be close to a water source. As population and trade develop, habitations may be positioned to service traffic on a road, or to take advantage of a certain trade. When you place a habitation, know why it’s there, even if the reason never gets into the story.
  • Consider how people get around. If water is the main transport, you need either a lot of coastline or else large rivers that can be navigated for much of their length. If roads are used more than water, then you’ll have several grades of road, probably ranging from highways like the Roman roads to half-overgrown footpaths. All these decisions will affect how far anyone can travel in a day.
  • Forests and wilderness areas are much larger in pre-industrial cultures than they are today.
  • Most lands have a history that involves a succession of different cultures passing through them. Your names should reflect that, suggesting borrowings or corruptions from several different languages mingling together. A particular region might have a concentration of names from one language, and you should know why.
  • Be prepared for your landscape to evolve as you write. However, if changes are necessary, try to make them follow the rest of the guidelines given here.
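If you sketch maps on a computer rather than on paper, the cloud-and-stain trick in the first rule has a rough programmatic cousin: midpoint displacement, which turns a straight line into a plausibly ragged coastline by repeatedly splitting each segment and jittering the midpoint. The sketch below is only an illustration of the idea, not a tool I actually use; the function name and parameters are my own invention.

```python
import random

def displace(points, roughness=0.5, depth=6, seed=42):
    """Midpoint displacement: split every segment and jitter the new
    midpoint, so a straight line becomes a ragged, coastline-like edge."""
    rng = random.Random(seed)
    scale = roughness
    for _ in range(depth):
        new_points = []
        for (x1, y1), (x2, y2) in zip(points, points[1:]):
            mx, my = (x1 + x2) / 2, (y1 + y2) / 2
            # Jitter proportional to segment length, so big features
            # get big bumps and small ones get small bumps
            length = ((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
            my += rng.uniform(-scale, scale) * length
            new_points.extend([(x1, y1), (mx, my)])
        new_points.append(points[-1])
        points = new_points
        scale *= 0.5  # halve the jitter each pass: detail shrinks with scale
    return points

coast = displace([(0.0, 0.0), (100.0, 0.0)])
print(len(coast))  # one segment doubled six times gives 65 points
```

Because only midpoints are jittered, the endpoints stay fixed, which makes it easy to stitch several displaced edges into a closed landmass.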

Put this way, many of these points may sound obvious. But open the frontispiece of your typical fantasy paperback, and the chances are that the map will suffer from one or more of the faults I mention. Some have nearly all of them.

But take the time to create a believable map, and you’ll know more about your story’s background. You might even find story details or plot elements that wouldn’t have occurred to you if your map didn’t have at least a toehold in reality.

Read Full Post »

« Newer Posts - Older Posts »