

(Note: I wrote the following article six years ago for the Techwr-l site. For a long while, it was my most-requested article. Recently, I noticed that a site makeover had taken the article down. By now, it may be posted again, but I thought I’d reprint it here, although it’s very long by the standards of blog postings. Though addressed to technical writers, almost everything in it applies to writers and writing in general.)

Most technical writers are confused about grammar. On any day on the TECHWR-L list, basic questions are asked: “Is ‘User’s Guide’ or ‘Users’ Guide’ correct? Maybe ‘Users Guide?'” “Should ‘web’ be capitalized when used to refer to the World Wide Web?” “Which is right: ‘A FAQ’ or ‘an FAQ?'” Many of these questions become the major thread on the list for a day or two, generating far more debate than they’re worth.

The confusion isn’t so much about the grammatical points themselves. It’s about the nature of grammar in general. Apparently, many tech writers do not see grammar as a set of conventions to help them write clearly. Instead, to judge by the wording of the questions and responses, they see grammar as a set of unchanging rules that can provide definitive answers in every situation.

Some are afraid to break the rules of grammar and risk being denounced as incompetent. A handful, smugly sure that they know the rules, use their rote learning of the rules as an ad hominem attack, nitpicking at typos and small errors to discredit writers without disproving their viewpoints. Most sit in the middle, haunted by the ghosts of childhood grammar classes until they can hardly tell on their own authority whether they are writing well or not. But underlying all these reactions is an attitude that rules are rules, and cannot be broken.

This attitude is usually known as a prescriptive approach to grammar. It assumes that grammar exists mainly to tell us how to speak or write properly–not well. It is an attitude that tech writers share with almost everybody in the English-speaking world. It is a form of conditioning that begins in kindergarten and continues through high school and even into college and university. It undermines nearly everyone’s confidence in their ability to communicate, especially on paper. Yet it is especially harmful to professional writers for at least three reasons:

  • It grotesquely exaggerates the importance of grammar. Although competence in grammar is sometimes proof of other writing skills, it stresses presentation over content. Even worse, it stresses correctness over precision, conciseness, or clarity.
  • It binds writers to viewpoints that are not only arbitrary and obsolete, but, in some cases, far from their own opinions.
  • It undermines writers’ confidence and their ability to make decisions about how to communicate effectively.

Why are we burdened by this attitude? How does it affect us? The easiest way to answer these questions is to look at the origins of the prescriptive attitude and the alternatives to it. Only then can we begin to grasp how we can live without it.

The Rise of Prescriptive Grammar

Prescriptive grammars are the products of the Enlightenment. Earlier grammars such as William Bullokar’s in 1586 and Ben Jonson’s posthumous one were also prescriptive, but intended for language students. The first prescriptive pronouncements for native English speakers date to the Seventeenth and Eighteenth Centuries.

This was the start of the great era of describing and recording. In every subject from biology to Egyptology, educated men struggled to write accounts so thorough that no others would ever be needed. It was also a time that looked both backwards and forwards to Classical Rome. It looked back in the sense that ancient Rome was seen as the height of civilization. It looked forward in two senses: England perceived itself as a second Rome, and Latin was the language of international science.

The first prescriptive comments were very much in the spirit of their times. Their compilers hoped for definitive grammars that would fix the form of English once and for all, and provide a source for settling grammatical disputes. Since Latin was the language of the world’s most sophisticated civilization, the closer this fixed form of English was to Latin, the more sophisticated English would be. Moreover, given the belief that culture had been degenerating since the fall of Rome, most grammarians automatically equated any change in the language with decay and degeneration.

John Dryden, the poet and playwright, was among the first to make prescriptive pronouncements. His main contribution to prescriptive grammar was to suggest that prepositions should never end a sentence. The reasons for this prescription are that Latin sentences rarely end in prepositions, and that the word “preposition” clearly indicates that this part of speech should go before (“pre”) the noun it is associated with. Dryden criticized Shakespeare and Jonson for not following this rule, and scrupulously edited his own writings until they conformed to it.

Dryden also hoped to fix the form of English, which was still rapidly changing. Together with the diarist John Evelyn and other authors, Dryden called for an English version of l’Académie française–the body of scholars and writers that oversees the creation of the official French dictionary and acts as the arbiter of correct French. Dryden’s plea for an English Academy was echoed later by Daniel Defoe and Jonathan Swift. The idea was being seriously considered by Queen Anne when she died in 1714. But with the accession of the German-speaking George I, the question shifted from the monarch helping to purify English to teaching the monarch English, and the idea was dropped.

Deprived of royal assistance, English men of letters did their best to improve the language on their own. In the mid-Eighteenth Century, a number of grammars were published, as well as Samuel Johnson’s famous dictionary, whose Preface spells out its prescriptive purposes with both succinctness and dry wit. All these works were heavily prescriptive, although Joseph Priestley did include some comments about the importance of common usage in deciding what was proper.

The most influential of these grammars was Robert Lowth’s “Short Introduction to English Grammar,” published in 1761. Criticizing almost every major English writer from Shakespeare to Pope, Lowth made most of the prescriptive statements that people still follow today. His prescriptions include:

  • Two negatives make a positive, except in constructions such as “No, not even if you paid me.”
  • Never split an infinitive.
  • Never end a sentence in a preposition.
  • “Ain’t” is unacceptable in formal English.

These ideas have several sources: An attempt to model English on Latin (and therefore to arrive at a universal grammar that underlay all languages), a desire to be scientific, and Lowth’s personal preferences. For instance, Lowth would not accept split infinitives because Latin infinitives are one word and cannot be split. Similarly, two negatives make a positive because they do so in mathematics. The ban on “ain’t,” though, seems entirely Lowth’s idiosyncrasy. None of these prescriptions, however, took any notice of how people actually spoke or wrote. For example, despite Lowth, “ain’t” continued to be used by Queen Victoria and the upper classes until the start of the Twentieth Century.

Although Lowth later became Bishop of London, his ideas on proper usage would probably have remained obscure if they had not been borrowed for classroom use. In 1781, Charles Coote wrote a textbook grammar based on Lowth’s Grammar, adding his own preference for “he” and “his” as the indefinite personal pronoun (as in “everyone is entitled to his opinion”). A few years later, Lindley Murray borrowed from Lowth to write a series of textbooks for a girls’ school. Murray’s textbooks proved so popular that they quickly became the standard English texts in American schools throughout much of the Nineteenth Century. With modifications, Murray’s “English Grammar” and Lowth’s Grammar have been the basis for textbook grammars ever since. With their unyielding rules, these textbooks have given at least eight generations of English-speakers the prescriptive attitude that inhibits people today.

The Biases of Prescription

In their language and their purposes, prescriptive grammars make a strong claim to objectivity. A. Lane was typical of the first grammarians when he wrote in 1700 that the purpose of grammar was to teach people to speak and write “according to the unalterable Rules of right Reason.” Similarly, in writing his dictionary, Samuel Johnson humorously referred to himself as a “slave of Science.” These claims are still echoed when modern defenders of the prescriptive attitude assume that the rules of grammar are value-free. Yet a closer look at prescriptive grammar reveals distinct biases. Some of these were openly admitted by the first grammarians. The problem is that some of these biases are no longer current. Others are demonstrably false.

The early grammarians’ claims to be scientific or precise concealed a strong personal bias. This bias was by no means a conspiracy–it was simply natural self-expression. Grammarians were highly educated, or the subject would hardly come to their attention at all. Since education was a privilege, they were either well-to-do or extremely talented. Since the higher levels of education were barred to women, they were male. And, as might be expected from the subject, they were all intensely literate.

While this background made the first grammarians supremely qualified for literary and scholarly work, it also carried a certain arrogance. The first grammarians showed scant interest in how English was used outside their own class and social circles. All of them assumed an unwarranted authority in their subject, appointing themselves as the arbiters of the language without any justification except their willingness to serve. Robert Lowth, for example, did not hesitate to include his own idiosyncrasies as grammatical rules. In much the same way, Samuel Johnson saw it as his “duty” to purify English usage through his dictionary. This same attitude continues in the prescriptive attitude today.

Moreover, the first grammars were a direct reflection of their authors’ biases and education. The problem is, the education of the Restoration and Enlightenment included assumptions that we would question today. For example:

  • Rome is the model for all things: In fact, Latin is a poor model for English. Although both languages are Indo-European, the relation is indirect. Even the heavy influence of French, a Latin-derived language, via the Norman Conquest, does not make English’s structure much closer to Latin. At its core, English is a Germanic language, and if the first grammarians had used Dutch, Swedish, or German as a model, they would have had no precedent for objecting to split infinitives or double negatives. However, the Germanic origins of English were poorly understood in Britain during the Seventeenth and Eighteenth Centuries. At any rate, the first grammarians would probably have considered Germanic models too crude to replace the polished perfection of Latin.
  • Writing is the basis for English grammar: Like other grammarians, Samuel Johnson assumed that the principles of grammar should be taken from the written form of the language. Since Johnson was a writer himself, this assumption is understandable. However, modern linguistics regards written usage as simply one of many types of English, none of which is more valid in the abstract than any of the others. If anything, the spoken language is usually given greater priority today, partly because it tends to be the source of innovation, and partly because it reveals how people use the language when they are not trying to write correctly or formally.
  • Change is degeneration: When the narrator of “Gulliver’s Travels” visits Glubbdubdrib, he is shown a vision of the Senate in Ancient Rome, then of the modern English Parliament. The Senators look like demigods and heroes, the Members of Parliament like scoundrels and ruffians. In writing this passage, Jonathan Swift reflects the widespread belief of his time that the world was continually getting worse. This fallacy is the exact opposite of the modern one of equating all change with progress.

Applied to the English language, Swift’s view has little basis in fact. Admittedly, English has simplified itself over the centuries by dropping most noun declensions and verb conjugations, but that has not made it less useful for communication. Nor, despite the delicate shudder of prescriptive grammarians, does the shift in meaning of “cute” from “clever” to “attractive” in the early Twentieth Century or of “gay” from “happy” to “homosexual” in mid-Century weaken a language with as many synonyms as English. When clumsy or unclear constructions do arise (such as the use of “not” at the end of a sentence), their impracticality generally ensures that they are brief fads.

At any rate, what is degenerate and what is progressive is often a matter of opinion. To J.R.R. Tolkien, the Anglo-Saxon scholar and author of The Lord of the Rings, the hundreds of words added by Shakespeare and his contemporaries are a corruption that ruined English forever. Yet to most scholars, these coinages are an expression of a fertile inventiveness and part of the greatest literary era ever known.

  • London English is standard English: Like most English writers, the grammarians and their publishers were centered in London. The language of the upper classes in the Home Counties had already become Standard English by the Fifteenth Century, which is why many people have heard of Chaucer and few people have heard of (much less read) “Sir Gawain and the Green Knight,” a brilliant poem by one of Chaucer’s contemporaries, written in an obscure North Country dialect. That is also why Chaucer and Shakespeare poke fun at other dialects–they already had the idea that some forms of English were better than others. In basing their work on the English spoken around London, the grammarians were simply working in a long-established tradition.

Far from seeing their biases as contradicting their claims to scientific objectivity, the grammarians openly proclaimed their goal of saving English from itself. In fact, by the standards of the time, proclaiming their educational and cultural assumptions was a means of asserting their ability to be objective on the matter.

Of all the early grammarians, the one most keenly aware that prescriptive grammars were biased was Noah Webster, the writer of the first American dictionary. Working on his dictionary between 1801 and 1828, Webster was not content simply to record how words were used. Instead, he was also concerned with producing a distinctly American language. To this end, Webster not only included words such as “skunk” and “squash” in his dictionary, but also introduced American spellings, such as “center” instead of “centre.” In addition, he encouraged the spread of a uniquely American pronunciation by consistently placing the stress on the first syllable of the word. In other words, Webster attempted to deliberately manipulate the use of English in the United States for patriotic reasons. Whatever anyone thinks of those reasons, Webster’s efforts are one of the best proofs that prescriptive grammars are not as value-free as many people imagine.

The same is true today in the debate over whether “they” can be used as the indefinite pronoun instead of “he/his” (“Everyone is entitled to their opinion”). On the one hand, traditionalists who insist on “he/his” are perpetuating the male bias of the first grammarians. On the other hand, reformers who favor “they” are trying to remake the language according to their own world view. It is not a question of objectivity on either side. It is simply a question of which world view will prevail.

The Problem of Change

But the major problem with the prescriptive attitude is that it resists the fact that languages are continually changing. If a community newspaper constantly runs editorials urging people to drink less, the amount of concern suggests a local drinking problem. In the same way, the constantly expressed wish to set standards for the language reflects the massive changes in English at the time the first grammarians worked. Although the rate of change was probably slower than in the Fifteenth and Sixteenth Centuries, English was still changing much faster between 1650 and 1800 than it does today.

Some of the changes that occurred or were completed in this period include:

  • The disappearance of dozens of Old English words like “bairn,” “kirk,” and “gang” (to go). Many of these words survived in Northern and Scottish dialects for another century, but became non-standard in written English.
  • The addition of dozens of new words. Some were deliberately coined by scientists, such as “atom.” Some were borrowed from the regions that England was conquering, such as “moccasin” or “thug.”
  • The replacement of “thou” and “ye” by “you” in the second person. The older forms survived only in poetry.
  • The standard plural became “s” or “es.” Only a few exceptions such as “oxen” survived.
  • The loss of all personal endings except the third person singular in most verbs (“I read,” “he reads”). Some of the older forms ending in “th” survived longer in poetry.
  • The loss of inflection in most adjectives.
  • The regularization of past tenses to “ed” or, occasionally, “t” (“dreamed” or “dreamt”).

Many of these changes are easy to overlook today because popular editions of texts from this period routinely modernize the spelling. However, the sheer number of changes makes clear that the early grammarians were fighting a rearguard action. If they have helped to slow the rate of change over the last two centuries, they have sometimes also accelerated it; the loss of many Northern words, for example, is probably partly due to the standardization on Home Counties English. Yet, despite these efforts, English continues to change as the need arises. Many of these changes come from the least educated parts of society–those least likely to be influenced by the prescriptive attitude.

Today, all prescriptive grammarians can do is resist changes as long as possible before accepting them. This constant retreat means that most prescriptive grammars are usually a couple of decades behind the way that the language is actually used in speech and contemporary publications.

The Descriptive Alternative

While prescriptive grammars were finding their way into the schools, an alternative approach to the study of language was being developed by linguists. Imitating the naturalists of the Eighteenth Century, linguists began to observe the pronunciation, vocabulary, grammars, and variations of languages, and to catalog them in ways that suggested how they related to each other. In 1786, Sir William Jones established that most of the languages of India and Europe were related to each other. By 1848, Jacob Grimm, one of the Brothers Grimm of fairy-tale fame, had detailed in his History of the German Language how English, Dutch, German, and the Scandinavian languages had descended, like Gothic, from a common Germanic ancestor. Content to observe and speculate, the early linguists developed what is now called the descriptive approach to grammar.

The descriptive approach avoids most of the distractions of prescriptive grammars. Today, most linguists would probably accept the following statements:

  • Change is a given. In fact, a working definition for linguistics is the study of how languages change. Linguists can offer a snapshot of how a language is used, but that snapshot is valid for only a particular place and time.
  • No value judgement should be placed on changes to a language. They happen, regardless of whether anyone approves of them or not.
  • The fact that one language is descended from a second language does not make the first language inferior. Nor does it mean that the first language must be modelled on the second.
  • Linguistics does not claim any special authority, beyond that of accurate observation or verifiable theory. Claims are open to discussion and require validation before being accepted.
  • No form of a language is given special status over another. Regardless of who speaks a variation of a language, where it is spoken, or whether it is oral or written, all variations are simply topics to be observed. For example, when Alan Ross and Nancy Mitford coined the phrases “U” and “non-U” for the differences between upper and middle class vocabularies in Britain, they did not mean to suggest that one should be preferred (although others have suggested that deliberately using a U vocabulary might be a way to be promoted).
  • Variations of a language may be more or less suitable in different contexts, but none are right or wrong. The fact that you might speak more formally in a job interview than at a night club does not mean the language of a job interview is proper English, or that the language of the night club is not.
  • Proper usage is defined by whatever the users of the language generally accept as normal.
  • A language is not neutral. It reflects the concerns and values of its speakers. For example, the fact that every few years North American teenagers develop new synonyms for drinking and sex reflects teenagers’ immense preoccupation with those subjects. The fact that they also develop new synonyms for “slut” reflects their sexual morality as applied to girls. A similar viewpoint is known in linguistics as the Sapir-Whorf hypothesis.

To those conditioned by prescriptive grammars, many of these statements are unsettling. For instance, when I suggested on the TECHWR-L list that proper usage was determined by common usage, one list member went so far as to call the view “libertarian.”

Actually, these statements are simply realistic. Despite two centuries of classroom conditioning, average users of English have never been overly concerned about the pronouncements of prescriptive grammar. Instead, people continue to use English in whatever ways are most convenient. If the prescriptive usage is not widely followed in practice, it may sound odd, even to educated people. For example, to an ear attuned to the spoken language, the lack of contractions in an academic essay or business plan may sound stilted. In fact, an entire written vocabulary exists that is almost never used in speaking. The descriptive approach simply acknowledges what has always been the case. In doing so, it frees users from the contortions of prescriptive grammar, allowing them to focus on communication–where their focus should have been all along.

Professional Writers and Grammar

Writing well, as George Orwell observes in “Politics and the English Language,” “has nothing to do with correct grammar and syntax.” If it did, then two centuries of prescriptive grammar in the classroom should have resulted in higher standards of writing. Yet there is no evidence that the language is used more skillfully in 2001 than in 1750. The truth is that prescriptive grammar and effective use of English have almost no connection. A passage can meet the highest prescriptive standards and still convey little if its thoughts are not clearly expressed or organized. Conversely, a passage can have several grammatical mistakes per line and still be comprehensible and informative. Prescriptive grammars are interesting as a first attempt to approach the subject of language, but today they are as useless to writers as they are to linguists. So long as writers have a basic competence in English, prescriptive grammar is largely a distraction that keeps them from focusing on the needs of their work.

By abandoning prescriptive grammar, writers shift the responsibility for their work to themselves. In practice, this shift means making choices that are not right or wrong in the abstract, but, rather, useful in a particular context or purpose. For example, instead of agonizing over whether “User’s Guide” or “Users Guide” is correct, writers can choose whichever suits the situation. They can even flip a coin, if they have no better means of deciding. In such cases, which choice is made is less important than using it consistently throughout the document to avoid confusion. Even then, writers may decide to be inconsistent if they have a good reason for being so.

Similarly, the decision whether to use a particular word or phrase is no longer a matter of referring to a standard dictionary or somebody else’s style guide. Instead, writers have to fall back on the basics: Will the intended audience understand the word? Is it the most exact word for the circumstances? Does it convey the image that the company wants to present? In the same way, while the need for clarity and a factual tone makes complete sentences and unemotional words useful choices in typical manuals, in a product brochure, sentence fragments and words heavy with connotation are more common. Writers may still want to summarize their decisions in a corporate style guide, but the style guide will be based on their own considerations, not the rules that someone else tells them to follow.

None of which is revolutionary–except that, under the prescriptive attitude, irrelevant purposes are often inflated until they become more important than a writer’s practical concerns.

That is not to say that taking a descriptive approach to grammar means writing in the latest slang. Nothing could date a document faster, or be more intrusive in a technical manual. Nor does it mean abandoning technical vocabularies that are known to the audience or that make explanations easier. If anything, a descriptive approach demands a much greater awareness of the language than a prescriptive one. Instead of learning the single correct version of the language, writers who take a descriptive approach need to be aware–probably through constant reading–not only of dozens of different versions, but of how each version is changing.

If necessary, writers can use descriptive grammars such as journalistic style guides to help them. On the whole, however, the descriptive approach leaves writers where they should have been all along, deciding for themselves what helps their documents to achieve their purposes. The only difference is that, under the descriptive approach, they are fully aware of their situation. If they say anything unclear or stupid, they can no longer hide behind tradition.

Prescriptive grammar is useful for teaching English as a second language, but it has little value for the practicing writer. Clinging to it may provide emotional security, but only at the expense of making writing harder than it needs to be. The culture-wide devotion to it will not be changed in a moment. But conscientious writers can at least change their own habits, and make life easier for themselves. And, from time to time, they can even laugh some worn-out, crippling concept — such as not ending a sentence in a preposition, or not splitting an infinitive — into the recycle bin where it belongs.

(With apologies to George Orwell.)


“You’re an English major? You must be planning a career in fast food.” Comments like this haunted me from the moment I declared my major in university. But hearing the sentiment recently, I realized that it was far from accurate. The truth is, people who have a way with words can make a comfortable living in all sorts of ways, so long as they don’t limit their possibilities to the obvious.

The worst mistake that anybody with an English degree – or, in fact, any Arts degree – can make is to hang about on the fringes of academia, hoping for a tenure-track position. Ever since my undergraduate days, I’ve been hearing about all the tenured positions that are going to become available as their current incumbents retire, but, between budget cuts and the increasing tendency to hire non-tenured staff or sessionals, those positions are unlikely to materialize. People who were hoping for them when I left academia over a decade ago are still waiting. Meanwhile, they endure semester-by-semester contracts, last-minute hires, and doing the same work as tenured faculty for half the money. That’s fine for a few years, but it’s no way to live in the long term.

The same is true of editing piecework. Just like academia, the publishing industry depends on having a constant pool of cheap work-for-hire editors. You may be one of the lucky exceptions, but the odds are against you, no matter how talented you are. Those who run the industry are careful not to employ you so much that they become obliged to offer you benefits.

Instead of lingering in limbo, waiting for the academic or literary job you used to dream of, English majors should explore the possibilities in business. Not only is the power of self-expression in demand there, but the competition is far less fierce than in academia – partly because of the greater need, and partly because many English majors seem to consider taking a job in business beneath them. Often, too, they make the mistake of thinking that their writing skills are all they need, and are slow to learn the subject-matter expertise required to do the work properly.

But, if you can get beyond the idea that you are dirtying your hands and are willing to learn what you don’t know, then the jobs are there. As a technical writer, you need to write clearly and organize information for conciseness and accuracy; in many ways, the job is writing stripped to the basics. As a communications and marketing manager, writing news releases or blogs, you take on the responsibility of being the voice of the company. As a product manager, you decide how to present a product or line, and you’ll find your skills in textual analysis serve you well when you come to deal with end-user license agreements and other legal documents. As an instructor, you are reprising your role as a teaching assistant in grad school, the only difference being that you are teaching software or policies and procedures rather than literature or criticism.

And these are only the most obvious career paths. Writing and teaching skills aren’t a bad foundation for going on to law school, for example. Best of all, the first thing you’ll notice when taking these positions is that, if you’ve been vying for scraps of work around academia, your yearly income will increase by fifty percent or more.

Admittedly, some of these positions aren’t on the expressway to the top. Technical writers, for instance, may rise to supervise other technical writers at a large company, but they aren’t likely to become CEOs. But these jobs can serve as entry positions, and, if you’re interested in climbing the corporate ladder, you can always expand your skill set later on. Meanwhile, you can reasonably expect a salary that puts you solidly in the upper middle class, to say nothing of responsible and often rewarding work.

Really, the only thing holding you back with an English degree is your own lack of imagination or initiative. Just because those who prefer the kind of education they should be getting at a technical college choose to belittle your liberal education is no reason for you to believe them.


And if you’re looking for me . . .
Hey, if you’re looking for me . . .
The boy’s still running

-OysterBand

“Aren’t you the guy who used to be running all the time?” a man I went to school with asked when we met recently in downtown Vancouver. I’ve been hearing variations of that question all my life, from everyone from clerks in local stores to potential employers. That’s not surprising, really, because, aside from reading and writing, few things have been a part of my life as much as running.

When I was in the first grades of school, we used to play tag in a little piece of woods on the edge of the school grounds. I soon found that, while I was only the third or fourth fastest of the boys participating, the longer the game went on, the less likely the faster ones were to catch me. The same thing happened on the soccer field, where, by the last minutes of the game, I could outrun everybody.

However, it was in grade three that I first took up running seriously. Like many decisions in my life, it was prompted by a wish to prove someone else wrong. The school’s PE teacher had assigned people from each grade to participate in a track meet with two other schools – and he hadn’t included me. Stung by this unfairness in a way that only the very young and self-righteous can be, I determined to make him regret his decision. While his chosen few practiced each morning on the makeshift track on the playing fields, I started doing laps of the school grounds, sure that he would see me and be so impressed that he would have to reconsider.

If he ever noticed, he gave no sign of reconsidering. But the habit lingered, and soon I was running three mornings a week before school with several of my friends. I was reasonably athletic, although in team sports I made my mark by enthusiasm and energy more than skill, and I took running very seriously – so seriously, in fact, that when I discovered that a couple of people had cheated on an after-school training run, and saw them a block ahead of me, I charged towards them, yelling “Cheaters!” at the top of my lungs, wild with rage and determined that they weren’t going to get credit for finishing first. I beat them, too – although I was probably helped by the fact that they didn’t care as much as I did.

In high school, I had more than my share of firsts in cross-country and distance running, largely on the strength of having discovered that all you needed to beat most rivals was to train every day. Moreover, since my only strategy was to rush to the front of each race and then hold on, I learned the importance of psychology. I won several races when woozy from ‘flu solely because everyone else expected me to be out in front.

Somehow, though, I didn’t get the victories in the provincial finals that everyone expected from me in grade 12. I was sick at the time, but I wonder now if the illness wasn’t an unconscious rebellion against the increasing seriousness I was finding in sport. By that point, I had been several years in the Vancouver Olympic Club, training under the legendary Lloyd Swindell, and not only had I acquired several rivals, but the seriousness of the training I experienced seemed to take the fun from the sport.

In university, the seriousness intensified. Not only that, but, as a team member, I was expected to help paint posters for other athletic events and show up to football and basketball games. Since I was commuting three hours a day to university, I couldn’t have given the time to these things if I had wanted to.

The competition was tougher, too. At eighteen, I didn’t have my full adult strength (such as it is), and I was competing with fully grown men from across North America. Increasingly, I realized that, judging from the success of some of my older peers at the university and at the Vancouver Olympic Club, I might qualify for the world championships or the Olympics in the five or ten thousand meters if I devoted four or five hours a day to training – but I almost certainly would not reach the finals, let alone finish with the medals. Reluctantly, I acknowledged to myself that I was a good runner, but not a great one, and somewhere near the end of my second semester, I ran my last race.

But that didn’t mean I quit running. Even then, it was too much a part of my life to give up. It was a form of meditation, a collection of peak moments of exertion and early morning sights that I could never abandon. I’ve run up hills in Glacier National Park while on holiday, and several thousand meters high at Mount Lassen with my lungs on fire. Early in the morning, I’ve run through the streets of Berkeley in glorious summer weather, through the fog along Tacoma’s skid row, and through the outskirts of Indianapolis in snow and stabbing cold. I sometimes feel that, until I’ve run an area, I haven’t really experienced it.

Admittedly, my mileage has dropped and my speed is a joke, especially in the last few years, when I’ve started varying my exercise with swimming and cycling to spare my knees some strain, but I don’t expect to quit altogether so long as I can hobble, however slowly. I sometimes joke that I won’t consider any retirement home that doesn’t have an all-weather running track.

So, yeah, if you’re looking for me . . .
Hey, if you’re looking for me . . .
The boy’s still running.


Recently, I’ve been struggling with the suspicion that my distaste for marketing is hypocrisy.

I’ve been a marketer in a past career incarnation, and a moderately good one, if I do say so myself. At Stormix Technologies, I developed the idea of ad campaigns based on different idioms that used the word “storm,” such as “storm warning” and “eye of the storm.” Later, at Progeny Linux Systems, I developed a campaign based on using a large animal, such as a macaw, to represent the company, and a small one, such as a budgie, to represent the competition. In the last version of the ad, we used a bull elephant and a toy elephant, under the slogan, “Some companies just toy with open source.”

Even in retrospect, I’m proud of these campaigns, although to be honest the Progeny campaign owed much of its effectiveness to the graphics work done by Rebecca Blake at Lara Kisielewska’s Optimum Design and Consulting in New York. Both campaigns handily accomplished the aim of giving new companies name recognition across North America.

However, it’s nearing five years since I’ve worked as a marketer. At first, I had difficulty finding marketing work in the high-tech recession, and later I found direction as a journalist and stopped looking for marketing work.

Still, given my past, I’m surprised to find myself viewing marketing with increasing distaste. Last year, when I noticed an ex-schoolmate using crudely obvious means to market her company, I told myself that her tactics were reasons to distance myself. More recently, I found myself shaking my head over local bloggers writing about products for money or goods. And when I spent an evening listening to a case study of a guerilla marketing campaign, I found myself thinking the whole idea a waste of creativity. I had a similar reaction when I read a friend’s recent blog enthusing over marketers who organized live roleplaying games to promote their products. I am all for play, since I believe it is important to creativity, but I wondered if marketing wouldn’t sully the whole experience.

The thing is, given my past, what right do I have to look down at marketing? Considering the recession when I tried unsuccessfully to find marketing work, is my reaction just sour grapes? At the very least, I am being inconsistent, and that troubles me, because such inconsistency points to unexamined complexity.

Moreover, I notice that very few people share my attitudes. Many people found the case study I heard clever, and were excited by the possibilities of using real-life games to sell products. I’m next to unique in my moral outrage. That’s fairly common for me, since I have a strong Puritan streak (by which I mean that I’m obsessive about ethics, not that I’m prudish), but righteous outrage, like inconsistency, suggests a complex reaction.

So, what is happening? To answer, I have to fish blindly in the murky depths of my unconscious, and see what I happen to land.

Part of my reaction, I think, is a reversion to earlier attitudes. I was an academic before I was a marketer, and such dismissals are common to those who have never worked in business. In the last couple of years, I’ve been at a stage in my life when I’ve been reassessing my past through various means – including through this blog – and very likely I’ve made a reconnection that I didn’t realize until now.

Another reason is that I’ve switched sides. Instead of trying to persuade journalists, I am a journalist now. In the last few years, I’ve seen many inept marketers – not least of all those who borrow spammers’ techniques and keep sending information about Windows products to an address that is clearly about GNU/Linux. Because I get forty or more such emails every day, developing a jaundiced view of the marketing profession is only natural. I’ve seen too much of it at its worst.

However, I think the main reason for my disdain is to justify the path I’ve taken. While I could always return to marketing if I was desperate to earn a living, it has slipped well down my list of possibilities. Most likely, I could find another gig in free software journalism if I had to. But, as a generalist who believes that “all of the above” is usually the correct choice, I’m obscurely bothered by committing to a single career choice. So, to quieten my misgivings, perhaps I’m hunting for ways of repudiating the possibilities I’ve more or less ruled out.

In other words, my reactions are less about the ethics of marketing in the abstract than about my own decisions about my life. And, having come up with this line of reasoning, I imagine that I can already feel my reaction diminishing. Examining hypocrisy can lead to insights – or so I’ve found in this case.


Since I’m a Canadian, Memorial Day doesn’t mean much to me. Our May long weekend is Victoria Day, and is often the weekend before. From the times I’ve been travelling in the United States on the Memorial Day long weekend, it seems to involve a lot of parade drill from everyone from octogenarians to people in wheelchairs – and as a sport, parade drill is low on the list for breakneck action and usually looks faintly ridiculous to my outsider’s eye. But I do remember one unforgettable Memorial Day, when we visited the fantasist Avram Davidson at the Veteran’s Hospital in Bremerton, Washington.

As you may know if you have any taste for literary fantasy, Avram was one of the great fantasists and humorists of the 20th Century. But, as sometimes happens with great writers, he was not very skilled at taking care of himself. Through poverty, he had developed the habit of living in small towns where rents were cheaper, and, according to him, moving on when he had exhausted the local library.

In the last couple of years of his life, this habit had brought him to Bremerton. When his long-neglected health began to fail, he landed in the local veterans’ hospital, thanks to his service in World War 2 as a hospital corpsman in the Marine Corps in the Pacific. There, from the narrow confines of his room, he fought a running battle with the bureaucrats of the hospital and of Veterans Affairs, none of whom were used to dealing with patients who were not only highly intelligent but who had a high degree of curmudgeon and anarchist in their mental makeup.

Perhaps it was a campaign in this ongoing battle that prompted Avram to invite everyone he knew within a day’s travel to the hospital’s Memorial Day celebration, just to annoy his opponents. Or maybe Avram’s famous generosity, so long frustrated by his poverty, seized on the celebration as an overdue way to treat his friends and repay them for their visits. He could, too, have been restless within the limitations of his life, and worried that he might not have long to live.

Knowing Avram, the invitation was probably extended for all these reasons. But, whatever his motivations, the invitation went out, and we drove down from British Columbia that morning with all the excitement with which the inhabitants of Hobbiton must have tramped over to the party field to celebrate Bilbo’s birthday.

The trip was memorable as the only one we ever took south of the border in which the American customs guard did not interrogate us on the strength of our rustbucket Maverick. In those days (and possibly still, for all I know), customs jobs were veteran-preferred postings. The second the guard heard that we were visiting the veterans’ hospital, he smiled and waved us through without another question.

When we got there, we found that the hospital had laid out dozens of tables on the lawn. The celebration was in full swing, but we had no difficulty finding Avram. He was sitting as far away from the bandstand as possible, surrounded by a dozen people, holding court in his wheelchair and telling stories about recent and past events.

At this late date, I don’t remember everything everyone said. But I do remember that, when someone noted that a tavern sat just beyond the hospital grounds, Avram said that many of the patients would go to any length to get to the hard liquor served at the tavern. When the hospital tried to discourage the custom by planting a pole in the gap of the fence, so that wheelchairs couldn’t squeeze through it, wheelchair patients would drag themselves along the fence, inch by painful inch, to get to the tavern. On Friday nights, he said, they looked like insects spread across a windshield as they clung there.

At some point, too, another guest took out a letter he had been asked to forward to Avram six months before. To my surprise, it was from me – I had completely forgotten the incident.

The stories and jokes went on, many told by Avram, but with others contributing their share as well. A band arrived and played the usual American patriotic songs. We continued talking, oblivious to the occasional glares from other visitors. We lined up for food, and the hospital staff glared at our numbers and said nothing. The celebrations ended, and the staff started to clean up, until only our table was left standing, and still we talked. We didn’t care. I don’t know how Avram’s other visitors were feeling, but I felt as though I had stumbled into a London coffee house on an evening when Samuel Johnson was holding forth, and I didn’t want the evening to end. If we hadn’t had a ferry to catch and a two-hour drive on the other side, we might have stayed until midnight.

As things happened, that was the last I saw of Avram. He was dead less than a year later, having left the veterans’ hospital for a basement suite in town just before the bureaucrats could throw him out. I understand that he was not found until a couple of days after he died, and I don’t like to think about his final moments alone.

Instead, I prefer to think of him as I last saw him when I looked back across the grass. He looked tired, but he was obviously in his element, telling stories and laughing at what other people said – a master storyteller even in his leisure.


A few days ago, I was awakened at 6 AM by the slow roll of thunder in the background. At first, I thought it was freight cars being shunted, but by the time I sank back on the pillows from the bolt-upright position that I had no memory of moving into, I realized that the spur line to the nearby industrial complex had been closed for years. It was only one of the small storms we get on the south Vancouver coast in late spring and early fall. After dashing out to make sure that the computer and its peripherals were unplugged, I settled back to enjoy it.

There’s something about the deep pitch of a thunderstorm that impresses me on an instinctive level, much like a note struck on a full-sized church organ. It rouses all the fight-or-flight responses, raising the hairs on my arms and perking up my ears. In the presence of thunder, I find myself walking further forward on the balls of my feet, and looking alertly about me. I suppose that when we hear thunder, we all revert to prey animals, because it is something beyond our control that seems to be circling us, moving in for the kill.

Perhaps that is why, the once or twice I’ve been in a skyscraper during a thunderstorm, everyone has crowded to the windows to stare at the accompanying lightning, careless of the fact that the window is probably the last place they should be. If the storm had eyes, no doubt we would be staring into them, like a mouse transfixed by a snake.

I have two main memories of thunderstorms. The first was on my way back from a trip I took in the first days after I graduated from high school, camping with a friend in the Kootenays. The trip was eventful, involving for me the end of a romance, my first sight of a raven, my first trip as an adult, and a nasty cramp from ingesting too much of the water while swimming at Radium Hot Springs.

We were driving down the Fraser Canyon, the motion of the car doing little to help my queasy stomach, when suddenly we crested a hill from which we could see what seemed like the whole of the Fraser Valley stretched out before us. And, at that moment, sheet lightning flashed across the entire western horizon.

The sight couldn’t have lasted more than ten seconds, and probably only half or a third of that. But to me, it seemed to last for minutes – a bright blaze, as though the sky were on fire, impossibly golden and shining, and blotting everything else from sight. Remembering my graduation the week before, I wanted to take it as an omen, then decided that it didn’t need any symbolism attached to it: however I regarded it, it was one of the most magnificent and outright uncanny sights of my life. Later, I had time to wonder what would have happened if a car had been coming the other way at that moment, but in the immediate aftermath, all I could do was sigh and wish that the sight would return.

The second was in the last weeks of my master’s thesis. I had just bought a new computer, and was learning to work on it, rather than my worn IBM Selectric. I had file cards with the basic formatting options for WordPerfect written on them, and each day I would learn some other chore, such as file management.

I had had the computer a week, and was priding myself on getting to know it fairly well. That day, I planned to learn how to back up my work to floppy disks.

I was sitting at the keyboard, happily typing, when I heard the thunder crescendoing overhead. It seemed unusually close, and I wondered if lightning had struck at Simon Fraser University on nearby Burnaby Mountain. For a moment, I enjoyed the pleasant sensation of mingling my newfound competence on the computer with my somewhat Byronic enjoyment of the storm.

Suddenly, I remembered the vulnerability of computers to power surges and reached down to unplug the computer. As I did, the screen filled with light – possibly just in my imagination.

I spent a fretful hour wondering what had happened to the computer while I waited out the storm. Somehow, I was no longer enjoying it very much.

When I finally dared to turn on the computer again, the worst had happened: it wouldn’t work.

That same afternoon, I took the computer in for repairs. But my thesis defense was in three weeks, and I couldn’t afford to assume that the latest versions of two chapters and the draft of a third that I had typed into the computer would still be there. I spent the next ten days frantically recreating my work and worrying about my diminishing time.

Eventually, I found that the lightning surge had fried a resistor on the motherboard and, having spent itself on that damage, had never reached the hard drive. My chapters were still there, although I no longer needed them. But I had learned a hard lesson, and I’ve been a backup fanatic ever since.

Probably, these experiences explain the contradictory impulses I have in a thunderstorm. On the one hand, I want to rush to the window, or even outside, to experience the full glory of the storm. On the other hand, I want to run to the computer to make sure it’s safe, even though these days I always have current backups and two or three computers around the house. Because I’m stolidly middle-aged, checking the computer usually wins, but, once I’m assured of its safety, I still turn to my romantic enjoyment of the storm.


I have a neighbor who always seems to have time on his hands. On the weekends or summer evenings, he’s usually busy with some sort of project, landscaping the area around his townhouse or washing his car as carefully as a cat licking its only kitten. When his invention fails, he pulls up a deck chair for a few companionable beers and cigarettes. On his days off, he says, he keeps thinking of what he would be doing if he was at work.

To say the least, we are not close. But when our paths cross, I observe him with a wary speculation as I try to understand him.

In contrast to our neighbor, I can’t recall the last time I had nothing to do. I suspect, though, that I must have been under ten. Any boredom probably lasted all of five minutes.

Since then, I have occasionally not known what I wanted to do next. At times I’ve been too tired to sleep. But boredom is a feeling I associate with polite social duties. Mostly, my problem has been trying to squeeze in the time for everything I want to do.

For as long as I can remember, I’ve always had several projects on the go. The current active roster includes the background to a fantasy world, fiction set in that world, and a collection of letters between the American fantasist Fritz Leiber and his oldest friend. If that palls, I have the translation of the Old English poem “The Seafarer” that I’ve worked on intermittently for more years than I care to think. I have minor cleaning and repair jobs to do around the house. In the worst case, I’ve got stacks of books ranging from light fiction to primary historical sources, DVDs to watch, music to hear, and a few Internet research projects I mean to get around to at some point. And, so long as I’ve got running shoes and a road, I can always go for a run.

It’s not that I have Adult Attention Deficit Disorder. I can work for fourteen hours straight on a project without any problem, and I do finish many of the projects I have going – maybe later rather than sooner, but I finish them. It’s just that I have all the curiosity of a newly fledged parrot, so something new is always catching my attention.

The reason my neighbor intrigues me is that I can’t imagine wandering as lost as he so often seems to be. I’ve seen him at 8:30 in the morning in his white bathrobe, drifting around with a cup of coffee in one hand and a cigarette in the other, already looking like he’s wondering how to fill his day. And I’ve always thought: What sort of inner life does he have? What sort can he have?

I can hardly begin to imagine. All I can do is be thankful for whatever accident of circumstances or genetics spared me a similar fate – and hope that, whatever happened to him, it never happens to me.


When you are trying to get something done in a large organization, frustration easily sets in. Before you know it, you can start fantasizing about shouting and name-calling and finding a throat that your fingers fit around – while in reality you slink off, feeling helpless and foolish. However, as I was reminded this past week trying to get action from the local health system on behalf of my hospitalized spouse, the secret is to use more indirect methods.

The first thing to remember is to never show that you are losing your temper. Show anger, and you’re giving the bureaucracy a reason not to listen to you at all. If you have to, retire to the washroom to snarl or cry, or go for some strenuous exercise after your efforts are done. But while you are talking to the members of the organization, keep calm. Smile. Say “Thank you,” even if the person you’re talking to has done nothing but obstruct you.

At the same time, never give up. In the typical bureaucracy, most people want nothing more than to go about their work quietly, and with a minimum of fuss. If you keep showing up, then after a while, they will be more likely to help you so that you go away and stop disturbing the quiet of their days. Calm, polite insistence should be your goal.

In addition, remember that you have to play by the bureaucracy’s unwritten rules – even if you are trying to get its official ones changed or rescinded (or maybe I should say especially when you are trying to get the official ones changed or rescinded). That means you need to have a simple, clear statement of what you want done, usually expressed in terms of a concrete action or two.

Even more importantly, the need to obey the unwritten rules means that your main strategy is to find allies in the system. Who can make your request a reality? Or – often more to the point – who can exert pressure on the decision-makers to act in the way you want? Find out, and get those people on your side, advocating your cause within the organization. They know the structure far better than you have any hope of doing, often on a level of which they aren’t even fully aware. Moreover, the more of your allies surround the decision-maker, the harder the decision-maker will find it to resist your request.

Finally, never forget your objectives. With these methods, you have a strong chance of realizing them. But if you’re expecting the decision-makers or the people who have been obstructing you to apologize, or to show any remorse for their lack of helpfulness or their failure to live up to the alleged ideals of their organization, you’re fantasizing. Settle for getting what you want, and stay polite even as you get it. While the primitive part of you might like to rub in your victory, resist the temptation, just in case the decision-maker balks at the last moment. Your purpose is not emotional satisfaction – it’s realizing your goals.

Getting a bureaucratic organization to act when you’re an outsider is like starting an avalanche. Anyone can set a boulder or two tumbling down the hill, and the result can even be spectacular. But finding the right pebbles to shift so that a large part of the landscape permanently moves (and doesn’t take you with it) is much harder. It requires patience, indirection, and an understanding of the terrain. But, in the end, the results can be farther-reaching than any expression of frustration or anger.

Read Full Post »

I like to think that I’m at home on the computer. Not on Windows – ask me to solve a problem there, and (assuming I don’t refuse to approach it) I’m relying on common sense, Internet searches, and my increasingly irrelevant memories of the days when I used a version of it with any regularity. But on GNU/Linux, I like to think that I know my way around. I know most of the configuration files and the packages relevant to hardware and system configuration, and, if I don’t, I can make educated guesses or know where to find the information I need. But an attitude like that can be as misplaced as hubris – it’s practically begging the forces of irony and chance to humble you.

Last night, it was my turn to be humbled. Suddenly, I was getting a stream of repeated “1”s whenever I tried to type. I could stop it by pressing the appropriate key, but before long it would return.

Since I was dealing with GNU/Linux, on a machine whose security I had hand-tweaked, I was relatively sure I wasn’t dealing with a virus or intrusion, which would have been my first guess on Windows. The system logs showed no suspicious activity – and, anyway, modern cracking tends not to be so randomly malicious.

Investigation quickly showed that the trouble was present regardless of which account I used, whether or not the X Window System was running, and which desktop I chose. My xorg.conf was identical to my backup copies. Altering locales and other keyboard settings, both globally and for particular accounts, changed nothing. Neither did upgrading key packages.
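
For anyone who wants to retrace those steps, the checks amounted to something like the following – a rough sketch from memory; the backup path is only an example of where such a copy might live, and file locations vary by distribution:

    # watch raw key events under X to see exactly which key is firing
    xev | grep -A2 KeyPress
    # compare the live X configuration against a known-good backup
    diff /etc/X11/xorg.conf ~/backups/xorg.conf
    # reload a known keyboard layout for the current session
    setxkbmap us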

The keyboard was fully plugged into its jack as well. On both ends.

By this point, I had concluded that, since I had eliminated everything else, the keyboard must be faulty. True, when I booted from an old Windows install disk, no problem appeared. But this wouldn’t be the first time that I’d found GNU/Linux drivers more sensitive than their Windows equivalents, so the diagnosis seemed plausible. Perhaps GNU/Linux was registering a fault that was still too small for Windows to notice?

Unfortunately, by this point, it was past midnight – a time when few computer stores are open. Troubled, I went to bed and brooded on the problem in my dreams.

The next morning, I was on the phone when, wandering about the house, I happened to sit down in front of the keyboard. As I talked, I noticed that the 1 key on the number pad was partly depressed by an errant parrot seed wedged between the key and the keyboard. A flick of a fingernail, and the seed was gone and my system was working again.

I’d wasted two hours, for no better reason than that, full of confidence in my own knowledge, I’d overlooked the obvious. In my defense, I have to add that I rarely use the number pad. Still, I felt duly chastened that I hadn’t bothered to check something so basic before going into full-tilt troubleshooting mode. I could have saved myself some time and frustration if I had.

Nor does the fact that I was systematic, eliminating the various possibilities in an orderly way, make me feel any better – not one bit.

Read Full Post »

Northwest coast art is one of the healthiest schools of modern art, because it starts from a tradition yet still welcomes innovation. A juxtaposition of local First Nations mythology and the rain forest environment on one hand and advanced industrial techniques on the other, it also seems to reflect the experience of anyone who lives in the area where the artists work. For these reasons, yesterday I fought down the ‘flu that had taken root in my stomach to attend the public opening of the Bill Reid Gallery in downtown Vancouver.

Bill Reid was one of the founders of modern Northwest coast art, and his work from the late 1940s to his death in 1998 is broadly reflective of the school’s history, starting with imitations of the past and gradually gaining originality as his confidence and knowledge of technique increased. With copies of his monumental Spirit of Haida Gwaii at the Canadian embassy in Washington, D.C. and the Vancouver airport – as well as on the Canadian $20 bill – he is perhaps the best-known Canadian artist of the last forty years.

The gallery that carries his name features Reid but, in recognition of his influence, does not confine itself to his work alone. A tribute pole by Jim Hart dominates the main gallery, and the gift shop has a large room where other Northwest artists are highlighted. Right now, that space features April White, but I understand that the plan is to change the exhibit regularly.

The gallery windows are covered in semi-transparent blowups of Reid’s designs, but still let in the natural light. With its high ceiling and dais for speakers, the main gallery suggests a modern version of a Northwest longhouse, the only jarring touches being the carvings around the archway and the computer screens and holograms that stand in for pieces of Reid’s work that are not in the gallery. A mezzanine gives visitors the chance to see close up the top of Hart’s pole, as well as “Mythic Messengers,” a bronze sculpture that is one of Reid’s best-known works.

Although yesterday was the official opening, some finishing touches at the gallery are still lacking. Several display cases are empty, and many are unlabeled. Nor does a guidebook or recorded tour exist yet. For the opening itself, little of that mattered, because one or two people were giving tours, but I worry a little that the context may be lost on casual visitors.

Knowing the context is important, because otherwise the gallery might be mildly disappointing. Several of the pieces are smaller versions of Reid’s monumental works, and the change of scale makes them easy to underestimate. In particular, a palm-sized version of “Raven and the First Men” looks cramped and intricate where the original at the University of British Columbia’s Museum of Anthropology looks spacious and simple.

Still, that is a quibble that seems ungracious when such a gift has been given to the area. With Reid’s preference for deep carving and, in the last stages of his development, his trust in blank spaces – to say nothing of his consummate knowledge of technique and his frequent experimentation – his work is consistently breathtaking. And to see so much of it in one space remains an overwhelming experience, even if his best work is not always represented. I found that I had to wander in and out of the gallery several times just so I could appreciate all the exhibits properly. Otherwise, I would drift about in a sort of daze of admiration.

While I was there, I was also lucky enough to catch Martine Reid, the artist’s widow, talking about the jewelry displays. Although her French-accented English was easy to lose in the crowd, her reminiscences helped put her husband’s development as an artist into perspective while also revealing something of his human side.

I particularly remember her story of how she bought a silver box he had made several decades previously and gave it to him as a birthday gift; he stared at it, she says, like a parent who had not seen his child for decades – then took a napkin and started polishing it.

Martine Reid also recalled that her husband used to carry a coil of wire and a pair of pliers in his pocket, and would twist the wire into shapes as he sat and talked. His “knitting,” he called it. Apparently, the habit was so ingrained that, even in his final illness, he was moving his hands as though twisting wire.

The Bill Reid Gallery is small – at least, for displaying an artist with such a long and varied career – but, if yesterday is any indication, I expect it will become an important center in Vancouver, not just for tourists, but for the First Nations community and art-lovers. Lingering for several hours, I completely forgot my ‘flu, swept away by the conviction that a species that can create such an artist obviously has redeeming qualities, despite what you read in the newspapers.

Read Full Post »
