

“We’re all a little older, the air’s a little colder,
Feels like forty lifetimes since we walked upon the moon.”

-OysterBand, “I Know It’s Mine”

If you aren’t old enough to remember the first moon landing, you probably have trouble understanding how much it meant – or how much it can sometimes still mean to those of us who were.

In 1969, life in the industrialized countries had brought more prosperity to more people than at any time in history. At the same time, there were crippling, disfiguring inequalities and wrongs like the Vietnam War to correct. Some people – the so-called “silent majority” – were in denial about the problems, while the rest of us alternated between an optimism that often spilled over into the naïve and a growing cynical conviction that nothing was going to change. It was a moody time, as exciting as it was scary for those of us who were still children and starting to wonder what the world would be like when we were adults.

For me, these conflicted feelings extended to the space program. I had done a school project a few years before about space exploration, and I knew it was nothing like the great adventure that science fiction had been promising us for the past thirty years. It was, after all, popularly called The Space Race, and I knew it was an extension of the nationalism of the Cold War, a struggle between the United States and the Soviet Union in which each was determined to prove its ideology the best. I knew, too, that Wernher von Braun was an ex-Nazi, and that NASA was too full of the American militarism that was responsible for Vietnam. As for the astronauts, in public they were bland good soldiers that no amount of PR could ever make into heroes.

All the same, I couldn’t help following the gradual testing of the Apollo systems in the eighteen months before the actual landing. No matter how tarnished, my science fiction dreams were starting to come true. When the crew of Apollo 8, in orbit around the moon on Christmas Eve in 1968, began reciting Genesis, I had much the same reaction as I’d had at Disneyland – it was at once corny and deeply moving. The gesture captured my imagination despite my recent conclusion that I was an agnostic.

By the time of the actual moon landing, my excitement – and everyone else’s – was almost unbearable. Everywhere I went, people were carrying transistor radios, not listening to music, but to live coverage of the Apollo 11 mission, or at least to discussion of it. People were making lists of firsts that would be accomplished on the mission as though they were achievements unlocked in a video game: first man to land on the moon, first man to orbit the moon alone, and dozens of others, some of them remarkably silly, including first man to leave the moon. Talk shows went on about the possibility that the LEM (which everybody knew was short for “Lunar Excursion Module”) might sink into layers of dust on landing, or what Buzz Aldrin and Neil Armstrong might do if they needed suit repairs while walking on the moon, or what Michael Collins felt like, being more isolated than any other human had ever been. Nobody could get enough of the coverage.

Then the actual landing came, and none of the shortcomings of NASA, the woodenness of the astronauts, or Armstrong’s pedestrian first words from the moon could destroy the excitement. Not only were humans taking the first step out into space, but everyone knew that anyone with a television set or radio was listening in, in a small way a part of the achievement. Suddenly, for all the social problems of the time, being a human being and a citizen of an industrialized country didn’t seem something to be ashamed of at all. Despite all the efforts of the United States government to convince the world that the moon landing was an American achievement, we knew it was a human achievement that highlighted the best that was in us.

For the next few days, the celebration continued. Newspapers got out the large typefaces to produce souvenir editions with front pages consisting of a single headline and a few pictures. Airlines offered souvenir vouchers, reserving seats on their first flights to the moon (I kept mine for years).

Somehow – I’m not sure how – by the time of the next moon mission, the excitement had died out, and the usual social issues and divisions had returned, if they had ever really gone away. Yet despite the hype and jingoism surrounding the event, the days of the moon landing still linger in my memory as significant.

Like the ending of the World Wars, it was a defining moment that combined the fulfillment of anticipation with genuine achievement and the hope for a better future. Like the assassination of John F. Kennedy, it was one of the first major events to receive massive television coverage – but, unlike the Kennedy assassination, it left people in awe rather than horrified disbelief. It was like nothing that has happened since, not even the Falklands War or the two Gulf Wars, and probably can never happen again, given our modern cynicism and knowledge of the media.

How much of it was hype, I couldn’t say. But somehow, the point is academic. For a moment, the moon landing made those who watched it believe – and that is why so many can’t forget it. Despite its shortcomings, I only wish that we could have a moment like that again.

Read Full Post »

The Nu-chu-nulth (formerly known as the Nootka and West Coast) were among the earliest First Nations to have contact with European explorers. Yet today, very few Nu-chu-nulth artists are well-known. I can think of Patrick Amos, Joe David, and Tim Paul, and have to do a web search to come up with any other names. This lack is unfortunate, because, while the Nu-chu-nulth sometimes work in the northern formline tradition, their art also includes at least one other – possibly two – schools of design that are unparalleled anywhere on the Northwest Coast.

For that reason alone, when Kelly Robinson offered his “West Coast Wild Man” mask for sale a few months ago, I was happy to add it to the works on the walls of my townhouse. But I was also glad to buy it because the mask was not like anything I had ever seen before.

A 2012 graduate of the Freda Diesing School, Robinson has been selling his jewelry to galleries for several years. More recently, at the Northern Exposure show at the Spirit Wrestler Gallery, three of his four pieces sold within the first three hours of the show. He is also a skilled painter, and one of his canvases, “Mother of Mischief,” already hangs on my wall.

However, most of Robinson’s work is in the Nuxalk style. He has only occasionally explored the other side of his heritage and worked in the Nu-chu-nulth style, but if “West Coast Wild Man” is any indication, he could have a significant contribution to make to that tradition as well.

There are few references for Nu-chu-nulth stories, no matter what name you search under. I assume that the Nu-chu-nulth wild man has some similarities to the Bukwus of the neighboring Kwakwaka’wakw or possibly the Gagiid of the Haida. All three are often depicted with large hook noses and grimaces, and probably their symbolic taming was a feature of the midwinter dances in all three cultures.

Probably, though, the parallel is not exact. The Bukwus is a dwarf, often conceived as being dead, who tries to tempt the living into eating its ghost food so that he can carry them away. Often, like the Gagiid, he is said to originate as a shipwrecked voyager. The Nu-chu-nulth wild man seems to share these characteristics, since the culture often raised memorials of skulls to shipwrecked sailors, but almost certainly some of the other context is missing.

To even a casual observer, Robinson’s mask shows obvious signs of the Nu-chu-nulth style, with the inverted skull dangling below the chin, the straggling hair, and the unusually large eye sockets and relatively small eyes. Whether the hair, which resembles dreadlocks, is also traditional or Robinson’s own innovation, I am uncertain, but either way, the general influence is obvious when you compare the mask to the work of carvers like David or Paul.

However, if you continue the comparison – a quick search on the Internet makes it easy – you will notice something else: David’s and Paul’s work has an air of historical re-creation. Both artists reach a high level of quality, but their work is little different from that done a century and a half ago in the same tradition.

There is nothing even mildly wrong with this choice, and I look forward someday to having works by both David and Paul around the house to enjoy. But, having trained with some of the leading woodworkers on the coast today at the Freda Diesing School – artists like Dempsey Bob, Stan Bevan, and Ken McNeil – Robinson is trying to do something more.

Consciously or unconsciously, Robinson is following his teachers, and thinking of his work as fine art. His use of both paint and abalone is restrained, and his wood is finished to modern standards. He also takes full advantage of the grain, shaping it to fit his carving. While obviously based on past Nu-chu-nulth tradition, the result is something that – so far as I am aware – no other Nu-chu-nulth artist has attempted. And what is even more important, Robinson succeeds, producing a work that is both contemporary and not quite like that of any other artist.

This originality – admirable in anyone, but especially so in such a comparatively young artist – is sensed almost immediately by anyone who views the mask. Robinson delivered the mask to me at the opening of the Northern Exposure show, and the first response of each of the half dozen people I showed it to was a sigh of wonder. “West Coast Wild Man” is an original work of unexpected power, and if Robinson can continue to meet the same high standards in other works, his future as a major artist on the coast seems assured.

Read Full Post »

I know it’s rained at the Vancouver Folk Music Festival. I’ve stood in the mud many times, worrying what I would do when my last dry layer was soaked. I know, too, that at least twice rainy weekends have almost drowned the Festival in debt. Yet somehow in my mind, when I think of the Festival, the sun is always shining.

But to be exact, I need to expand the image. In my mind, it is early afternoon. A mild breeze is coming off the water. It makes Jericho Park, the home of the Festival for most of the last thirty-five years, look deeply green, and makes the temperature in the shade several degrees Celsius cooler than in the sun. The young crows are picking at the garbage, so surprised by their luck that they forget to be afraid of humans. Everywhere you look, blankets and elaborate fortifications of chairs, tents, and banners are spread in front of the main stage, reserving spaces for the evening concert.

Overhead are a few wisps of cloud, just enough to keep the worst of the heat away for most of the day. But as the day continues, an increasing number of the crowd are opting for the three stages in the shade, where the risk of sunstroke is lower.

From wherever you are, you can tilt your head in a different direction and catch the strains of whatever is playing at another stage – sometimes, two or three. Very occasionally, you can hear applause or a roar of approval, but not often. The problem isn’t the quality of the musicians, which varies wildly but has included dozens of superb acts over the years. The problem is that, after the first set or two, most of the crowd are too drowsy in the sun to get excited by anything except the very best.

And everywhere, there are people – people sitting cross-legged on the grass, or stretched out staring up at the sky, people dancing to one side of the stage, and people trudging in long lines along the increasingly dusty footpaths to the next stage, or to grab something to eat in the food corner.

Mostly, the crowd is couples and of European descent, but there are always families and cliques of teenagers, and a scattering of other ethnicities as well. Recently, women and men in their late sixties and seventies have also started to become more common – some of them with enough tattoos to forever put to rest the idea that people who get tattoos when young will regret them when they’re old. A few of every age are in scooters and wheelchairs – including ex-Vancouver city council radical Tim Louis – because both the site and the Festival are more accessible than almost any other event that claims to be. And weaving through this crowd are volunteers, driving performers and drum sets and bass fiddles to distant stages, or picking up garbage or selling raffle tickets (the ticket sellers, by tradition, in costume).

Most of the crowd, though, are in T-shirts and shorts, or tank tops and halters and long cotton skirts. A few are in bathing suits (and looking increasingly red and pained as the day continues), and some women wear elaborate and expensive fantasies of what they imagine the counter-culture must have been – fantasies that seem to owe more than a bit to ElfQuest. Some wear costumes. A few women go bare-breasted, believing themselves to be in a safe place, and a few men who want to show off the results of their weightlifting do so as well. However, far more have hats, either carried with them or improvised from programs or whatever else is at hand. Many have bare feet, despite the warnings in the program that shoes are advisable.

Or so the gestalt image appears in my mind. In reality, I know that the Festival is not always The Peaceable Kingdom that its organizers sometimes like to pretend. In the early days, drugs were often obvious (and spot-the-narc was one of the informal games that everyone played). More recently, the addition of a beer garden has created an increased need for security (or so I’m told). There are complaints, too, about the high prices charged for tickets and food, the selection of acts, and just about any other aspect of the Festival that you might name.

But I’m talking about my imagination, and not trying to give a balanced assessment. In my mind’s eye, at the Festival my brain is always slightly sodden with the heat, and the rest of me mildly dehydrated and seeking more fluids. The next day, or maybe the day after, I will be back at work, but that time seems centuries away. For the coming hours, I am relaxed and doing nothing but listening in a way that I rarely manage at any other time, even when on holiday.

Last year, I didn’t feel that way. As I said, it was raining. More importantly, the trip was a pilgrimage in which I remembered being there with my deceased partner. “It’s the nearest thing we have to religion,” she used to say, and it’s true: although we missed the first two, and later one for a vacation and one for a wedding, the years in which we missed the Festival altogether were rare.

But this year, I went alone, not expecting to do more than strike up a few casual conversations, and the magic was back. This year, it was the Festival of my dreams once again, and I know that next year I will be back and it will be a blazing hot summer day.

After all, isn’t it always?

Read Full Post »

I first heard about the concept of dormitive explanation in university. Ever since, it’s been one of my basic tools for spotting errors in logic and for thinking more clearly.

The concept has been articulated several times in the last few centuries, but, so far as I can tell, the person who named it was systems theorist Gregory Bateson. The name comes from Molière’s play The Imaginary Invalid (aka The Hypochondriac), in which a doctor claims that opium puts people to sleep “because there is a dormitive principle in it.”

In other words, opium puts people to sleep because it puts people to sleep. When the statement is reworded, the circular cause and effect is obvious, but an essential part of a dormitive explanation is that the circularity is hidden by changing the word being used. Often, as in Molière’s play, the change involves using a word with a Latin root rather than a Germanic one, or, at the very least, a more impressive, multi-syllabic word. In any case, the effect is to leave the impression that something has been explained when it has only been renamed.
Or, to explain the concept another way, dormitive explanation is a fallacy, an indicator of an illogical argument that the classical Greek and Roman studies of rhetoric somehow missed.

Dormitive explanations rarely exist in the hard sciences, although at first they might appear to. Explanatory principles like “gravity” or “mass” might act like dormitive explanations for the semi-trained – for example, things fall because of gravity, which is the tendency for things to fall – but the fact that they can be used to calculate other behaviors indicates that they are more than circular causality hidden by a change in terminology.
However, on the fringes of science, dormitive explanation becomes more common. For instance, consciousness is often described as the capacity for self-awareness. Often, words like “syndrome” or “complex” or “effect” are used, so that feelings of inadequacy become “impostor syndrome” with no attempt to classify symptoms systematically.

Similarly, in many New Age philosophies, explanations are given in terms of “energy,” which – since the term obviously does not refer to any sort of energy recognized by physics – amounts to just another name for a dormitive principle.

In some ways, dormitive explanation can become an appeal to authority, either to the authority of the explainer, or to the force or principle evoked as an explanation. For example, if you say that men have evolved to be better at mathematics than women, not only are you suggesting an evolutionary tendency whose existence is unproved, but you are mentioning evolution in the hopes of presenting an argument that others cannot challenge.

In fact, dormitive explanation is all about authority. It makes the person who gives it sound authoritative, and, if accepted, gives listeners a sense that something that concerns them has been explained. In practice, no explanation has been given at all, but unless the listeners can analyze while someone else speaks, they are unlikely to recognize what is happening until later.

Many, of course, never recognize it at all, and have no desire to do so. After all, which would you rather do: suffer from joint pain, or have arthritis? The problem is not that arthritis doesn’t exist, but that it is a generic term that covers dozens of different conditions. That means that being told you have arthritis actually does you little good. Yet having a scientific-sounding name for their condition is reassuring for many people, even if the name does little to suggest treatment or prognosis.

What makes the concept of dormitive explanation so important to me is the fact that it is generally unrecognized and used to assert authority and give false reassurance. By contrast, by being aware of the concept, you can learn to notice it when you encounter it, and reject the lack of logic behind it. The result can be not only clearer thinking, but a clearer sense of what to do next.

Read Full Post »

When I was in the first grade, I took speech therapy lessons. I sometimes think that is the single most influential fact in my life.

I had two main problems. First, I pronounced a hard “c” as “t.” For example, I would say “dut” for “duck” and “tind” for “kind.” Second, I spoke so quickly that I often slurred my words.

These two deficiencies were more than just inconveniences. They were major factors in how people regarded me. I’m told that, when I was in kindergarten, some of the parents dismissed me as mentally retarded (to use the term of the times). More than once, I was aware that an adult to whom I was talking didn’t understand me. These deficiencies were also the reason that, in Grade One, I was assigned to the slowest reading group, along with at least one child with Down’s Syndrome, although what relation my speech patterns had to my reading ability eludes me to this day. But they were major reasons to ostracize me, and I knew they meant something was wrong with me, although I didn’t know exactly what.

To correct my problems, my parents started sending me to speech therapy once or twice a week. For a long time, I struggled through the pronunciation drills, feeling increasingly inadequate, since it was obvious that I wasn’t giving the right response. Being the poster boy for that year’s March of Dimes campaign, pretending to be deaf, helped a little, but even that, I realized, was not an unalloyed honor.

Then, one day, I managed to make the hard “c” sound. Over the next few months, I learned to make it consistently where I needed to. I also learned to pronounce more clearly in general. Slowing my speech down, however, was a lost cause – decades later, I still speak too fast unless I make a deliberate effort.

To say that this experience changed my life is an understatement. Now that people understood me, I was soon in the most advanced reading group. Before long, I was leaving most of my classmates behind.

But the experience also affected me in other ways. Having listened on headphones to my own endless efforts to pronounce words correctly, to this day, hearing my own voice makes me remember feeling like a failure, to the point that I wince at the sound. I dislike being judged and tested as well, knowing how fallible those doing the judging can be. And I still speak with a deliberateness that makes me sound far more serious than I am, and that many people – at least in North America – mistake for an English accent.

More significantly, I was left with an unshakeable sympathy for those who are easily dismissed the same way that I was. My social and political feelings, as well as my feminism, come directly from the sense of injustice I automatically feel when I see someone who has been judged less than mainstream. Nor can I ever feel comfortable being part of an elite, knowing how superficial the membership requirements can be.

Even more importantly, I became obsessed with language. My reading ability, which had always been advanced for my age, improved so dramatically that, by the time I was awarded a copy of Rudyard Kipling’s Just So Stories for the best student in my first grade class, it seemed too juvenile for me (I only learned to appreciate it as an adult). That summer, I read the entire Hardy Boys series. By the time I started Grade Two, I was reading translations of The Three Musketeers, and my mother was worried what I might come across in my readings.

Meanwhile, on the side, I was starting my own first efforts at writing. Borrowing a few ideas from my father, I wrote a long story about discovering a prehistoric world inside the local mountains.

Another early effort involved a pack of wild dogs that were being rounded up by evil men in a van. I was especially proud of the fact that, while I understood that the dogs couldn’t read the entire license plate on the van, I had the canine protagonist remember the last few digits, which seemed much more probable to me.

Such efforts led me to teaching English, and eventually to publishing manuals, articles, stories, and poems. In a very direct sense, speech therapy gave me my vocation.

But, just as importantly, speech therapy gave me my personal myth. It gave me a narrative of starting from behind, and then succeeding through persistence. At an early age, it taught me to endure and to keep trying, and to ignore the opinions of the skeptical as I worked towards my chosen goals. Perhaps, it even gave me an early orientation to goals.

I sometimes wonder what might have happened if I hadn’t needed speech therapy. Would I be a political leftist? Place the same value on endurance? Be a writer? I might have still been and done all these things, but not, I suspect, as strongly. If I had had things easier, then probably I would be a much milder, more innocent person than I am today, even if my general tendencies remained the same.

Read Full Post »

To my bemusement, I realized recently that over a third of the articles I do in a month are opinion pieces. Back in 2004, when I first started full-time journalism, I wouldn’t have believed that was possible. I believed then that I had no talent for editorials, and the thought of doing one intimidated me so much that I barely knew how to begin.

My background as an academic and a technical writer had a lot to do with that belief. Ask me to summarize or quote accurately in a news story or interview, and I could draw on my experience writing academic papers. Ask me to write an accurate how-to, and I could depend on my experience writing manuals and tutorials. Even a review didn’t seem impossible, because, while it gave an opinion and was shaped by an opinion, the opinion was based on clear facts.

But a commentary on free software-related events? That left me much more exposed. I had only been involved in the community for a few years, and I was all too aware that dozens of people – maybe hundreds or thousands – had more experience than me. So why would anyone be interested in my opinion? I’d be shredded as soon as I opened my mouth.

Besides, years in a university English department had conditioned me to avoid giving a firm opinion whenever possible. I had got used to softening my opinions with words like “almost” or “seems” to lessen the possibility of an attack.

Fortunately, writing at Linux.com and hanging out on its IRC channel every day, I had some strong role models. The late Joe Barr was the master of the attack piece – of angry diatribes full of sarcasm and humor, the kind that is read less for insight than for entertainment, like a review of a play by Dorothy Parker (“And then, believe it or not, things get worse. So I shot myself.”). By contrast, Robin “roblimo” Miller, the senior editor, could write editorials just as forceful, but milder in tone and more thoughtful.

These models were important to me, because, when I came to write my first opinion pieces, I had some idea of what I could manage. While I admired Joe Barr’s expression of anger, I knew there was no way that I could match it for more than a sentence or two. I would have to assume a persona that was mostly foreign to me, and would feel foreign – maybe dishonest – to me.

By contrast, my academic background made the thoughtful editorial seem a more attainable goal. While writing academic papers, I had discovered I had a knack for getting to the core of a matter and stripping away irrelevancies. I knew how to anticipate opposing views, and disarm them by answering them before anyone else could make them. I knew that, even if I didn’t always respect opposing views, reporting them fairly made me appear to do so, and that the effort improved my own argument. I might still shoot off the occasional one-liner caked in sarcasm, but, most of the time, I had a better chance of managing a thoughtful tone rather than an outraged and witty one.

What I didn’t anticipate was how my style would add to my voice. My model for style was George Orwell, with clarity and simplicity my main goals. In particular, I got into the habit of ruthlessly deleting all the qualifiers that academia had taught me to use to soften my opinions. Add a tone that is partly a reflection of my own speech-therapy influenced conversation and partly the influence of Orwell’s very English tone, and the result is that I come across as more forceful than I initially realized.

This combination of habits and tone meant that, as I ventured into writing opinion pieces, I had a more distinctive result than I realized at first. Not everyone liked it, of course: to this day, I still have critics who claim that my ability to look at all sides of a discussion means that I will write anything, even for shock value (not true, although I do sometimes write to explore the possibility of an idea). Others find my tone patronizing (usually when they disagree with me). At times, too, I have been called disloyal to free software, or worse.

I can see where these views originate, so I don’t feel much need to argue against them, except to say that they have as much to do with readers’ expectations as anything I actually do.

At any rate, over the years, I have grown much more accustomed to hostile responses than I was when I started writing opinion pieces. If people disagree with me (or with what they think I am saying), they are at least reading me, which means that editors will pay for my opinions.

As for myself, I’m content to express an opinion that I either hold or am considering. So long as I can do one of these two things as thoroughly as possible, writing an opinion piece long ago lost its terror for me. I sometimes need half a draft to know just what my opinion on a subject happens to be, but opinion pieces have long since settled into being a familiar part of my repertoire.

At times, I can even imagine that I have a talent for them. When Carla Schroder tweeted, “Bruce Byfield writes calm, thoughtful, lengthy articles that somehow ignite mad passions and flame wars,” I couldn’t have been more satisfied. That is exactly what an opinion piece should be and do, and someone, at least, was saying that I was succeeding in doing exactly what I was trying to do.

Read Full Post »

O Canada!
Our home and native land!
True patriot love in all thy sons command.
With glowing hearts we see thee rise,
The True North strong and free!
From far and wide,
O Canada, we stand on guard for thee.
God keep our land glorious and free!
O Canada, we stand on guard for thee.
O Canada, we stand on guard for thee.

-Robert Stanley Weir and others

Some of my favorite pieces of literary criticism are Robert Graves’ line-by-line readings of famous poems. Often, Graves proves to his satisfaction, as well as mine, that the poem under scrutiny is not a masterpiece, but poorly thought out and incompetently rendered. The same can be said of Canada’s national anthem, “O Canada!” – which is hardly a surprise, because few if any national anthems are meant to do anything more than rouse a moment or two of cheap sentiment in those who happen to live in the country.

You know right from the start that Canada’s anthem is in trouble, because it starts with a vocative sentence. This is trouble because the vocative is so rarely used today that few people except Latin scholars understand that the first sentence is addressed directly towards Canada. So far as most people understand the sentence, they usually think it starts with a sigh, as though the speaker’s emotions about Canada are so strong that they can’t resist a wordless exclamation — an interpretation that hardly seems justified by what follows.

Not that there is much meaning to destroy. The song is addressing the country in the abstract – a mawkish approach, but one that, in a spirit of generosity, I have to admit is too common a poetic convention to reject. But what do the singers say to this great abstraction? They tell Canada to command loyalty from all those who are born there – and I think I have to be forgiven for wondering just how the singers’ pious wish will affect the matter in any way whatsoever. You might as well tell the waves that it’s fine with you that they continue hitting the beaches.

Then there’s the exclamation point at the end of the line – the first of four in ten lines. This is another unpromising sign, since the over-use of exclamation points is always a sure sign that the speaker is trying to whip up some excitement while saying something unoriginal or dull.

And sure enough, the next line is a redundancy with another exclamation mark added in the hopes of adding some dignity to the sentiment. The only reason, of course, for the redundancy of “home and native” is that the writer of the words didn’t know what else to add that fitted the music.

But it gets worse as the song continues. What, I wonder, is “true patriot love”? How is it different from false patriot love (perhaps that of those who come “from far and wide” below)? More filler, followed by the unnecessary sexism of “in all thy sons command.” At least twice in my lifetime, feminists have tried to change the line to something like “in every child command,” only to be met by outrage, as though the English words had not already been changed several times, and as though several different unofficial versions did not exist.

Struggling on, I suppose we have to bear “with glowing hearts.” After all, we are in the realm of patriotic doggerel, where the participles fly thick and fast, streaming and gleaming and beaming. For some reason, “ing” at the end of enough words lulls us into a sort of drowsy acceptance of whatever else follows. And I have to say that, after “glowing,” I am not surprised to see the line end with “thee,” an archaism completely out of keeping with the rest of the poem and useful only in efforts to elevate a trite idea. Basically, the line is saying, “We’re proud to see you develop as a nation,” only much less clearly.

As for “True North,” I take that to mean “faithful,” referring to Canada’s position as a former colony that is still on good terms with the mother country (it almost assuredly doesn’t mean that Canada is the location of True North for navigators). But “North,” alone, leaves Canada defined entirely by geography – an all too common occurrence that makes the place sound about as exciting as a mound of three-month-old snow on the curb.

And don’t get me started on “strong and free.” The last time that Canada could defend its own borders was in World War Two. Very likely, that was the only time. The history of the country can be neatly summarized as, “Era of French Domination, Era of English Domination, Era of American Domination.” To say the least, it’s incongruous for a satellite country to be describing itself as either “strong” or “free.”

Next up is one of the more recent bits of editing, “from far and wide.” Most likely, it was added to acknowledge the number of immigrants in the last few decades. But how do you reconcile this line with “home and native land”? If you’re born in the place, you don’t come from “far and wide,” and if you do come “from far and wide,” then Canada isn’t your “native” land.

Even more importantly, how do you “stand on guard” “from far and wide”? It sounds as physically impossible as some of the awkward poses of female superheroes on the covers of comic books. Anyway, as I said, Canada has rarely been able to defend itself, never mind against whom (perhaps the Americans buying up our corporations?).

Even to the composer, the jumble of thought is too much. Another vocative and another “thee” are thrown in, with God and another mention of freedom added to the mix as well, all in the impossible hope that an elevated mess can be mistaken for something meaningful.

Unfortunately, this mishmash and all the efforts to play on listeners’ emotions don’t lead anywhere, so the ending is problematic. All that can be done is to repeat what has already been said. That’s not a bad trick if you have something rousing to say, but here it falls flat. That’s probably why, any time you hear “O Canada” performed, there is always an uneasy silence and an almost audible shuffling of feet at the end: there is nothing to indicate that the mercifully brief ordeal is over.

Someone – I forget who – once said that more Canadians of my age knew the words to the opening of The Bugs Bunny / Roadrunner Hour than knew the words to “O Canada.” That was mainly a reference to the number of changes that have been made to the anthem in our lifetimes. It may have referred, too, to the fact that, to many Canadians, overt displays of patriotism are embarrassing.

But I think that it also has something to do with the fact that the national anthem is rarely comprehensible for more than two or three words at a time. It is difficult to remember words you don’t understand – just try memorizing a dozen lines in a language you don’t understand if you don’t believe me.

You don’t expect original or deep thought in an anthem. But is basic literacy too much to ask? At least “The Maple Leaf Forever,” for all that it ignores the Quebecois and First Nations, makes literal sense. But Canada’s anthem, I’m ashamed to say, is almost entirely nonsensical and border-line illiterate. It only really serves its purpose when the music is played without the words. With the words, it’s either confusing or embarrassing.

Read Full Post »

I am not a Christian. Nor am I a follower of any other religion, or even a theist. For years, I have wavered somewhere between agnosticism and atheism. But I thought I had made my peace with being a non-believer in a culture whose origins were Christian, making myself tolerably familiar with the Bible and the history and philosophy of belief throughout European history.

Then, some years ago, I was blind-sided by a statement of the obvious.

Although I hadn’t been Christian since puberty, I had always thought that the most recent parts of the Bible had a historical background. Probably things hadn’t happened quite as described in the New Testament, but I assumed that the descriptions were roughly true. After all, the New Testament accounts mentioned historical figures like Pontius Pilate and Herod Antipas.

So didn’t it follow that a historical Jesus had existed? Of course, he probably bore about the same relation to the stories as the historical King Arthur bore to the writings of Thomas Malory, but after you discarded the religious doctrine and traditions like the sacrificed god, there would be a core of truth.

Then, I read a book called The Jesus Puzzle by Earl Doherty. The book is poorly written, and has the obsessiveness that marks a crank, but it introduced me to the idea that the whole of Christianity was a neo-Platonic myth, most likely originating among the Jewish population of Alexandria that had started being taken literally.

I learned, too, that there were reasons to question external references to Christianity like those in Josephus, and that reputable references to Christianity did not occur until well into the second century of the common era. Even some of the references to the modern story of Christ in the later books of the New Testament were textually suspect.

These ideas are not universally accepted. But the fact that they can be reasonably held at all shows how shaky the conventional views actually are. More importantly, they give reasons for some aspects of Christianity that I had never heard adequately explained, such as the neo-Platonism in the gospel of John, and some of the references to the Christ figure that seem strangely vague if they are supposed to be about a man who had lived. Although not proved, the ideas were at least plausible.

To my surprise, I found myself reacting as though I’d been tackled by someone I hadn’t seen. I suspect that belief in a historic Jesus is the last refuge of an agnostic or atheist who used to be a Christian, a minimal adjustment of their thought that allows them some continuity with their past and cultural history. Even in our disbelief, we cling to a sense that the stories of Christianity must have some degree of reliability. But, suddenly, even that minimum belief seemed questionable.

I realized, too, that I was angry. I’d been lied to – told that false certainties were established fact – which always makes me self-righteously indignant. Realizing, on reflection, that the liars had probably lied to themselves first did not make me any less angry.

If my reaction could be summarized in three words, those three words would have been: How dare they?

But the closer you look, the more dubious the founding legends of Judaeo-Christianity become. Despite the record keeping of the Egyptians, no evidence of anything remotely resembling the Exodus has ever been found. What evidence exists points to the Ancient Hebrews being offshoots of the Canaanites – locals rather than invaders. Similarly, no reference to the empire of Saul, David, and Solomon exists in any of the surrounding cultures. The few references to the kingdoms of ancient Israel that have been found suggest that, at best, for most of its history it was a satellite kingdom of the surrounding superpowers, a fact that should have been obvious from one look at a map.

Yet I remember seeing maps of Solomon’s empire when I was growing up, and other maps showing how the twelve tribes settled Palestine (in fact, look up Judea, and you can still find this map on Wikipedia). The maps, which are supposed to value accuracy, are works of fantasy, charting as certainties claims that are questionable and unsupported by the archaeological record. In fact, the more archeology that is done, the more the Biblical accounts look like fiction embellished with a few sprinkles of fact for verisimilitude.

Was anyone surprised when the James Ossuary, allegedly the container for the bones of Jesus’ brother, proved suspect? I wasn’t. It was exactly the same as every other effort to reconcile fact with the Bible: unproved, the product of wishful thinking at best, and of outright fraud at worst.

And when I consider that European culture is built on such foundations – well, don’t come trying to convert me is all that I can say. Because if you try, you’ll have a lot of explaining to do.

Read Full Post »

Paintings have never been a large part of modern Northwest Coast Art. Since the 1960s, artists have preferred to release limited edition prints instead. Recently, though, this trend has shown signs of changing.

Ever since the 1960s, limited prints have been far more common than paintings. The reason is simple economics: A limited print costs the buyer anywhere from half to one-tenth the price of a painting, which pleases buyers not interested in an investment. If a run of a hundred can be sold, the artist makes much more than they would from a painting – enough, with luck, to allow them to earn a living from their art.

As a result, limited prints have long been the norm in Northwest Coast Art, despite the forgeries that have been periodically discovered. By contrast, artists interested in painting have often found selling their work to galleries difficult. A few exceptions exist, such as Robert Davidson in the last decade, but they are exceptions because of their fame.

A better indication of the status of paintings in Northwest Coast art is the fact that even an artist as accomplished as Lyle Wilson could only manage a show consisting entirely of paintings this year – and at least two-thirds of the pieces were completed decades ago and had never sold. Meanwhile, an artist’s first limited print is still seen as an important step in their career.

However, the days when prints could be counted on to fund an artist’s career are rapidly coming to an end. Hundreds are entering a market that once sustained dozens, thanks in part to the relative cheapness of producing a print from a computer compared to traditional silk screening.

Perhaps as a result, the average price of a print has declined or remained static, with many prints available for well under a hundred dollars unless the artist is well-known. Moreover, where, thirty-five years ago, so-called limited prints could have a release of five or six hundred copies, now releases of a hundred, or fifty, or even twenty have become common, partly to reduce forgery and partly to ensure that artists are not left with a large inventory of unsellable prints.

At the same time, Northwest Coast artists are more closely connected to other schools of art than they have been at any time in the last sixty years. Artists like Dean and Shawn Hunt have succeeded to some extent in selling canvases outside the usual Northwest Coast markets, and new artists – an increasing number of whom have attended art school – are becoming more interested in painting as well. In fact, I know several young artists who began working on canvas and only learned carving and metalwork later.

Whether on wood, paper, or canvas, painting has suddenly become semi-respectable. The Douglas Reynolds Gallery has been showing an increasing number of high-end paintings over the last couple years. Similarly, Lyle Wilson may have had to go to the suburb of Maple Ridge rather than downtown Vancouver to mount his recent Paint show, but the point is he managed to have the exhibit. And, as I write, I have just returned from the Lattimer Gallery’s opening reception for “medium: Painting on Canvas,” an exhibit of over fifteen canvases by both new and leading artists.

Slowly, painting is becoming acceptable in Northwest Coast art. It still has a ways to go – according to Peter Lattimer, for many of the artists in his exhibit, working on canvas was a new and not wholly comfortable experience. But the change is coming, all the same.

Most likely, painting will not replace limited prints. A handful of top artists are still doing well with limited prints, and will probably continue to do so for years. However, a day might come within the next decade when most limited prints are viewed as tourist wares and no longer as fine art.

Read Full Post »

“And the pageantry, the panoply, the sanctified decay —
But I knew the hour was coming that would sweep it all away.
Now time has me in a corner, and I’m moth-eared from the fray,
But Her Majesty is reigning still today.”

-Leon Rosselson, “On Her Silver Jubilee”

Science fiction fans joke that they are disappointed that the future has yet to produce flying cars. But my disappointment lies elsewhere. It lies in the fact that the society I expected to see when I was middle-aged is almost as distant as when I was coming of age – a fact that the recent Diamond Jubilee of Elizabeth II reinforces all too sharply.

The mood of the late 1970s was so different from today’s that I can barely remember it. Probably, anyone born after that time can’t conceive of it at all. But for a naïve, idealistic teenager like me, it seemed a time of infinite possibilities and constant progress.

Consider: Prosperity in North America was at an all-time high. So were income and social mobility. In recent memory, activism had helped to end the Vietnam War and to force Richard Nixon’s resignation. Based on the previous decade, it seemed self-evident that ethnic minorities were about to win their rights, and so were women and gays and lesbians. Probably, Canada would be a republic, without a monarch to remind us of a now non-functional past. Sure, problems like pollution and poverty remained, but once we focused on them a bit more, they would be solved.

In other words, we were still in the era in which Arthur Schlesinger, Jr. could write a book called The End of History and not be ridiculed. The problems that had haunted humanity for centuries were about to be solved, and all that would remain was the question of what to do afterwards – colonize the stars, most likely, or maybe begin a cultural Renaissance.

Ever since, each year has added to the progressive disillusion. Instead of the end of history, we got the Counter-Reformation of the reactionaries, who proved to be better organized and more tenacious than the rest of us could imagine. Year by year, living standards declined. We got Ronald Reagan in the United States, and a denial of the lessons that Vietnam should have taught. In Canada, we got Brian Mulroney and Jean Chretien, who between them swept away any idea that politics could be about anything except pragmatism and the cold-blooded survival of career politicians. The promising beginnings for cause after cause turned out to be end points that were fiercely and – all too often – unsuccessfully defended.

Oh, not everything was reason for gloom. In Canada, abortion rights remained in legal limbo, permitting access in theory despite constant efforts to chip away at them in practice. Same sex marriage became legal. The Internet and cameras in mobile devices made organizing and calling authorities to account easy. But against the losses and the constant wallowing in the same old arguments, these gains mattered little. If the losses didn’t outweigh the gains, we believed that they did. We stopped believing that social progress was possible, although many of us kept on fighting or wistfully believing.

Against this background, the Diamond Jubilee is only a tawdry reminder of what hasn’t happened in recent decades. The occasion is not only a celebration of mediocrity, but also a celebration of how things have failed to change. For me, the fact that Leon Rosselson’s song remains as applicable today as when it was written in 1977 only adds to the irony. Seeing the media’s continuing attention to the non-story of someone whom Rosselson describes as “so commonplace a woman in her fuddy-duddy hat” makes me want to mourn, not celebrate. Do we really have nothing more to show for the last thirty-five years?

To all appearances, we don’t.

Science fiction readers have been known to cry, “Give us our flying cars!” But as I tried to avoid the coverage of the Diamond Jubilee the other week, what I wanted to do was to plead for a reason to believe in social progress – and then to go and find a quiet corner in which to mourn the unlikelihood of any answer.

Read Full Post »
