Archive for the ‘Personal’ Category

Yesterday evening, as I stood shivering at the corner of Robson and Burrard in Vancouver in the middle of a flash mob, the insight struck me: The people who refer to the Idle No More events as protests have the wrong idea. The events are not just protests – they’re at least as much celebrations.

Not that politics don’t enter into what the Canadian First Nations are doing. Most of the people at last night’s event could cite at least Bills C-45 and C-27 among the half dozen bills that the movement is protesting. A political pamphlet, obviously hastily made, was being handed out, and the organizers speaking to the media could talk knowledgeably about the issues.

However, politics were no more than half the story. Political signs were scattered throughout the crowd (My favorite: “We want to speak to the Crown, not the court jester,” a reference to requests for the Governor General to intervene, and a dismissal of Prime Minister Stephen Harper), but there were also Canadian and British Columbian flags, as well as variations on the flag that the Iroquois Warrior Society flew during the Oka Crisis. One man carried a flag with a Northwest Coast copper in the center. Others had tied flags around their shoulders that proclaimed, “Idle No More” in large letters.

Even the organizers didn’t spend much time on the issues. Two or three made some obviously unprepared remarks for the cameras before moving on to the drumming and dancing as soon as possible. In fact, of the entire ninety minutes of protest, no more than fifteen were concerned with talking politics.

That’s not surprising. The flash mob was the third Idle No More event that day, and many in the crowd had gone to all three events. They must have had every opportunity they could wish to hear about the politics, and almost everyone in the crowd must have made up their minds long ago.

Anyway, you could tell it wasn’t a political crowd by its composition. A crowd bent on political action is usually young, and predominantly male. It doesn’t consist of grannies and elders on scooters, or mothers carrying toddlers and families with strollers.

Unless I am very much mistaken, the people I saw had come to celebrate being First Nations, to feel good about being survivors and the descendants of survivors of disease, neglect, and abuse. Some were wearing traditional button blankets. Others were wearing T-shirts that talked about Haida Gwaii, or simply declared a cultural identity like Haisla.

But, more than to support any cause, they had come to show their pride in being aboriginals in a modern world, and most of them couldn’t get enough of the idea that their identity was something to be proud of. For some, especially the senior citizens in the crowd, that might have been a new idea they were still exploring.

But you could tell what they were there for: the drumming and the dancing. They couldn’t get enough of either. At first, lone singers with drums played at scattered points through the crowd, the drumbeats echoing stirringly among the tall buildings above them. Then many of the drummers formed up in two facing lines, each line trying to outdo the other in volume and enthusiasm until it seemed only a matter of time before a few drums broke from the pounding they were taking. Around me, people swayed and shuffled to the music, clapping hands and whooping as each song finished.

Later, as the crowd moved to block the intersection, many didn’t walk so much as dance. As the drumming and singing continued, several chains of circle dancers formed, continuing for at least twenty minutes.

I remember sitting on a fire hydrant through part of the intersection blockade, watching the police divert traffic to make sure they stayed friendly, my eyes continually drifting back to the dancing. It seemed a little tentative, as though some of the people couldn’t quite believe what they were doing, but they were enjoying it anyway.

I remembered the early twentieth-century anarchist Emma Goldman saying that if there was no dancing at the revolution, she wouldn’t be attending, and found myself thinking that, with that attitude, she would have loved what I was seeing.

Even when the crowd moved back on to the sidewalk and started breaking up into twos and threes and drifting away, there were some who couldn’t stop dancing. I had seen one teenage girl with “Idle No More” painted on her face who was already dancing half an hour before the start of the event; I saw her at the end, and she was still dancing, seemingly tireless.

Then the last echo of the last drum beat faded. The dancers continued for a few seconds before stopping to clap and cheer, and the noise of the traffic suddenly seemed unusually loud.

To all appearances, nothing happened or didn’t happen because the intersection of Robson and Burrard was blocked early in a winter evening. But it would be a mistake to imagine that the event served no purpose. The event ended with the participants feeling good about themselves and their cultures – and I suspect that it would be an even greater mistake to dismiss that result as having no consequences.

Read Full Post »

Whenever someone claims they can tell if a piece of writing was written by a man or a woman, I have to suppress a knowing smile. They have only a fifty percent chance of being right, and a near certainty of embarrassing themselves with rationalizations if they are wrong. Writing, apparently, is a skill that has very little to do with gender.

I first became aware of this basic fact through the reactions to James Tiptree, Jr. As a young teen, I remember critics praising Tiptree for a supposedly masculine prose style. When rumors emerged that Tiptree might be a woman, many explained at great length why that could not possibly be so. Then it was revealed that Tiptree was actually a woman named Alice Sheldon – and in a perfect demonstration of double-think, many of the same critics began explaining how they knew that all along, and pointing out aspects of her prose as evidence for what was suddenly an obvious fact.

Something of the reverse happened a few years later with F. M. Busby, a writer of intelligent space opera. Because Rissa Kerguelen, one of Busby’s greatest successes, featured a female protagonist, dozens of people assumed that Busby was a woman. A man, they argued, couldn’t possibly write such a sympathetic female character. But Busby was a man – although one fond of saying that “An intelligent man who isn’t a feminist isn’t.” The reasons that he went by his initials were that he disliked his given names of Francis Marion, and that his publisher considered his nickname “Buz” too informal for a book cover.

Having these two counter-examples, I have always been skeptical about efforts to identify gender through writing samples. Like too much alleged social science, such efforts always assume that certain subject matter and stylistic choices are somehow innately masculine or feminine (gay, lesbian, transgendered, or queer are always left out). A male writer, for instance, might be supposed to use “I” and to write short, unqualified statements. By contrast, a woman might be said to be more tentative in offering an opinion, and write about emotions or domestic subjects. Needless to say, such divisions say more about the devisers of such studies than about any actual differences.

In fact, I’ve always found such studies rather dismissive of writers’ abilities. Most writers I know would have no trouble imitating the so-called masculine or feminine prose styles of such studies. Once they knew the required mannerisms, all that would be needed is a few hundred words of practice.

Moreover, whenever I have tried any online versions of such studies, the results have been random. For example, this morning I ran samples of my writing through Gender Genie, an online adaptation of one such study. My journalistic articles registered consistently as male, and my personal blog entries as female. My fiction registered as both male and female, although neither very strongly. Similarly, two women writers of my acquaintance registered as male, and a male friend as female. I would have tried more samples, but at this point, it was obvious that the results had such a large margin of error as to be unreliable in any given case.

And apparently, my personal observations were correct. Recently, fantasy writer Teresa Frohock invited readers of her blog to identify the gender of the writers of ten different writing samples. Of 1,045 guesses, only 535 were correct – a number slightly above random chance, but well within statistical variation. As Frohock noted, despite all the elaborate rationalizations and the stereotyped ideas that men were more likely to write epic stories and women emotion-driven ones, people were unable to tell men from women based on how and what they wrote.
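For anyone who wants to check the arithmetic, here is a rough sketch of my own (not part of Frohock’s post, and assuming only the totals quoted above) showing why 535 correct guesses out of 1,045 is indistinguishable from coin-flipping:

```python
# A back-of-the-envelope check (my own sketch, not from Frohock's post):
# is 535 correct guesses out of 1,045 distinguishable from coin-flipping?
import math

n, correct = 1045, 535          # totals quoted above
expected = n * 0.5              # 522.5 correct guesses expected by pure chance
sd = math.sqrt(n * 0.5 * 0.5)   # standard deviation under pure chance, ~16.2

z = (correct - expected) / sd   # ~0.77 standard deviations above chance
p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided normal approximation

print(f"{correct}/{n} = {correct/n:.1%}, z = {z:.2f}, p = {p_value:.2f}")
# Prints roughly: 535/1045 = 51.2%, z = 0.77, p = 0.44
```

In other words, pure guessing would produce a result at least that far from fifty percent nearly half the time.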

In other words, exactly what my experience would predict.  Excuse me while I cackle, “Told you so!”

But this subject goes far beyond a mildly diverting observation. The obvious question is: if writing samples don’t reveal who is male or female, why are most people so quick to assume that supposed differences in male and female brains are significant? If the products of those brains are indistinguishable from one another, then the brain differences can’t matter much, either. As often happens when gender is discussed, too many people tell themselves comforting stories, then look for reasons to believe the stories instead of examining the evidence.

Read Full Post »

I was raised in a very English family – a fact that means parts of my education in cooking were backwards. Contrary to the stereotypes, traditional English cooking does have high points, including cheeses, desserts, and Yorkshire pudding, but those points do not include vegetables. For the most part, vegetables are an afterthought, cheap items designed to make the expensive meat go further. Growing up with this attitude, I have had to learn about cooking vegetables piecemeal and on my own.

As times changed and I left home, I did learn a few things. I learned that salads can consist of more than lettuce, and need not contain it at all. I learned that vegetables in general taste better if you don’t boil them until the color is drained from them, and that boiling corn on the cob in particular is an Abomination. I learned that casseroles and stir-fries were often more interesting possibilities.

Yet throughout this re-education, until a few months ago, boiling remained the default option for vegetables, unless I was following a specific recipe. Oh, I’d add spices and sauces, but that didn’t change the fact that when I was rushed or tired, I’d leave little of the original taste of whatever vegetables were unfortunate enough to fall beneath my paring knife.

This default was particularly unfortunate for rice, condemning me either to a soggy mound on my plate or else a burned pot unless I watched and stirred it nervously and turned off the heat at exactly the correct second.

A few months ago, desperation drove me to pour rice into a strainer balanced over a pot of water and covered with a pot. The cooking was so even and the taste so much greater than when I boiled the rice that I tried the same method with various vegetables. I was equally pleased with the results.

I had heard distantly of steam cooking, but vaguely assumed it involved expensive appliances and was impractical for anyone cooking for one. However, an Internet search soon showed that steam cookers were available for well under a hundred dollars. In fact, many models were fifty dollars or less. When no one took up my pointed suggestions for a present, I returned home on Christmas Day and promptly ordered the model of my choice.

Yesterday, it arrived, and I broke off writing long enough to unpack it and check that it worked. As a sometime student of usability, I was intrigued by its egg-shaped base and its tiered shelves, each with a perforated lid complete with indentations for holding eggs and a rice tray. Like a growing number of kitchen appliances these days, the cooker was a thing of elegance, form and function matching perfectly. The only design flaw is that the reservoir for adding water is too small, which prolongs what should be a straightforward task.

Naturally, I had to try my new toy as soon as possible. For dinner that night, I began with a wad of chicken breast. I half-expected that steaming would leave the meat pink, but instead it browned lightly and evenly in twenty minutes, shredding easily in my fingers.

Another twenty minutes did for the vegetables and arborio rice needed for risotto. Mixed with the shredded chicken and covered with tomato sauce and a few spoonfuls of pesto sauce, the result was the most delicious risotto I had managed in seven years of preparing the dish, full of flavor and textures that boiling would have done its best to remove.

Had I cooked both meat and rice together using two shelves, I could have assembled a complete meal in twenty minutes, all without turning on the oven or any burners. Just as importantly, washing the pieces was no more than a matter of running a soapy dish towel over all the surfaces.

That was enough for me. I still have plenty of experiments to try, including cooking fish and adding spices to the water. I can see, too, that coordinating the various parts of dinner and positioning them on the shelves will take some practice. But so far as I’m concerned, steaming is now my preferred cooking method for most vegetables. I’ve entered the Age of Steam, and there’s no going back.

Read Full Post »

I never really knew my maternal grandmother. She died before I was five, and I can no longer separate what I remember from what I’ve been told. But I remember my maternal grandfather, a kind but faintly abstracted man who outlived her by seventeen years. He never remarried, although he could have easily enough, but, now that I’m a widower myself, I imagine that I understand why: he wasn’t unhappy, but after his wife of over forty years died, nothing seemed to matter much any more.

Those who lack his experience or mine might leap to the conclusion that my grandfather suffered from depression, and that I do as well. Even if they can make the empathic jump to the understanding that melancholy would be a more accurate description, they would still be prone to tell us not to give up hope, that we might still find someone else with whom to share our lives.

I don’t know about my grandfather, but I know that I haven’t entirely ruled out that possibility. However, what other people have a hard time comprehending is that I don’t particularly care if I do.

Still, let me try to explain: My attitude has nothing to do with grief. I am not telling myself that I’m staying faithful to the memory of my partner, much less keeping a promise I made to her. If anything, she would have preferred me to find another relationship.

Nor is my attitude a matter of failing health, a reduced sex drive, or any of the other ready-made explanations that some people are no doubt preparing in order to categorize and dismiss me. If anything, I’m fitter today than I have been for over a decade, and as appreciative of good-looking and intelligent women as I have been at any time since puberty.

Okay, I am reluctant to take up again the tired games that most men and women play with each other. I thought them demeaning the last time I was single, and I am even more contemptuous of them now. They seemed to be changing about the time I married, and one of the great social failings of our time is that to a large degree they changed back again.

I admit, too, that it is harder to make time for someone else in my life now that I’m middle-aged. When I was a young adult, everything about my future was uncertain, not just who might become my partner in life. But today, how I earn my living and the pattern of my days are well-established, and I am much less inclined to change my routine to search for someone, let alone to make changes to accommodate someone new coming into my life. I’m more settled than I was as a young adult, and I have far more of what I want.

Almost certainly, self-defense helps shape my attitude as well. When you think you know the pattern of the rest of your life and who will feature in it, then have those assumptions swept away, it is only natural to be wary of falling into such pleasant complacency again. The effort of rebuilding alone is enough to make me uneasy – suddenly reverting to a state I last endured in adolescence is not something I would care to do again. Once is more than enough to instill caution.

Yet all these are secondary. The main explanation is this:

Being married was the central part of my life in my youth and early middle age. I regret none of it, not even the bad times, because they were easier to struggle through in company. Nor is there a day that I don’t miss Trish. But I’ve had all that, which is more than most people can say, so I’m not greatly concerned if I don’t find it again. Almost certainly, the odds are against it.

In other words, being a widower has taught me stoicism. The ambitions that everyone has for themselves, the expectations they have for me and their advice on how they think I should spend my life simply aren’t important to me. I might still manage to do or say some worthwhile thing (although my own ambitions matter less than they once did, too), but whether I do or don’t, it doesn’t greatly matter – not even to me, except in the most abstract sense.

My present attitude is neither something I’m proud of, nor something I feel ashamed about. From habit, I try to step back and describe it as accurately as possible, but trying to change it? Why would I bother? In this attitude, I suspect, I am no different than my grandfather was, all those years ago.

Read Full Post »

Quoting is a delicate art. Depending on your preferences, you can clean up the grammar or elide a few words to make what remains pithier, but what you can never do – at least, not if you have any integrity – is present someone’s words in such a way that you misrepresent their opinions. However, recently I’ve noticed that the claim that a quote is taken out of context is becoming the last refuge of everyone from politicians to social media users trying to distance themselves from something they’ve said that happens to be inconvenient or embarrassing.

Probably, this defense has become popular because of the seriousness with which quoting out of context is viewed by academics and journalists. However, the distinction between legitimate and opportunistic users of the idea of context quickly becomes clear when you look at examples.

First, an example of someone actually quoting out of context. Five years ago, in an article on Linux.com, I wrote, “I’m not a great believer in the idea that women are less aggressive than or interact differently from men. Yet even I have to admit that most of the regulars on free software mailing lists for women are politer and more supportive than the average poster on general lists.”

In the comments, an anonymous poster wrote that he found himself “convinced that Bruce Byfield is single, has no daughters, and doesn’t have a close women friends. The fact of the matter is that (most) women interact differently both men do, in their interactions with both other women and men. If he doesn’t know this, he hasn’t spent much time around women.”

This comment, as another poster was quick to point out, focused entirely on the first sentence I wrote. Even then, he missed the nuance of “I’m not a great believer.” But, even more importantly, by stopping at the first sentence and ignoring the next, which completed the thought I was expressing, he formed an entirely mistaken opinion of what I thought. Instead, he derided me for an opinion that I had never expressed, and made himself look foolish rather than me.

By contrast, recently I wrote an article about how the priorities in GNOME, the free desktop used on Linux, appeared to have shifted. I quoted at length one member of the project who wrote during an online discussion that they were against allowing extensions that would alter the vision of the design team. I carefully mentioned that the discussion had taken place over a year ago, and went on to add that the member was now focusing on other matters, meaning to imply that they were no longer opposing the idea of extensions, and that their previous views no longer prevailed in the project.

The day after the article appeared, the person whose email I quoted denounced me on Google+. I had quoted them out of context, they insisted. I should have asked them for their current view, and I was unprofessional because I didn’t. Yet when I asked them to explain exactly how I had misquoted them, they either would not or could not do so.

I never did get an explanation out of them. So far as they were concerned, I must have deliberately attacked them, and they were under no obligation to explain (although they were apparently quite willing to attack me, and to rant vaguely but ominously about the dangers of discussion on a public mailing list). I suspect that the person in question was now embarrassed by their former views, and was concerned about being associated with them. Perhaps their concern was that others might think they didn’t support the current policy.

My use of the quote took nothing out of context. It was clearly presented as a past view, contrasted with the present, and included several sentences in order to represent accurately the opinions expressed. But, whatever the exact reasons for the person’s reaction, the words “out of context” were a convenient form of denial. Never mind that they could not point to any misrepresentation – by savaging my reputation, they hoped to salvage theirs.

These two examples clearly show the difference between using the phrase “out of context” legitimately, and as a defense. In the first case, going to the original source quickly shows that the context has been misrepresented or misunderstood. In the second case, particulars are avoided for a generalized accusation, and the original discussion is deflected by a personal attack.

Fortunately, the response to cases like the second is exactly the same as for those like the first. In both circumstances, looking at the source immediately shows whether anything has been taken out of context or not. The real danger is when politicians and public figures claim that they were misquoted loudly enough that any methodical debunking of the claim is missed, and they are able to evade responsibility for their own words by launching a misleading counter-attack.

Read Full Post »

As with many men, a daily shave is part of my morning routine. But I didn’t realize how ingrained the habit was until yesterday. I was up at 6 AM, rushing so I could catch the ferry to Gibson’s Landing, when my razor became quieter and quieter, then died out altogether, leaving me with one side of my neck and both cheeks unshaved.

The problem wasn’t a social one. My hair is a muddy brown and my skin reddish, so anyone else would have to get within a few centimeters to notice the incomplete shave. However, so far as my sense of myself went, being half-shaved was a surprisingly strong violation of my self-image.

The problem was not the idea of a beard, although I’ve never been strongly tempted to grow one, even as a young adult. Admittedly, a few days without shaving leaves me with the impulse to scrape the skin off my cheeks and neck in the hopes of stopping the itching. Then, too, a beard would be high-maintenance compared to being clean-shaven, especially for someone like me for whom sweaty exercise is part of most days, and sooner or later one of my parrots would find it irresistible to pull on or climb across.

Nor do I have any desire to add anything to my morning routine that would require me to stare at myself in a mirror just minutes after waking. I simply lack the vanity, and would far prefer using a safety razor while reading.

All the same, I have sometimes toyed with the idea of growing a beard. I associate it with ancient Greek philosophers and playwrights, and a few periods of ancient Rome, so I am alive to the romance of facial hair. If I had ever found myself in the usual time-honored circumstances, such as a week-long camping trip, I would have succumbed to the temptation and endured the skin irritation just to see what I looked like. If nothing else, in my earlier years, I might have tried the look simply in the hopes of looking my age.

However, under almost any circumstance, I would have shaved off any beard in a matter of days. Even though five o’clock shadow is a problem for me, starting the day clean-shaven matters to me. It is as important a part of personal hygiene to me as having clean and trimmed fingernails. Without either, I am vaguely uneasy just under the surface of consciousness, and haunted by the feeling that I am at a disadvantage. My confidence, as flimsy as it is at the best of times, always feels like it is about to buckle and snap unless I am properly shaved.

Unfortunately, yesterday morning I could only endure. I caught my bus, glad it was still dark so my neither-nor state was concealed. Arriving downtown, I was just in time for the start of the Boxing Day sales, and when I missed my connection, I resisted with difficulty the impulse to dart into the nearest department store and buy a razor to use on the ferry.

Somehow, common sense took hold of me. Catching the ferry was more important than my personal preferences, I told myself. The relatives I was going to spend the day with wouldn’t care what I looked like, even if I did. Anyway, it was a holiday, and many men around me hadn’t bothered to shave, although mostly the unshaven were younger than I am, and more obsessed by fashion as well. Never mind that they were trying for a casual elegance and I only felt scruffy.

With a mental grip like an eagle’s talons, I marched myself over to the queue, taking care to make eye contact with the driver, the man at the ticket booth, and the servers in the ferry cafeteria. Resisting the urge to lower my head and scurry through the shadows, I willed a firmness to my stride and tried to project an air of confidence as I approached the relative who was picking me up on the other side of the water.

Then, after exchanging the greetings of the season, I looked my relative squarely in the eyes. “Can we stop by the drug store?” I asked, with just a hint of a self-pitying whine.

Read Full Post »

When people call British Columbia “Lotos Land” or “the California of Canada,” they’re not just talking about the alternative cultures or the casual standards of dress. They’re also talking about the weather in the southwest corner of the province, which has fewer extremes of heat or cold than anywhere else in Canada.

Unfortunately, this reputation has one overwhelming problem: the locals believe it more than the tourists.

Most of the year, this delusion is harmless. Anyone who has lived here for more than a few years is unlikely to carry an umbrella, much less wear rain boots, but the weather is mild enough that going through the day slightly soggy is no great hardship – especially since half the locals have stripped down to shorts and T-shirts at the first sign of the temperature inching above five degrees, so that no dry cleaning bill is involved.

However, denial of rain is one thing, and denial of snow another. Because the average winter has only a few weeks of snow – and, every few years, none at all – the general population has convinced itself that the region never suffers snow at all. Every year, a majority of drivers resist adding snow tires to their cars at the end of October. It isn’t unheard of for local municipalities to forget to set aside money for snow removal, or to run through the entire budget for that line item halfway through winter. And only in the Vancouver area could the provincial government pay $3.3 billion for a bridge so badly designed that snow and ice falling from the cables is a major danger to traffic.

Consequently, the first half centimeter of the season sends the entire region into a panic more commonly reserved for a visit by a radioactive monster from the sea. Within an hour of the first flakes falling, the downtown core is deserted, except for the people crowding the Skytrain stations waiting to flee. Often, they have a long wait because, true to regional form, the system wasn’t designed to minimize the effect of ice on the tracks. One memorable year, the doors iced shut, and a uniquely Canadian solution had to be found – beating the doors with hockey sticks to knock the ice off.

Meanwhile, on the roads, the refugees from the office towers are demonstrating their total ignorance of physics, sliding over the snow in their summer tires and slamming on the brakes every thirty meters. Soon, cars are being abandoned in the middle of the road. Occasionally, someone from back east can be seen holding themselves upright on the frozen lampposts, unable to stand because of the helpless laughter that has possessed them as a few stray flakes of snow cripple a city. The easterners have seen real snow storms, and driven in them, too.

The next day, as likely as not, half the city will take the day off on the excuse that no one can get into work. This response to the weather fits well with the casual work ethic, but it’s not just an excuse. The chances are that only the major roads have been ploughed overnight, and getting to them can take hours.

Even if you leave your car at home, your odds of getting anywhere are remote. No municipality clears sidewalks, insisting that home and store owners must do so. Most do not.

As for public transit, forget it. You’re lucky if a few extra buses or Skytrain cars are put into service. And, even if you are lucky enough to find a place on a bus that takes you where you need to go, water is running over its floor as slick as any ice, and the steam rising from people’s clothing leaves you half-blind and disgusted by the prevailing levels of personal hygiene. All you can do is bury your face in the old scarf you hastily pulled from the bottom of the closet last night and do your best to avoid eye contact.

All this is discouraging enough, but it gets worse. Of those who stay home, few will spend the extra leisure time winterizing their cars. Instead, what happens is that most people get an unexpected holiday, and the snow disappears in a freezing deluge of rain that floods the streets for a day or two.

Then, like trauma victims everywhere, the locals promptly forget their experiences. A few weeks later, they go through the whole experience with the same details, and again a few weeks after that, until the cherry blossoms appear, and the regional delusion comes slowly into some kind of rough sync with the weather and reality.

Read Full Post »

In the last year, my life-long habit of playing games has diminished greatly. I am vaguely disturbed by this turn of events, because I can’t decide whether it is a mark of maturity or of having lost something.

So far as I can remember, the habit began with playing chequers with my maternal grandfather. He was never condescending enough to let me win, but held back enough that I always hoped I could win next time. Early in elementary school, I started chess, usually winning although I never systematically learned opening moves or defenses – in fact, I felt that doing so was next to cheating.

Then, some time around the age of ten, I discovered Avalon Hill Games. Nowadays, the imprint seems given over largely to variations of Axis and Allies, but, at the time, it had games based on everything from the Battle of Jutland and the Battle of Britain to the American attack on Guadalcanal and the street-fighting during the German invasion of the Soviet Union, all on such an abstract level that I could forget about the implied bloodshed and concentrate on the strategy, as well as ignoring the implied American jingoism I sensed in some of the games.

An Avalon Hill Game could take hours to set up, hunting for the right units. It could take hours to play, too, which made it perfect for a long afternoon under a tree with whatever friend I could convince to try the game. But ending the game wasn’t the point to me. What mattered was learning the lay of the land, and the names on the little squares and rectangles. To this day, I can still remember many of the names of the unit leaders on both sides at Gettysburg, and the names of Caesar’s commanders during the siege of Alesia (although I now wonder if some of those were false). In some ways, they were as good as reading.

I never could find many who would play with me, so I often played against myself, taking each side in turn, a practice that may have helped me to write impartially about complex issues. Often, I played against myself when I should have been doing homework, or my own writing.

When the first arcades came out, they were very nearly my downfall – especially the closest one to where I was living. I spent far more quarters than I should have, totally fascinated at the same time I was aware of the banality of most of the games.

Computer games were safe, although in the early days I was inevitably disappointed in the graphics. The best, like the various releases of Civilization, were like extended versions of the Avalon Hill games – ones that I didn’t need to set up. For a couple of years, I kept a Windows partition largely to play games, although I practically danced through the living room when Loki started releasing Linux versions of popular games.

Fortunately for my time management, when I switched entirely to Linux, few games were still available, although I occasionally wasted time on the Battle for Wesnoth. Even more fortunately, I never quite got started on online roleplaying games; from the couple of times I wrote about them, I’m guessing I would have had a serious problem.

But in the two and a half years since I was widowed, I haven’t had time for more than a few games of solitaire or backgammon each day to get myself thinking in terms of possibility. Increasingly, I’ve had no time at all, and I’m not sure what to make of the fact.

On the one hand, I worry that this change of habits might be a suppression of the imagination. Like any other faculty of the human brain, the imagination seems something that needs to be exercised. Am I growing dull? I wonder. Letting my imagination and perception stagnate for lack of stimulus?

Or is moving away from games a sign that I am overdue for ending my preparation for life, and getting on with the real thing? Maybe, as I get on with practical things, I don’t need to prepare so much. I might be too busy getting on with my own business.

I suppose a third alternative is that I’ve been running on the enhanced emotions of grief, until no mental stimulus is effective. But I’m being cautious about finding out, because none of the alternatives appeal to me much.

Read Full Post »

Writers are supposed to have a history of different jobs, and I’ve always done my best to keep that tradition alive – even before I started writing professionally. But looking back, I see that three turning points have brought me to where I am today, each of which was driven more by desperation than careful planning.

The first was my decision to return to university. I had finished my bachelor’s degree in English and Communications four years earlier, but I had done absolutely nothing with it. I was working part time in a book store, because, after five years of university, I was burned out. Just as importantly, I had no idea whatsoever how I wanted to make a living. But the two years off I had promised myself had turned to four, and I was feeling trapped, and more than a bit of a failure. Book stores, I had discovered to my surprise, were not about loving books at all, but selling them, and I might as well have been selling slabs of raw chicken breast in a butcher shop.

Not having a better idea, I listened to the common wisdom, and decided that what my double major amounted to was the first steps in the qualifications I needed to teach. That seemed plausible; I had given poetry seminars at my old high school, and they had gone over well. So I scraped up the recommendations I needed, and applied to both the faculties with which I had an association.

For a while, I considered studying parrots in the Communications Department, and even wrote to Irene Pepperberg, the recognized expert in the field. But my moment of desperation was in December, and the department only took new grad students in September.

By contrast, the English Department would let me start in January. I still wasn’t sure exactly how I would use a second degree, but I could be paid as a teaching assistant while getting it, and the pay and the responsibilities were much better than at the book store. So, for four years as a teaching assistant and seven as a lecturer, I taught, finding the job mostly satisfying, aside from the necessity of occasionally having to fail students.

Slowly, however, that became a dead end job, too. Unless I took my doctorate, the best I could hope for was a non-tenured position as Senior Lecturer, and even those jobs were rare unless my partner and I were prepared to move. We weren’t, and I realized that I was not only in another dead end, but one where my choices could be limited by the whim of the department chair.

Trying to ignore the despair and panic nibbling at the edges of my thoughts, I attended a Saturday afternoon seminar on technical writing at Simon Fraser University’s Harbour Centre campus. My partner picked me up so we could have dinner at her parents’, and, as we drove down the highway through Richmond, I summarized what I had learned about the profession to her.

A long silence fell as we considered the possibilities. Then suddenly, I turned to her and said, “You know, I can do this.”

So I did. I did it so well that I worked largely as a freelance. In four months, I more than doubled my income, and, in nine months, I was hiring sub-contractors. Unlike a surprisingly large number of technical writers, I had realized that success required the ability to learn my subject – and if there was one thing that all those years of school had taught me, it was how to learn. I also discovered that being able to offer technical writing, marketing, and graphic design made me an all-in-one package that many employers found irresistible.

But that success made me an executive at a couple of startups. When they crashed, I found it hard to return to my base professions – I could see the mistakes that business owners were making, and I had just enough sense to realize that my opinions wouldn’t be welcome, particularly since I would have been correct more often than my employers.

I endured two particularly dreary jobs at companies that were being led into the ground. Then, one afternoon, walking along the Coal Harbour seawall in the autumn sunlight, I realized I could no longer be even conscientious about helping to prepare mediocre proprietary software.

I was already doing the occasional article for Linux.com. Now, in my desperation, I asked Robin (“roblimo”) Miller if he would take me on full-time. To my unending gratitude, he agreed to let me try. At first, I thought I would never manage the dozen stories he expected per month, but soon I was not only meeting my Linux.com quota, but writing another six to eight stories every month. Writing, I discovered, was like everything else, becoming easier with practice.

Looking back, I see that each of these turning points brought me a little closer to the work I did best, even though I didn’t realize at the time what was happening.

More importantly, I realize that none were the rational, carefully planned moves that typical career advice suggests you make. In fact, if I had followed that typical advice, I would probably still be at the book store. At each of these points, what motivated me was my unhappiness with my current position, and a realization that taking a leap into something new was no more chancy than staying where I was. It wasn’t ambition, or careful planning that made me move on – just a dim sense that I wasn’t where I wanted to be, and a growing awareness that I really had nothing left to lose.

This, I suspect, is how most people make such decisions. These days, when someone comes to me asking for career advice (as happens two or three times a year), I don’t tell them to plan their career moves rationally. Instead, I ask them how they want to spend their lives, and what risks they are prepared to take to do it.

Read Full Post »

Growing up, I had more than my share of prizes. Book awards for school work, ribbons and medals for running and other sports, scholarships – they all came my way not once but many times, and having developed a bit of an inferiority complex due to an early speech impediment, as a boy I was more than glad to accept them. Yet by the time I graduated from high school, I had developed a dislike of prizes, and resolved to think twice about accepting any. It’s a dislike that has only grown over the years.

Part of the reason I object to prizes is that they can become a motivation in themselves. As an inwardly-directed person, I am convinced that a person ought to do things because they are right, or contain their own sources of satisfaction.

By contrast, if you do something to win a prize, then you are abandoning your responsibility to judge your actions. If winning is your motivation, then you can easily end up doing anything that is necessary, regardless of the ethics or morality of the situation. You end up acting to please those who are giving out the prize, abandoning your personal integrity for something much less important.

Another reason is that, while most prizes are supposed to be based on merit, very few of them are. I first realized this fact in Grade Three, when I did not win the book prize for my class – not because I didn’t deserve it, the teacher explained to my mother, but because I had won in the previous two years and I would be a good sport about giving someone else a chance.

The teacher was right. Even back then I had too many ideas about being a good sport to let my jealousy show. But I did think that such rationalizations made the whole idea of competition meaningless, and I never thought quite so fondly of that teacher as I had before.

Having seen a prize devalued so obviously once, I had no problems noticing when the same thing happened again and again. For instance, I remember my bemusement when writers started lobbying their fellow members of the Science Fiction Writers of America for the Nebula Award, a prize given for excellence in writing. Similarly, a few years ago, I saw the most promising student in an art program passed over because the teachers disliked them.

If the rules are broken, I keep thinking in such circumstances, then the prize itself is devalued. And what makes the situation worse is that the rules inevitably are broken. A prize may start out being for excellence, and stay that way for a year or two, but, sooner or later, decisions are made on the basis of who is the most popular, or the most well-known. Or maybe the recipient is chosen to make a political point, or to avoid giving the prize to someone else. No matter what the reason, once an award is given for any reason other than merit, it ceases to have worth.

Still another reason for my attitude is the fact that, for parts of life that really matter, the idea of competition is meaningless. I first understood this simple fact when my correspondent Avram Davidson, the great American fantasist, won the World Fantasy Award for Lifetime Achievement, and I wrote to him, “I understand that congratulations are in order.”

In his next letter, Avram shot back, typing in his usual haphazard way, “Congratulations are NOT in order. I told them that if nominated I would not stand, and if elected I would not serve. I would have thought I made my position pretty clear, typos and all.”

The next time we met, Avram explained that, having achieved a certain literary reputation, he felt that competition was meaningless. He could not hope to write a Fritz Leiber story, or a Theodore Sturgeon story, and neither of them could ever hope to write an Avram Davidson story. True, a particular editor might have to choose between one of them for reasons of budget or available space, but such decisions had little to do with the quality of whatever works happened to be involved.

It was undignified, Avram concluded, for writers who had reached the height of their craft to go grubbing for marks of recognition. So he did not attend the ceremony, and, when the convention team wanted to send him the bust of H. P. Lovecraft (a writer he despised) that went with the award, he grew strangely forgetful about his mailing address. Eventually, the bust did arrive in his mailbox, but he buried it somewhere inconvenient among the books and papers that made up his apartment.

Listening to Avram was enough to silence any lingering doubts. The logic was irrefutable, and the position more classy than I can easily explain. Aspiring to be, if not a peer, then at least an accepted colleague of people like Avram, how could I take any weaker a position? Besides, I was already favoring a similar outlook, so the adjustment wasn’t exactly difficult.

I now believe that the only legitimate reason for a prize is to help someone who needs and deserves the help – preferably by giving them money, but, at the very least, by giving their reputation a boost. But to position myself to win a prize, or to accept one, would make me despicable to myself, and I would rather be able to look at myself in the mirror in the morning than court a brief popularity with other people.

I know – nobody’s offering me one. But ask me if I care. I would rather have the satisfaction of knowing I did the best job I could in the circumstances than win the most grandiose prize imaginable, regardless of whether anyone ever knows what I have done or not.

Read Full Post »
