Archive for July, 2008

Hospitals are not my favorite places at the best of times. They are such concentrations of pain, stress and raw emotion that I barely need twenty minutes before I start feeling emotionally overloaded in them. But, after last week, I have reason to like them even less.

Thanks to the overcrowding that has become the norm, Trish had to share a semi-private ward with a man having psychotic episodes. This circumstance is not (I say with understatement) recommended for someone who has just had major surgery.

At first, he seemed normal enough. Possibly, I thought on my visits, he was a little simple, not being able to distinguish his current hospital stay from past ones or give his doctors and nurses much information about himself, but I could hardly blame him for that. Even when he insisted on giving a half-incoherent, half-rambling reply to every comment made in the room, I dismissed his behavior as annoying but harmless.

Then, last Friday night, he went off like a bomb, trying to tear out his IV and catheter and other connections and struggling to get out of bed (which, fortunately, he was unable to manage). At first, he seemed to think he was in a war movie – and, before long, the movie became real. He seemed to believe that the Chinese had landed troops in British Columbia, and that he was on a boat that was shelling their positions. A little later, after nurses and security swarmed around him and tied him to his gurney, he seemed to believe that he was in a town called Dawson, where he had been taken prisoner and was being tortured for information.

Between swearing and shouting abuse, he made his plans out loud. He would pretend compliance, he said, so he could escape. He would even eat the food provided – although it was undoubtedly poisoned – but just enough to stay alive.

And Trish? In her roommate’s delusion, she was pretending to be his mother to trick him. She came in for a share of the swearing and abuse. She managed to get some sleep after the nurses brought her some ear plugs, but trying to sleep less than two meters from such events is not exactly restful.

Nor could she help thinking what might happen if her roommate got loose – he may have been too weak to walk far, but he might still have reached her.

The next day, the hospital found a nurse to sit with the man, and Trish finally managed to get a few hours’ sleep. She also spent as much time as she could manage outside the room. Her roommate was mostly sedated, but he was still rude and angry when awake.

By the time Trish came home on Monday, she was more than a little tense. We weren’t sure she was healed enough to go home, but she wanted out of that room badly.

I don’t blame the nurses for what happened. They do the best job they can in trying circumstances, and, anyway, surgical nurses aren’t experienced in dealing with psychiatric patients. I’ve often thought that the medical system would be more equitable if doctors’ pay were halved and nurses’ pay doubled. They do a job that I would flee screaming from after half a shift.

But I do blame the organization of the medical system, and the budget cuts to it, for the fact that such a patient was put in with another who could only be traumatized by his behavior. The next time someone claims that the British Columbia health system is fine, I’m going to reply with this anecdote. It’s one that would be compelling as a Stephen King short story – but even King would have trouble convincing readers that such a real-life incident could happen in fiction.

Read Full Post »

Something has always bothered me about so-called celebrity bloggers, but I’ve never been quite able to identify it. I’ve vaguely thought that a lot of fuss was being made over very little, but never troubled to clarify the impression. The other day, though, I made a mental connection that explained why I was unimpressed.

When I was a university instructor, I did more than my share of first-year composition. When you’re new and being hired by the semester, that’s the price of clinging to the edges of academia. But the point is that, in most semesters, I would encounter students who had passed high school simply by completing every assignment. A few had even got scholarships because they had completed every assignment at exhaustive length. Often, some of these students would do poorly on their first few assignments – and, when they did, they couldn’t understand why. My explanation that, at university, you got marks for what you accomplished rather than what you attempted might have been talking in tongues for all the sense it made to them. How could they not pass? They had done the assignment, hadn’t they?

Too many celebrity bloggers, I concluded, were like these students. To a surprising degree, what they are known for is not for writing about interesting topics, or for insightful comments, or even for pithy turns of phrase, but for blogging and nothing else.

I remember that, at one networking event, the organizer announced that a celebrity blogger would be live-blogging the event. Immediately, everyone applauded, while the blogger looked around modestly. The blogger didn’t participate much in the event, being hunched over the keyboard of a laptop all evening, so naturally I expected some clear and concise reporting, if not original insights along the lines of Joseph Addison’s or George Orwell’s.

What I found the next morning was an unfiltered stream of consciousness, perhaps of interest to the blogger’s friends, but no more intrinsically interesting than a conversation overheard on the bus. Authentic it might have been, but it was also a well-bred bore, with little except basic literacy to recommend it.

The blogger, I realize now, was famous for blogging – not blogging well, but simply blogging. And, like the high school kids whose world view I used to detonate, to the blogger and their audience, that was supposed to be enough.

This impression was confirmed by a recent local blogathon, in which a number of these celebrity bloggers posted an entry every half hour for twenty-four hours, each trying to raise money for a favorite charity.

As a fund-raising idea, the blogathon seems futile and full of self-importance. Most people simply aren’t that interested in blogs. In every case where I could find figures, the amount of money raised was less than my average charity donation (and I’m far from wealthy).

But what matters here is how the effort was regarded. The organizer referred to participating in the blogathon as a “sacrifice” – mostly of time and sleep – when really it was nothing of the sort. It’s not a sacrifice when you get something in return, and, in my view, the sense of excitement and importance participants obviously received removed any sense of sacrifice from their efforts. And while such efforts are interesting when someone as accomplished as the American fantasist Harlan Ellison undertakes them as a calculated bit of grandstanding (he has, for example, written in the window of a book store), I couldn’t help noticing that, in the case of the blogathon, what mattered was producing the requisite number of entries, not their quality.

Is anyone surprised that, except for an entry from a blogger who specialized in humor and one or two others, the entries were almost entirely void of interest for anyone except perhaps the bloggers and their immediate friends? Despite the popularity of personal journalism these days, it takes an expert to write a personal essay that interests acquaintances or strangers, and these didn’t. As Attila the Stockbroker used to say, it would take a mentally subnormal yak to care about most of the blogathon entries.

But that didn’t matter. What the blogathon participants cared about, like my composition students, was the fact that they had completed the assignments.

I don’t mean to insult celebrity bloggers by this observation. I’m friendly with one or two local ones, and, away from their obsession, some of them are interesting enough people. If they or their friends get pleasure from such entries, who am I to say that they shouldn’t? But I do mean to say that what they are doing is played by relaxed rules, and that I’m not interested in imitating them.

For me, playing by real world rules is the only way worth playing. That doesn’t necessarily mean being paid for your writing (although few reactions suggest more strongly that you are writing to at least a minimal standard than having someone buy the right to publish you). But unless my concern is to catch the interest of others with every trick I can muster, risking failure as I do so, then I’m no better than a high school student expecting to be rewarded just for trying.

That’s fine for practice. But high school was a long time ago, and I prefer to operate by real world rules. If the risk of failure is greater (and I’m the first to admit that I’ve failed many times), then so is the chance of a truly satisfying success (and I’ve had a few of those, although far fewer than my ego likes to admit). In the end, what matters to me is not how much I write, but the reception it gets from readers.

Otherwise, in my own estimation, I am no better than those owners of one-person companies who call themselves CEOs – self-aggrandizing, lacking self-perspective, and more than slightly pathetic.

Read Full Post »

Ever since I learned to read, I’ve been a chain reader, sometimes literally finishing one book and picking up another one. Books have been my refuge from the bleakness and bad news of the day, a way to while away time while in line at the store, and my companion on constant rides on transit and planes. I even shave while reading to alleviate the boredom of the task (obviously, I use a safety razor). And, inevitably, I re-read.

The first time I read a book, I may have many motivations. Obviously, I need to have an interest in the topic or the writer, but I’m not a very discriminating reader, so that hardly narrows down why I might read something – everything from graphic novels to Middle English poetry might seem interesting to me in different moods. At times, I read because the writer has a reputation, and I want to push back the boundaries of my ignorance a furlong or two. At other times, I read because I’ve been given a book (I count heavily on friends to urge on me books that I might not pick for myself, and, often enough, I find myself pleasantly surprised). Still other times, I read because nothing better is at hand.

However, why I re-read is easier to delineate. I rarely re-read non-fiction from cover to cover, although I might return to particular pages when researching or needing to prod my memory. Mostly, what I re-read is fiction. If I was trying to be a snob, I would claim that I re-read only worthwhile books, but that would be a half-truth. Unless my tastes change, I doubt I’ll re-read standards of the literary canon like Henry James or Anthony Trollope; I recognize that their writing shows some skill, but, like opera, it’s a skill I recognize without appreciating.

It would be more exact to say that I re-read fiction whose craft has impressed me, regardless of how the canon regards it: Charles Dickens, but also Wilkie Collins; John Fowles and Lawrence Durrell, but also any number of writers who labored their lives away in the science fiction ghetto.

What others think of my taste makes little difference to me (although I confess I can’t quite bring myself to read graphic novels on the bus). Instead, what matters is that the work shows some skill. The over-maligned Stephen King, for instance, is a master of pacing and the observation of Americana – two skills that are usually missing from the academic’s checklist for greatness, but which average readers reward unconsciously by purchasing his work.

However, the books I re-read the most are those that not only give aesthetic pleasure, but also reinforce my world view. Three books (or series) in particular come to mind: T. H. White’s The Once and Future King, which shaped my sense of right and wrong; J. R. R. Tolkien’s The Lord of the Rings, from which I learned the core values of endurance and rising to the occasion; and Patrick O’Brian’s Jack Aubrey and Stephen Maturin novels, which idealize friendship and a detached but amused view of the world while also offering historical adventure in the 19th century British navy.

Probably, you could gauge my character very accurately, not only from the common nature of these books – none, notably, have a modern or mundane setting – but also from the number of times I’ve re-read them. White I’ve re-read at least a dozen times since childhood, and Tolkien – the last time I checked – over 33 times, a number that astonishes me as I write it. By contrast, I have only read the twenty or so novels in O’Brian’s series three times through, but, then, I came to them much later than the other two, and they probably amount to two or three times the words of White’s and Tolkien’s classics. I’m re-reading O’Brian now, savoring favorite lines (“Jack, you have debauched my sloth”) and finding new subtleties.

The chances are, I’ll re-read all three – to say nothing of other favorites – many times in the rest of my life. However, I doubt I’ll re-read any of them as many times again as I already have. As I grow older, I am more jealous of time, and more aware of all that I have still to read. In fact, probably a new book has to impress me more than my classics did before I’ll re-read it in preference to moving on to something new. But a change of heart or a prolonged illness might change that, and, even if they don’t, I still expect many hours of pleasure ahead with my old favorites.

Read Full Post »

“Let us now compare mythologies”
-Leonard Cohen

One advantage of blogging, I find, is that it reveals my personal mythology. A single entry might not do so, but if I look over a few dozen entries (something I rarely do, because the urge to edit and improve is almost irresistible), a definite pattern starts to emerge.

When I talk about mythology, I’m not talking about lies. Rather, I’m talking about myths in the anthropological sense – stories that explain where you come from and why you do things in certain ways. In this sense, whether a myth is true or a lie is only of secondary importance. What matters is whether the myth sustains you and gives a sense of identity.

For instance, to an American, does it really matter whether America was settled by the best and brightest from older countries? Or, to a feminist, whether a prehistoric universal matriarchy ever existed? You can examine and even debunk such stories – and there can be a certain satisfaction in disproving what everybody knows – but you won’t be thanked, and your proofs will not be welcomed (if anything, you’ll be pilloried). What matters is not the objective reality of myth, but the sense of identity it gives a culture or an individual. If a story helps to sustain identity, that is all that really counts.

So what is my personal myth? Looking through blog entries about my past, I’d say it could be summarized in five words: triumphing after a bad start. Or, in a single word: endurance.

Time and time again, the narrative I tell about myself begins with me doing something badly. Often, I am humiliated by how inept I am. But I am determined, and through perseverance, I make myself competent and even highly skilled where I was once inept.

Considering this story more closely, I find that it has all sorts of implications. For one thing, it’s not a story tied to a particular group or set of circumstances; instead, it’s about attitudes and applicable to a number of situations. Since I’ve always considered myself a generalist with a broad array of interests, I’m fascinated to find that view reflected in my personal myth.

For another, it’s about education – again, not surprising considering that I’ve always believed in education for its own sake, and research is what I currently do for a living.

But what I find most interesting is that my myth emphasizes persistence. It makes no claim to brilliance or talent. Natural ability isn’t even a consideration. Instead, it’s about learning from mistakes and not giving up. Learning to speak, learning good handwriting, becoming a high school running champion, finding the right profession – time and time again, the story I tell myself is about plodding along until I do or find the right thing.

That’s not surprising, I suppose. The one story I knew about someone with my name when I was growing up was the one about Robert the Bruce learning persistence from a spider. And, as a distance runner, I had concrete knowledge of the importance of endurance, because it wasn’t speed or even strategy that won races so much as the ability to keep going. But, until now, I hadn’t realized how deep-rooted such values were inside me.

In fact, I’m not sure that this is a myth I would have consciously chosen for myself. It has limits, such as a distrust of anything that comes too easily. Perhaps, too, it suggests a lack of confidence, and an expectation of failure the first time. It certainly dropped me into the worst stress that I have ever endured in my life.

Nor, now that I have opened up the myth to examine it, am I completely sure that it is always true. I can think of exceptions to the myth, and, looking back, I think I can see places where I have tugged the raw material of my life to make it fit into the myth better, like the corner of a sheet on a bed. In other places, I suspect I’ve exaggerated or even made up things out of whole cloth.

Still, for better or worse, the myth is mine. And like all myths, what matters in the end is that, on some level, I’ve made it a part of me.

Read Full Post »

Today, I suddenly realized that I was enjoying swimming – enjoying it immensely. The reaction comes as a surprise for several reasons.

To start with, I learned to swim under what I remember as the most miserable conditions when I was a child. In my mind, all the swimming lessons I took in the local outdoor pool occurred in the pouring rain and freezing cold, when all I really wanted to do was stay huddled in my towel in the cabana where the class met.

To make matters worse, I was a poor learner. Or so I thought, because I took forever to struggle up the hierarchy of lessons. It was only in my last year of lessons that I had an instructor who was built like me, with a long torso and short calves, and that I realized that much of what I was learning was useless for anyone of our build. The instructor taught me some alternate kicks that actually worked, so I could tread water for the first time in my life.

Yet, even then, I didn’t care much for the crawl, which was the dominant stroke in those days. I found the swift glimpses above the water disorienting, and I didn’t care much for the sensory deprivation of swimming in general. For years, my main technique was a modified breast stroke that kept my head above water.

Then, just to make me even less inclined to enjoyment, I started swimming regularly a few years ago when I realized that I needed a more varied exercise regimen if I hoped to spare my much-battered knees more wear. After years of long-distance running, swimming was definitely second best, and something I endured more than I enjoyed.

Several things have made me change my mind, though. For one thing, after swimming daily since the Victoria Day weekend, I’ve reached the point where I fall into a rhythm while doing my laps, and don’t have to think about what I’m doing. It’s only at this point, I’ve learned from other exercises, that working out stops being a grim duty. However, I’ve reached that stage every summer for the past few years without more than mildly enjoying my swimming on most days.

But, over the past couple of weeks, the weather has turned hot suddenly, without any gradual build up that would let me get used to it. Walking from an air-conditioned building to the outside, I can feel the heat wrinkling away from me as though it’s a skin that I’m shedding, and, after a run or a session on the exercise bike, my singlet is a sweaty mess that disgusts even me. Under these conditions, the coolness of the pool is luxurious. When I duck my head completely under, a delicious ring of coolness seems to encircle my forehead and temples.

Most importantly, this year I’ve been under considerable stress for several months. While most of the time sensory deprivation seems hellish to me, this year, as I cope with stress, it’s relaxing. In fact, it’s so relaxing that I’ve dropped my modified breast-stroke for the proper thing, dipping my head into the water and coming up for air. Propelling myself face down along the pool, I can see reflections from the sun, like a shimmering chain link fence of gold along the bottom, and not much else. Now, it’s a glorious sensation, being cut off from much of my usual sensory input while feeling my legs and arms moving in rhythm.

I’ve got to the point now where I can swim two kilometers, and, although my muscles know they’ve had a workout, I feel like I could easily do as much again. I especially like the solitary feeling, because the gym where I ride the exercise bike is usually so full of inconsequential chatter and posturing.

What I will do when the pool in my townhouse complex closes in the fall, I don’t know. But I’ll want to make some effort to find another convenient pool for the winter months.

Read Full Post »

At the start or end of my morning run, I often meet one of my neighbors running for the bus. He works as an on-call English instructor at various institutions around the city, and often gets the call to teach at the last moment in the morning. He doesn’t seem to mind, except for the irregularity of his pay, but I never meet him on his way to work without being thoroughly thankful that my own days as an itinerant instructor are long past.

For the last few decades, most people with a graduate degree who hope to have a career in academia spend at least some time as a sessional instructor, scrambling for each semester-long contract, and often scrambling between community colleges during the work week to cobble together something like a regular pay cheque. My own experiences include a semester when I bused out to Fraser Valley College (as it was then) two days a week, and more than one time when I had fourteen hour days in which at least two or three hours were spent travelling. That’s time that I could have dearly used for marking or lesson preparation.

Sessional instructors have the lowest rank in academia, and everything about their working conditions reminds them of that fact. Often, they don’t get a teaching assignment until a week or less before the semester starts – sometimes the night before. They get paid half what tenured faculty get, and often do twice the work, since they frequently teach lower level classes with more students. Most of the time, they have to share offices – or even study carrels in a crowded room. Officially, they don’t get paid for research, yet, if they don’t publish, they have less chance of being hired. Similarly, they are looked down on because their focus is teaching, but, unlike a tenured professor, they lose their position if their student evaluations are poor. Their rehiring is at the whim of their department, which means that wise sessionals will spend hours at every meeting and function, even though they have no voice. Yet they endure all this in the hopes that one day they’ll rise to the height of being a lecturer – which means they’ll be doing the same work for about the same pay, but not having to scramble for it. Meanwhile, they dream of winning a tenure position and dwelling in the halls of academia forever.

Sessional work is especially hard at the community colleges. For one thing, the classes are larger than at universities, and more assignments are required. For another, community colleges – even now, when they have morphed into degree-granting institutions of a kind – are often the continuation of high school under another name. Faced with the choice of finding a job or going to college, many middle-class kids will immediately register for college, which is cheaper than university and easier to treat lightly. For the sessional instructor, that means that the lesson that works in the more serious atmosphere of university has to be largely remade for use at a college. In fact, when I retreated after a few years into teaching only at university, the first thing I noticed was how much lighter my work load became. And I needed the respite, because, although I was young and healthy, the work was steadily grinding me down, especially since I needed to teach year round in order to keep above the poverty line.

Despite these disadvantages, I loved the work, especially dealing with the students. I was kept going, too, by a vague promise at one university that I would eventually be hired for some kind of full-time position. Dozens of Baby Boomer teachers would be retiring any day now, I was continually told – and when they did, I would be first in line for their jobs, because I had established myself as an effective teacher.

Then, slowly, the dice started being loaded against me. My non-dogmatic approach to criticism was out of fashion with the then-dominant Post Colonialists, and, although I muttered jokes about being the token humanist, I was increasingly looked at askance. Then the chair changed, and the new one announced that, instead of reserving sessional positions for those who had proved themselves, the department would use the positions in order to trade favors for its grad students at other universities. Suddenly, my income became precarious. And, right about then, I noticed that, when tenured staff retired, they were either not being replaced or else being replaced by relatively lowly lecturer positions.

Seeing the writing on the wall, I made the jump to technical writing. From there, I was so busy leapfrogging into marketing, consulting, and eventually journalism that I’ve had little time to look back. If my new work was just as unsettled, I appreciated that it paid much better, although ordinarily I have only a minimal interest in money. It also had new challenges, such as taking on responsibility for large projects and developing customer relations.

Still, when I do look back, I sometimes wonder where I would be if I had stayed on the fringes of academia. Then I look at my neighbor and other people who started as sessionals the same time as I did, and I have my answer: In exactly the same place that I was.

Read Full Post »

The Myers-Briggs Type Indicator, in all its official and unofficial forms, is one of the most popular personality tests in the world today. It is widely used by human resource managers, student counselors, and just about any other kind of person who needs to assess others. Indirectly, it frequently determines whether people are hired or promoted, or get the break for which they have been waiting. Yet, for all its widespread use, I remain deeply skeptical of the basic concepts behind Myers-Briggs testing. Not only does it seem too simplistic and scientifically unsound, but, if my results are any indication, it fails to give a consistent enough picture of personality to be reliable.

As you may know, Myers-Briggs testing is based on four axes: Extraversion (E) and Introversion (I); Sensing (S) and Intuition (N); Thinking (T) and Feeling (F), and Judging (J) and Perception (P). The poles you are closest to can be put into a four-letter abbreviation that summarizes your general tendencies, such as INFJ.
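The mechanics of that four-letter summary are simple enough to sketch. The following is purely my own illustration of the idea, not any official Myers-Briggs scoring method (the axis names come from the paragraph above; the score values are invented): pick whichever pole of each axis you lean toward, and concatenate the letters.

```python
# Illustrative sketch only: build a four-letter type label from four axis
# scores. A positive score leans toward the first pole of the axis, a
# negative score toward the second. The scores themselves are hypothetical.
AXES = [("E", "I"), ("S", "N"), ("T", "F"), ("J", "P")]

def type_label(scores):
    """scores: four numbers, one per axis, e.g. (-3, -5, -2, 4) -> 'INFJ'."""
    return "".join(first if score > 0 else second
                   for (first, second), score in zip(AXES, scores))

print(type_label((-3, -5, -2, 4)))  # INFJ
```

The sketch also makes one of my complaints below concrete: a score of -1 and a score of -50 collapse into the same letter, so the label hides how close to the middle of an axis you actually sit.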

But why these four scales, I have to wonder? Does anything in the study of psychology suggest these four dichotomies over any others? Why is there not a fifth category, such as visual versus written learning methods? Or physical versus mental activity, or types of organization?

While some additional implications are added to the four axes in the less superficial versions of Myers-Briggs, the basic structure seems largely arbitrary. The categories are clearly not based on anything impartial. If I show people four blue circles and two red squares, those who are not color-blind or altogether sightless will agree on what they are seeing, but I doubt that people unacquainted with Myers-Briggs will naturally divide personality into four types, or that, if they do, their four types will correspond to Myers-Briggs. That’s why, for all the popularity of Myers-Briggs, many psychologists do not give it much credence.

Similarly, when I notice that Myers-Briggs gives sixteen basic types of personality, my first reaction is that at least that’s four better than horoscopes manage. The comment is unfair, since people can be anywhere along these four scales, which provides for more variety, but the impression of over-simplicity remains. What, I always wonder, is not being measured by Myers-Briggs? And what is distorted because it registers on the tests but the tests are unable to diagnose it successfully?

As for the either/or questions that make up many forms of Myers-Briggs testing, don’t get me started. For many people, these are false oppositions. Albert Einstein, for example, preferred solitude for work, but could be extremely gregarious when his work was done. I suspect that most people do not think in terms of either/or on many questions of preference, either – I certainly don’t.

Another point that is deeply misleading is the common contention that Myers-Briggs testing is based on the work of Carl Gustav Jung. Being one of the few people I know who has ever read Jung, I strongly suspect that this claim stands only because almost no one is acquainted with Jung.

In fact, the main precursors to Myers-Briggs that you find in Jung are some simple diagrams with two axes (Intuition / Sensation and Feeling / Thinking), and a separate discussion of extraverts and introverts. The addition of Judging/Perception is the work of Myers and Briggs, and so is the codification – Jung was throwing out conceptual ideas rather than ones that could be observed and given a score. These conceptual ideas can be useful – which is why “extrovert” and “introvert” became part of everyday English – but Jung shows no signs of seeing his axes as something that can or should be directly analyzed. In many ways, Myers and Briggs have gone so far beyond anything that Jung intended that the insistence that their work is in any way Jungian seems nothing more than a desperate attempt to invoke one of psychology’s great names to bolster a rather dubious theory. In other words, it’s a marketing ploy – and, personally, I tend to mistrust anything with misleading advertising.

However, the real reason I distrust Myers-Briggs testing is that my results can vary widely, not only from test to test, but also from day to day. Looking at the first four online tests I found when searching under “Myers-Briggs,” I received four results when I took them one after the other: ENFJ, ENFP, ESTJ, and INFJ. Similarly, taking the first one on two separate days, my results were ENFJ and INFJ. For the second test, on subsequent days I registered as an ENFP, INFJ, and INFP.

These results do show some general patterns. For example, I definitely register as relying on Intuition more than Sensing, Feeling more than Thinking, and Judging more than Perception. However, Sensing, Thinking, and Perception do show up, depending on the test I take and the day I took it. As for the extravert/introvert distinction, I seem evenly divided, even when other aspects stay the same.
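Pooling the results I just listed, a quick back-of-the-envelope tally (my own check, nothing more) makes the pattern visible: the middle letters are nearly unanimous, while the first splits almost evenly.

```python
from collections import Counter

# The type results reported above, pooled across tests and days.
results = ["ENFJ", "ENFP", "ESTJ", "INFJ",  # four different online tests
           "ENFJ", "INFJ",                  # the first test, two days
           "ENFP", "INFJ", "INFP"]          # the second test, three days

# Tally which pole appears at each of the four letter positions.
for position, axis in enumerate(["E/I", "S/N", "T/F", "J/P"]):
    tally = Counter(result[position] for result in results)
    print(axis, dict(tally))
```

On these nine results, N beats S and F beats T eight to one, J beats P six to three, and E edges out I only five to four – which is exactly the mix of broad agreement and day-to-day instability I am describing.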

Possibly, I am more mercurial than most people, or my personality is close to being balanced on the four axes. However, the fact that different variants of Myers-Briggs produced only broadly similar results, and that none of them could produce consistent results from day to day, inclines me to suspect that the problem lies either in the tests or their theoretical framework. If Myers-Briggs were an accurate indicator of personality, then surely it would have some way to register people whose temperament varied. Furthermore, if the results corresponded closely to any objective reality, then more consistency should be present.

Of course, none of these tests were the official Myers-Briggs ones, but online ones whose thoroughness and reliability are questionable. However, I have taken Myers-Briggs tests in the past under more formal conditions, and their reliability was no better. So, again, I suspect the tests are the problem.

At any rate, common sense and direct experience both cause me to be highly skeptical of all forms of Myers-Briggs testing. Like I.Q. tests, Myers-Briggs tests seem to be dubiously conceived, and far more influential than their equally dubious results would warrant.

Read Full Post »
