Archive for the ‘Personal’ Category

The mother of a friend of mine once said that he had raised himself to be a knight. She didn’t take any credit for the fact – she simply observed it, which made it the best compliment of a child by a parent that I have ever heard. I knew instantly what she meant, because I had done much the same with Robin Hood, or at least Roger Lancelyn Green’s version of him.

To this day, I happily devour any retelling of the stories. Robin McKinley’s The Outlaws of Sherwood, Parke Godwin’s Sherwood and Robin and the King, the Child Ballads, the Robin of Sherwood series that made him a mystical figure associated with Herne the Hunter, Robin and Marian featuring Sean Connery as the aging hero, the recent BBC series, the Errol Flynn version with Claude Rains as the Sheriff – all are part of my mental baggage, with what for me is an unusual lack of concern for quality. I’ll even watch Kevin Costner’s Robin Hood: Prince of Thieves, an admission that shows just how indiscriminate my obsession really is.

You see, for better or worse, a good part of my ethical standards was consciously modeled on Robin Hood, to say nothing of my politics as well. Only King Arthur and the Knights of the Round Table came anywhere close to being as influential, and Robin Hood – despite being the Earl of Huntingdon – had the same ethics without the sense of class and privilege. He wasn’t even much of a sexist, loving a woman who shared his dangers, rather than languishing at home like Queen Guinevere.

So what did I learn as a child from Robin Hood? Far more than the manly virtue of courage. I learned that I was supposed to be polite to everyone. That I was supposed to be a good sport, even if I had just been thwacked on the head by Little John or dumped into the stream by Friar Tuck. That I was to value honesty and abhor hypocrisy. That I was supposed to help people, even at inconvenience to myself. That I was supposed to face danger cheerfully – all this, and a hundred other things besides.

However, none of this would have impressed me by itself. I could have learned the same values from Baden-Powell’s Scouting for Boys – never mind that I later learned that Baden-Powell was a traitor to his own standards, having starved the local Africans to keep his British troops alive during the siege of Mafeking in the Boer War.

What really impressed me was that, unlike the propaganda of the Scouts, or even the followers of King Arthur, Robin Hood decided for himself. Rather than acquiesce to things that were legal but immoral, he became an outlaw, and he enforced his own sense of right and wrong while he was in Sherwood no matter how anyone else condemned him. Green never used the phrase, but his Robin Hood lived by a higher morality, deciding for himself where right and wrong lay.

Of course, the anarchy of Sherwood cannot last, and Robin Hood ends by being pardoned by King Richard. But even as a boy I understood that end as more symbolic than anything else: King Richard is the source of the law, and his approval amounts to a public acknowledgment that Robin Hood’s code of behavior was correct, no matter how eccentric it happened to be. The idea that he was substantially changed by his reintegration into society is quashed by his last moments, when he forgives the Prioress for poisoning him and tells his followers not to avenge themselves upon her.

Part of me wants to laugh at this set of ethics, but I can never manage to be quite so flippant. Robin Hood’s example helped me through the worst stage of my life, when only a handful of people believed in me.

At other times, his example is difficult. For example, while I believe in acknowledging when an opponent has done something ethical, I often suspect that belief only serves as a handicap. Certainly few of my enemies have ever reciprocated in kind.

However, at his best, Green’s Robin Hood embodies a generosity of spirit that I can’t help but admire. I have often fallen short of imitating this generosity, but the idea that I should try is lodged too firmly in me ever to root out. No matter how cynical or disillusioned I might become, the lessons I learned from reading Green’s book into oblivion are likely to remain with me for the rest of my life, even if I spend my last few years senile.


Of all the concepts I discovered at university, few had greater impact on me than type theory – not Bertrand Russell’s mathematical application of it so much as Gregory Bateson’s application of it to psychology. I picked it up as part of my study of General Systems Theory, a popular intellectual fad of the times, and was lucky enough to have several classes with Anthony Wilden, one of the leading theorists in the field. Probably, Wilden’s originality and passion would have been enough to imprint the idea on my mind, but the main reason that type theory became a part of my thinking was that it explained in a satisfyingly complex way matters that would otherwise be inexplicable.

For Bateson, logical types were useful as a description of how information was organized. In a simple example, an organ like the heart or liver is of a higher logical type than a cell, while the body is of a higher logical type than an organ – higher, not in the sense of being superior, but of being composed of elements at a lower level.

For understanding how a body functions, levels are useful because each one shows different characteristics and behavior. For example, consciousness is characteristic of the body as a whole, while a variety of chemical reactions, such as the exchange of oxygen, take place at a cellular level.

Right away, this distinction has the advantage of reformulating apparent paradoxes. For instance, using type theory, “I am a Cretan. All Cretans are liars” can be quickly dissected to make sense. The second statement is a meta-statement – a statement about statements – and is therefore of a higher logical type. As such, its truth needs to be evaluated separately from statements made at the usual level of discourse, and no paradox exists.

Similarly, philosophical discussions about individuals vs. society turn out to be in need of reformulation. Being at different logical levels, individuals and societies cannot be said to be at odds with each other, any more than your heart can be said to be at odds with your body. But if you ask instead, “How can individuals and society interact for the benefit of both?” you have gone some way towards focusing the subject and increasing your chance of saying something meaningful.

Bateson used this set of intellectual tools to develop double-bind theory, the study of the psychological problem posed by a hidden meta-statement. For example, “Be spontaneous” contains two statements: first, a command to act naturally, and, second, a meta-statement to obey the speaker. The recipient of these commands cannot obey one without disobeying the other. Instead, they can only oscillate between the two. Bateson called such a situation a double-bind, using it to suggest that schizophrenia was the result of trying to live with such a situation – an idea that R. D. Laing was to take to such flamboyant and extreme lengths as to discredit it in many people’s minds.

Later theorists went on to suggest that double-binds explained a great deal of human behavior. They suggested that it was a key form of social control, on both a political and a family level – present someone with a double-bind, they suggested, and they are kept too confused to question or rebel. Only if someone recognizes the structure of the double-bind – or, better yet, removes themselves from the situation in which they have to deal with it – can they hope to accurately assess their situation.

However, even in less pathological situations, those who came after Bateson found type theory useful. The idea that unclear communication exists because people are unable to separate statements from meta-statements suggests that any interaction is far more complex than it first appears.

For instance, the bickering of a couple should not be explained just by the ostensible subject of the current argument, which is often petty. Rather, the argument should be read as having a level at which the couple are disputing their relationship. However, because neither realizes what they are really arguing about, the petty subject can be discussed almost endlessly.

I am well aware that type theory and double-binds have been criticized. But, as soon as I first read about them, they struck me in much the same way as Orwell’s analysis of how language can be used to corrupt communication instead of encourage it: both were concerned with opposing pathology and trying to describe what was really happening, rather than what appeared to be happening.

My conviction about the usefulness of type theory in dealing with people has never faltered since. Countless times, it has allowed me to derail an argument by suggesting that I and the others involved were really talking about something else rather than whatever had started us snapping.

It doesn’t always work, because sometimes people would rather argue than reduce their differences. But, even then, it leaves me with an understanding of what is happening, which I far prefer to being mystified about what is going on.


“You can trust in the power of music,
You can trust in the power of prayer,
But it’s only the white of your knuckles
That’s keeping this plane in the air.”

– Oysterband, “Dancing as Fast as I Can”

Probably, it is no accident that, as North American culture has grown less religious, affirmations have become increasingly popular. Today, they are a form of secular prayer, used by New Agers, athletes, and many religious groups – yet the only evidence that they work is anecdotal.

Affirmations are verbal or written statements whose repetition is believed to help people accomplish their goals. A classic example is Émile Coué’s “Every day in every way, I am getting better and better,” but there must be millions in use, some of them long and specific.

So far as I know, no one has traced the history of affirmations. However, I suspect they have multiple sources. Besides the secularization of society, they may also reflect the rise of the middle class, and a standard of living that gives people the illusion of having far more control over their lives than they actually do. Given that illusion, the idea that a magical chant can help them influence the workings of society or the universe seems plausible to large numbers of people. Perhaps, too, affirmations are a kind of watered-down form of behavioral theory.

But, whatever their origins, affirmations were first popularized by early business writers such as Napoleon Hill and Norman Vincent Peale in the 1930s and 1940s. They received a boost in the New Age belief structures that emerged in the aftermath of the counterculture of the 1960s, spreading until, today, most North Americans must have tried them at least once for everything from quitting smoking to getting a job promotion.

My own experiments with affirmations came while I was a long-distance racer in my teens. Encouraged by coaches and some older runners, I did my best to make them part of my training regime for about six months. They had no noticeable effect on my speed or times, or on my efforts to train regularly, but they were of some use in focusing my attention on a simple, immediate goal.

For example, during one Chandler Memorial race from West Vancouver to Kitsilano, I was determined to beat a rival from Burnaby with the last name Reid. As the runners snaked over the narrow sidewalk on the Lions Gate Bridge, he was ahead of me, but I could do little to pass him. However, as I wound through Stanley Park, I began thinking over and over, “I fly, Reid dies.” By the time I had left the park, I had passed him, and repeating the simple rhyme helped me maintain the steady pace I needed to pull far ahead and finish the race.

For more complex, more abstract goals, however, I never saw any evidence that affirmations helped any more than simple determination.

Searching the web suggests more or less what I concluded independently. There’s no shortage of testimonies to the power of affirmations, nor of cheery assumptions that they can improve any aspect of your life (and that, if they don’t, you must be using them incorrectly).

But scientific evidence? If many attempts to study affirmations have been made, most of them have apparently never found their way onto the web. Possibly, researchers are embarrassed to investigate such a central part of pop culture, or wary of the unwelcome attention from true believers they might receive.

The studies that do exist give little reason for belief. One study mentioned briefly online suggests that affirmations can actually make people with low self-esteem feel worse. The news item is too brief to give any detail, but I suspect that when the gap between reality and the goal is too great, repeating the affirmation makes the discrepancy harder to ignore.

Otherwise, hard evidence is practically non-existent. Probably the closest to any study of affirmations are the various studies of prayers. At best, these studies suggest that praying may temporarily improve a person’s mood. No correlation between prayer and any external effect such as healing or influencing events has ever been found, aside from one poorly designed experiment that was quickly discredited – although it continues to be cited by those who wish to believe in the power of prayer.

Not that this lack of evidence is likely to convince those who have made affirmations part of their daily routine. As Garry Trudeau, the writer of Doonesbury, once said, the beauty of pseudo-science is that you can always find an explanation why a belief doesn’t work. Affirmations are part of the superstitions of our times, and few people care to question them. Instead, if affirmations fail them, they will decide they need to try harder, or that something else went wrong, and continue with their belief systems unchallenged.


The older I get, the more I become convinced that most debates are a clash of half-truths. Instead of one point of view being right and the other wrong, almost always each has a limited validity. Necessity or pragmatism may mean that I need to choose a side, but my support is increasingly nuanced and qualified by context.

One of the latest examples of this perspective is my reaction to a discussion on the Geek Feminism Wiki. In response to a guest post, one commenter mentioned that they were put off by the amount of swearing in the post. A second commenter immediately said that the first was using a tone argument, and others quickly joined in.

A tone argument, for those who have never heard the phrase, is one that, rather than addressing what is said, focuses on how it is said. Feminists, for example, are frequently told that they might convince more people if they used a politer tone. Logically speaking, such an argument is irrelevant to a discussion, which means that, by invoking a tone argument, the second commenter was discrediting the first, condemning the objection to swearing as invalid.
What nobody in the ensuing discussion seemed to consider is that both positions might be true depending on context. Yes, by the highest standards of logic, a tone argument is a fallacy. How an idea is expressed does not alter how convincing or accurate it is, and complaining about the tone is basically an emotional appeal – often effective enough at swaying an audience, but unfair in any attempt to have a rational discussion of the issues.

Yet, at the same time, when you consider rhetoric as an art, the way the classical Greeks and Romans did, you would be rash to claim that tone is completely irrelevant. A writer or speaker who prides themselves on being ethical would avoid relying only on a tone argument, but no writer or speaker of any skill would refuse to think of tone as a useful support for whatever they were arguing. If nothing else, the chosen tone would vary depending on the audience. Usually, too, it would vary depending on exactly what response the writer hoped to encourage in the audience.

However, this does not mean you have to practice double-think and believe that both are simultaneously true. Instead, it means that you have a Schrödinger’s cat sort of situation, in which both perspectives are true, but only until you consider the context.

In the case of the argument about swearing and tone argument, the context depends on the motivation of the original comment. Was the disapproval of swearing meant to derail the discussion? Then it was a tone argument, and deserves not to be tolerated. But, if it was a meta-discussion, a discussion about the discussion, then it becomes a valid commentary, and bringing up tone arguments becomes an effort at derailment in itself.

What complicates this example is that, within this context, which of the two is happening is difficult to determine. The written word is generally less subtle than the spoken word, and, unless I am mistaken, the first commenter is not well-known on the Geek Feminism Wiki, so anyone likely to read the exchange probably has no idea what their opinions might be.

Since the commenter writes that, “anyone with a strong point should be able to make it without swearing,” I suspect it is a meta-comment about technique. However, the comment is too short for me to have any strong confidence in that verdict.

Personally, that lack of certainty would have been enough for me to hesitate to mention tone arguments. However, choosing a side is always quicker than considering the possibilities of all sides.

The trouble is, once you support the idea that tone arguments are a fallacy that is particularly used against women, your position can quickly degenerate into an either-or stance in which any mention of tone is something to avoid, regardless of the circumstances. In the same way, by insisting that mentioning tone is no more than a matter of technique, you can just as easily condemn the idea of a tone argument as being overly punctilious.

Even worse, taking either position as your own means that you can descend into an endless argument in which there is no right or wrong, not because they don’t exist, but because you are ignoring the circumstances that would determine them.

Increasingly, that is what I notice about many arguments – not just the utter impossibility of ever reaching a conclusion that might satisfy everyone, but, beyond that, the crushing futility of exchanging half-truths. After all, a half-truth is also half a lie.


My partner Trish loved miniature roses. At one point, she had over forty plants on the balcony of our townhouse and on the courtyard outside. On weekends, she would spend several hours in the morning caring for them. Then, when we did errands in the afternoon, she would take the flowers, wrap them in moist paper towels if it happened to be a hot day, and distribute them to the staff of the stores and services we visited. If any were left over, we put them on display in vases about as high as my thumb, mostly around the computer.

They were very much her concern. I appreciated them as little points of symmetry and color, as well as for their names – Pinstripe, Pandemonium, Cartwheel, Carousel, and Black Jade – but I had little to do with them, except for occasionally buying one for her.

However, by the time she died, their numbers had dwindled to half a dozen, partly through normal attrition, but largely because her final illness kept us busy with more basic concerns.

By the time I had steeled myself to clean out the remains, they were down to four, two of which were not looking overly healthy. Never having been a gardener, I didn’t mind too much. I had more basic things on my mind, and I gave them minimal care only because I associated them with Trish.

But about a month ago, I bought some basil, which I use in spanakopita and lasagna. Somehow, the splash of green made the living room more home-like.

Inspired by that realization, I decided to bring the surviving Black Jade inside. Far from its former glory, it is now a sprig barely twelve centimeters long, clinging to the original root structure, and I thought it needed some shelter in order to survive the winter. Like the basil, it seemed to make my surroundings more comfortable.

Then, last week, I was walking through New Westminster when I saw a half dozen miniature roses on a rack outside a dollar store. One, I was sure, was a Black Jade. On impulse, I picked it up as well as two more.

At home, I put the white and peach flowered plants on the television cabinet, and the Black Jade on the tea tray that I use for a coffee table. They seemed to crowd the living room a bit, but, considering their effect, I decided they belonged there.

They’re not a shrine to Trish. Thirty months after her death, that would be desperate, and more than a little pathetic. But they are a memory of happy times, and they relax my eye as much as the art on the walls.


Unlike most articles, an interview does not require multiple sources for legitimacy. By implication, the subject of an interview is either famous enough or interesting enough that readers will want to hear about them in detail. As the writer, your task is to present your subject’s opinions as accurately as possible, with a minimum of comment from you. Your ability to reach this goal will depend on how you conduct the interview, and how you structure it for publication.

This goal does not mean that you simply present your questions and the interviewee’s answers. That’s a transcript – and after you’ve made a few, you will understand why experienced writers say that the worst thing you can do to a person is quote them word for word. Even the most articulate are likely to have ums and ers and other hesitations, and to repeat themselves and forget to finish sentences. For this reason, almost anyone can be made to sound like a rambling drunk when quoted verbatim.

Also, many readers react with dread to a long passage presented as a quote, and are likely to skip it. Give them too many long passages of one person talking, and these readers are likely to stop reading your interview.

Instead, it is understood that your interview is an edited version of the transcript. As the writer you are expected not only to correct grammar and spelling, but to condense and reorganize to make the interviewee’s statements clearer. Similarly, you might edit your questions so that the context of what is being discussed is clearer.

What you must never do, however (assuming you want to be taken seriously), is edit the interviewee’s words so that they say something they would not say, or edit your words so that you look clever at the interviewee’s expense. Both these practices are an abuse of your power as the writer.

Conducting the interview

To help you reach these basic goals, learn as much as possible about the interviewee and the topic of the interview before it takes place. Not only is preparation likely to give you better results, but you will know when the interviewee is wrong or avoiding a topic, and be able to ask more thorough questions.

Whenever possible, conduct your interview in person. At the very least, conduct it over the phone or by video chat such as Google+ Hangouts. These venues will help you to ask follow-up questions more easily.

They will also make the interviewee’s comments more natural-sounding. You want to do an email interview only as a last resort. Even chat gives more natural-sounding results than email. Some experienced interviewees may prefer email because they want to think about what they say, but you may be able to make them change their minds if you tell them that a live interview requires less of their time – which is generally true.

When you do a live interview, remember that it is about the subject, not you. While you should have some questions prepared, try to make sure, especially in the early stages, that your subject talks more than you. Start them out slowly by asking easy, non-controversial questions, such as what their background is, then steer them gradually towards more detailed questions.

Try to talk only when you need to focus the interview, or to ask for clarification. You’ll be surprised how often the interviewee will mention the points you wanted to cover without any prompting if you only wait a while. Cultivate the skills of a listener, including using body language to show your interest.

Occasionally, you may interview two or three people together. When you do, have each interviewee introduce themselves at the start, so you can identify their voices as you transcribe the interview. Ideally, you could have them name themselves each time they speak, but that can be awkward and is easily forgotten as the interview continues.

If you do have to do an email interview, see if your subject will consider several rounds of questions. The second and subsequent rounds will be shorter, but you may need them to get clarifications or details. These details may include the proper spelling of names, although you can sometimes use a web search instead.

Writing the interview

As you prepare to write, you will probably notice that the interviewee has some pet phrases and sentence structures. Use these quirks as a way of representing character, but not so much that the interviewee sounds ridiculous or limited.

If you are preparing a transcript, you may also have to decide how to write down your interviewee’s favorite structures. For example, you may decide after a few examples to omit throwaway phrases like “I think.” Similarly, you may have to decide whether a dash or a semi-colon best represents how the interviewee joins two thoughts together.

When you come to write your interview, resist the temptation to present it in simple question and answer form. The more interesting – and more difficult – choice is to use regular paragraphs, weaving the quotes into the grammar of your own sentences. Readers find this structure easier to read, and it has the advantage of making summaries and explanations easier.

But, whichever format you use, try to find a quote that will serve as a conclusion, even if you have to pick it out of an earlier point in the interview. Often, I find that ending an interview by asking about future plans, or finishing with, “Is there anything we haven’t covered that you want to make sure gets said?”, will provide that conclusion.

You will find that some editors dislike ending with a quote. If you ever write for someone with this preference, restating the last quote in different words will often be enough. Otherwise, a modest conclusion will usually do.

All these practices make an interview very different from the typical article. In a typical article, you may quote, but usually not at such length, and the effect on the structure is minimal. By contrast, in an interview, the content becomes the structure. Your goal is to discover the structure implicit in the content.


If chronic aches last long enough, they can become part of your daily background. They can even worsen without you noticing, because you are living with them daily. Only when something relieves them do you realize exactly what you have been enduring.

Take, for example, my feet. I have been running for over eighty percent of my life. For much of that time, I did long, hard distances, averaging seventy-five to ninety miles a week. But although I always took care to wear running shoes with good support, like many young men, I was convinced that I’d never have to pay for all this wear and tear. I would magically continue the training regime I had followed most of my life, maybe slowing down a bit, but otherwise going on for decades much the same as I had in the past.

What I didn’t notice is that all that pounding on the pavement was gradually making my feet as broad and as awkward as a duck’s. I did notice that I could no longer wear Nikes, but I put that down to a difference in manufacturers. I could wear Reebok and some Adidas models, and, not having any brand loyalty, that was good enough for me – especially since I disliked the rumors of Nike’s sweat-shop practices.

But the spreading of my feet was aggravated by an attack of what might be called sports gout. Heavy training in hot weather had left my body critically short of sodium, potassium, and trace minerals, causing the joints of my big toes to swell agonizingly. A few supplements took care of the gout, but not before the joint at the base of my left big toe had become permanently twisted sideways and semi-locked.

Between the normal spreading and this mild deformity, my shoe size gradually increased until I could rarely get a shoe with the necessary width. By last year, I needed a shoe three half-sizes larger than justified by the length of my foot. It didn’t help, either, that stores seemed increasingly inclined to carry only normal shoe widths.

To say the least, the result was uncomfortable. A foot moving inside a shoe gets little of whatever support the shoe around it offers. It is constantly strained and sore, and fallen arches and pinched tendons become more likely. But, as I said at the start, I didn’t especially notice, because the condition had crept up on me so slowly. So far as I thought of the problem at all, I imagined the constant discomfort was a consequence of growing older after a lifetime of abusing my feet. What worried me, though, was that it was getting worse, so that I could hardly walk three miles before it felt like a bruise was breaking out all over both feet.

Then, last week, I noticed an ad for SAS Comfort Shoes’ new store in the free local paper. I rarely notice ads in newspapers or online, but perhaps my growing worry made me notice this one. Not only did the store make its shoes in the United States, but it specialized in wide shoes and styles designed to fit well. Its models included a training shoe, so after my stint in the gym today, I hobbled out to the store. I didn’t expect much, but I thought I had nothing to lose. If a store that claimed those sorts of wares couldn’t help me, I would have to consider custom-built shoes.

I explained my particular needs to the sales clerk, and tried on the trainer. Immediately, I felt my feet relaxing, and realized how sore they were from my normal workout. Walking the length of the store and back again, I also noticed that my foot was no longer sliding about. The shoes were actually supporting my feet. For the first time in several years, I was wearing shoes that fit something like properly.

I had to try on a few different sizes and widths to find a perfect fit, but in ten minutes I’d found it. I quickly moved on to buy a pair of business casuals, which fitted differently, but were equally comfortable.

I wore the trainers out onto the street, feeling so light on my feet that I thought I could dance – a big change from the way I’d dragged myself in. I settled for walking a little straighter and enjoying lighter spirits.

Unlike many running shoes I’ve bought in my life, this pair needed no breaking in. Four hours later, when I returned home from my other errands, my feet were still feeling relaxed.

It was a feeling that I’m sure I could get used to. In fact, as I write, I realize that I already have.
Eventually, I plan on returning to the store for other shoes. If the shoes are more expensive than those offered in most stores, I am willing to pay the difference for comfort.

However, the real moral is not just an unpaid endorsement of a business. For me, the moral is that stoicism has its limits as a virtue. In the future, I’ll try to remember that just because I’m used to a discomfort doesn’t necessarily mean that I have to live with it.

Read Full Post »

I never did care much for Wordsworth. But the rest of the Romantic poets – Shelley, Byron, Keats, and Coleridge, in that order – taught me the rudiments of poetic technique when I was a teenager. What’s more, I learned well enough to have a dozen or so published poems to my credit without trying too hard. But one aspect of Romanticism that I never managed to accept was having a muse.

That wasn’t through lack of trying. Having a muse is potentially convenient when you’re an adolescent boy and not sure how to approach girls. You can play out your infatuations in your attempts at poetry, and not risk actually talking with the object of your affection. Better yet, if – as happened to me – you are grief-stricken at the focus of your infatuation moving away, you can dramatize events until you feel better. I think of this as the Dante gambit, after the Italian writer of The Divine Comedy, who found a muse in a woman he had met only once – a woman who was never around to casually disillusion him, as a real person might.

That was the trouble, really, with the whole idea of a muse. The closer you actually were to a girl or a woman, the less likely she was to act like a muse. She wouldn’t hang around inspiring you by looking soulful or sighing with bliss as you recited the poems you dedicated to her; she had school or a job and would insist on straying from your side on her own business.

I suppose the difficulty of reconciling the projection of a muse onto a woman’s life is part of what is behind Robert Graves’ White Goddess, and his attempt to cast the poet-muse relation in a myth — a myth that inevitably ends in the muse’s betrayal of the poet’s loyalty and aspirations, only to start again with the next woman he elevated in his mind. Graves was dramatizing the fact that any woman would eventually tire of being his inspiration, and find some other lover who wasn’t playing so many games.

It seemed to me a form of selfishness – especially when I learned from Graves’ biography that while he was enjoying the masochism of living his myth with a succession of muses, he also had a wife who raised their children and oversaw his household.

I thought much the same about Shelley, playing guitar with Jane Williams while Mary Shelley was nearing a nervous collapse, mourning the death of their child, and trying to run a villa in a foreign country without enough money. Having a muse sounded suspiciously like an excuse for flirting.

After a while, another point started to nag me. If poetry was the result of a literary-minded man’s (mostly) chaste infatuation for a woman, what was the explanation for Sylvia Plath? This was a matter of real concern for me as Plath became one of the first moderns from whom I learned.

Robert Graves did have a throwaway line about women’s poetry drawing on different sources than men’s. But he never explained what those sources were, being uninterested in anything outside his own personal mythology.

Obviously, though, women didn’t have muses in the way that men like Graves did. A new lover might inspire poetry – a lot of it in the early stages of a relationship – but no published woman that I could find seemed to view any man in her life as mystical or even temporarily mythological.

It was all very puzzling, especially since the idea of running off to some modern Missolonghi and dying prematurely had limited appeal. I was tolerably certain that dying of consumption wasn’t on the agenda, either.

Gradually, I came to realize that the idea of a muse was only possible in a culture where men knew few women, and had to fill in the blanks in their knowledge with their imaginations. It was a form of projection, really, not much different from pornography – just prettier. Neither was reconcilable with the real relationships I was starting to have.

Later, my readings in feminism would give me the concept of objectification, and encourage me to condemn the whole idea of a muse as something fundamentally unfair. But, even before then, I had abandoned muses as a concept that was not so much false as mentally exhausting. Trying to believe in muses, I found, only made me affected and self-conscious.

On the whole, fiction writers got along without muses. So, a few years after I discovered poetry, I decided that I could too, no matter what genre or style I wrote.

Read Full Post »

A couple of weeks ago, I went to the Amanda Palmer concert with a neighbor. He kept worrying that he would be the oldest person there – a concern that never occurred to me, although I am several years older than him. The truth is, working in tech makes me more comfortable with younger people than those my own age, who often seem stodgily suspicious of anything new. However, changing my main online photo tonight forces me to confront the fact that I’m aging, just like everyone else.

Few people, I suspect, can look at their own picture without feeling uncomfortable. Part of the reason is that most people’s self-image is always several years behind their actual age. Another reason is that all of us are most familiar with our mirror images, which of course are reversed. For both these reasons, a picture never looks quite right. The most we can hope for is that any given picture doesn’t make us squirm too much. Personally, I prefer to play the coward, allowing pictures of myself at only long intervals.

Anyway (I always grumble), people take far too many pictures of themselves, thanks to digital cameras. Keep your life undocumented, and at least you can busy yourself with living it. Spending all your time recording is more meta – and more trouble – than I care for.

Still, I’ve been aware for a couple of years now that my picture needed updating. One of my regular publishers offered to pay for an update, and even that wasn’t enough for me to brave the ordeal of picture-taking. Then I thought I’d wait until I recovered from last year’s knee injury and had some faint whimper of fitness. Eventually, I just put it off, delaying the moment of truth like Kipling’s Queen Elizabeth psyching herself up to look into her looking glass.

But today I felt braver than usual. I finally had a neighbor snap a dozen shots against the nearest neutral background. It wasn’t the best time to do so: I’d been several hours out in the sun, so my face was red and blotched. My ears looked as though I had folded them up and used them as a makeshift pillow the previous night. My eyebrows were so pale that most of them were invisible, and the angle of my head made me look like I had a double chin and showed that I could do with a shave.

As for the wrinkled neck and piggy eyes, please don’t get me started. I could go on and on – but I see I already have.

Yet, as uncomfortable as the picture makes me, I couldn’t mistake those escaped hairs dangling in the middle of my forehead. At least my hairline was no higher than in my last picture, and I’ve finally aged enough that my face gives an illusion of character. To me, anyway, I look guarded, maybe politely skeptical. Either seems an improvement over the terminally gormless look of most of the pictures taken throughout my life.

I still have no idea how representative the picture is. But, all in all, I could do worse. Before I could change my mind, I updated all my online profiles. I now propose to forget what I look like for another few years, remaining blissfully ignorant of how I am changing and averting my eyes from even the vaguest possibility of a reflection that might confront me with the truth.

Read Full Post »

In the last couple of generations, modern industrial culture has seriously reduced the range of acceptable emotions. Certain emotions are not only unpleasant, the conventional wisdom goes, but should be avoided at all costs. However, the older I grow, the more I become convinced that this attitude is not only wrong, but actively harmful.

One of the most obvious examples of this attitude is the insistence on extroversion. Today, the model of the well-adjusted person has become an outgoing optimist who lives and works in groups, and feels uncomfortable alone. Not only are projects in schools and businesses increasingly done in teams, but even yoga and meditation, originally intended for private reflection, are done primarily in groups. By contrast, anyone with a preference for occasional privacy is seen as maladjusted at best, and at worst a potential perpetrator of another campus shooting.

This either-or distinction is a distortion of Carl Jung’s original concept, which described two poles of behavior, and was never intended to label people. Nor did Jung intend to condemn either extreme. Equating introversion with maladjustment is as accurate as condemning extroverts for being irresponsible and unable to focus; both extremes might include such behaviors, but actually cover a far broader range of behavior.

More importantly, as Anthony Storr points out in Solitude, many forms of creativity and original thought seem to require extended periods of introverted behavior. For that matter, the most successful forms of collaboration tend to be like the one used in free software, in which people work alone in the initial stages of their work, then collaborate for peer review and tweaking. By devaluing introversion, we are probably also undermining creativity – which may explain why movies with three or more names on the script rarely produce anything memorable.

Similarly, certain states of mind, such as depression and anger, are seen not only as unpleasant, but as conditions to be avoided and medicated as quickly as possible. Moreover, any decisions or conclusions reached in these undesirable states are questioned, or excused as the indication of an unsound mind.

In some cases, that might be so. But always? Probably not. Depression and anger are natural reactions to events like the death of someone close, or being treated unfairly. While dwelling endlessly on such things is unhealthy, accepting them for a certain amount of time is probably necessary for coming to terms with them. Denying this need, or trying to shorten the time in which such emotions are indulged, may be as mentally unhealthy as removing a cast before a bone has had time to knit back together is physically unhealthy.

As for these emotions offering a skewed version of reality, why do we assume that the optimism we believe is typical of a well-adjusted person is any more accurate a perception? Personally, I have seen many projects – and companies – spiral downward because of decisions made by an optimist who was unable to admit when something was going wrong. A depressed person might at least anticipate problems so they could be countered, or admit problems when actually faced with them. In the same way, an angry person might drive themselves harder for success. Instead of accepting only one attitude as realistic, I suspect that we need to accept a much wider range of emotions as sometimes offering useful perceptions.

Yet another example is the nervousness and anxiety typical of someone who moves into a new job or set of responsibilities. When you stop to think (and even when you don’t), there are valid reasons for feeling uneasy. There are many things you can’t know about your new position, and you want to prove yourself to colleagues and ultimately become a success.

Many athletes and performers recognize such feelings – actors call them “flop sweat.” But rather than pretending that these feelings don’t exist, they worry that not having such feelings will lead to a flat and uninspired performance. The trick, they will tell you, is to control these feelings, to channel them into the performance. If you can do that, you will have the extra edge that leads to outstanding performance.

However, we don’t admit that flop sweat is natural, let alone teach people how to cope with it. Instead, we give it a name like Impostor Syndrome, elevating it to a psychiatric condition – which, except in a small minority of cases, it is not – giving the sufferers one more thing to worry about and elevating the feelings into some vast, impersonal force. Instead of teaching them how to reduce the anxiety by practice or planning, we encourage the sufferers to give themselves affirmations, or seek the approval of others. We encourage them to look for placebos rather than solutions that are known to work, and, as we do so, we are probably both preventing the development of competence and encouraging mediocrity.

I am not the sort of Puritan who believes that suffering is necessary for success, or needs to be sought out. But I do believe that it must be confronted directly, not avoided. Too often, in our panic to avoid the least unpleasantness, we limit ourselves and short-circuit the processes that are necessary for accomplishment and competence. We mean well, but in enforcing extroversion and pleasantness, we may also be suppressing necessary and useful emotions.

Read Full Post »
