Posts Tagged ‘Business’

“The secret to a long life is knowing when it’s time to go”
– Michelle Shocked

I used to say that any company that hired me full-time was doomed to go out of business in six months. That was more of a joke than the truth, but I do sometimes have an instinct for looming failure. My enthusiasm draws me towards quixotic organizations, and my powers of observation quickly disillusion me. That was true in my teenage relationships, and has proved equally true since in my dealings with employers and various causes – a fact that causes me more self-doubt than satisfaction.

For example, I once gave up my usual freelance status for a position with a company that I was convinced was doing new and exciting things that would benefit users. I was quickly promoted, but my increasingly insider position made me uncomfortably aware that the company was spending more time on development than on finding a business model.

After a deal I had nurtured for three months fell apart because the CEO couldn’t be bothered to check his notes before negotiating, I did a little mental graphing, and concluded that the company would never turn a profit before it ran out of money. I resigned, and the company declared bankruptcy ten months later. Its end would have come sooner, except that employees were on half-salary for the last six months. I never saw any figures, but the company must have had a gross income of well under ten thousand dollars, while spending several million.

A while later, I took a similar position, partly for the pay but mostly for the chance to work with some industry leaders. Soon, I realized that its strength, too, lay in research. At that point, it was two-thirds of the way through developing its own product, which supposedly would be the cornerstone of its future products. Not wanting to leave the product half-finished, I stayed until its release. It sold poorly, much as I had expected, leaving the company with nothing to attract additional investment.

After much conscience-probing, I resigned. A few weeks later, massive layoffs hit. The company careened along for several years, but as a consulting house rather than a manufacturer. By the time its doors closed for the last time, its original plans were forgotten by everyone except the executives.

More recently, my enthusiasms lured me into becoming active on the board of a non-profit. In this case, familiarity soon bred alarm. Although I believed in everything the organization stood for, I couldn’t help seeing that the founders had an unfortunate combination of arrogance and inexperience that seemed certain sooner or later to produce a disaster.

Not only did they have no idea of how to deal with the public, but they were incapable of seeing the need for developing a community. Worst of all, with an approach that could only be called aristocratic, the founders gave the board little to do except to agree to decisions that were already made.

For some months after I resigned, the organization struggled just to raise the money it needed. It took a couple of PR hits, but survived, largely because no one was watching it.

Then the non-profit managed to annoy people in quantity. The founders were denounced, sometimes legitimately, sometimes abusively. Their commitment to their cause was questioned. Previous supporters declared they would not donate again. Other organizations stopped associating with it. Sponsors were questioned about their connection, and, instead of making a public apology, the founders chose to remain unrepentant.

The organization still has supporters, and can probably continue until the next time it needs funds. But, as I write, its future effectiveness seems doubtful, perhaps impossible. The death watch has probably started, although it may be prolonged through stubbornness.

In all these cases, I found myself tangled in mixed emotions.

On the one hand, I felt that I had dodged a bullet against all odds, that I had been wandering oblivious through an obstacle course and had escaped being dragged down myself only through lucky coincidences. I also felt that my prophetic gifts had been proved, and had to resist uttering variations of “I told you so!” in public.

On the other hand, I had committed myself on all of these occasions, and made friends – or, more accurately, perhaps, friendly acquaintances. On each occasion, I asked myself: could I have done anything to stave off disaster if I had stuck around?

Fortunately, I’ve trimmed back my megalomania by concluding that I could only have been tainted by each failure, but that doesn’t eliminate the guilt. I’ve been the first person to mutter analogies about rats and sinking ships against myself, feeling hypocritical because, for all my concern, my sense of relief is even stronger.

I suppose I should be grateful for the survival skill. After all, with my ideas of loyalty and obsessive tendencies, things could have been far worse for me. And I do feel grateful – just not very proud.

Read Full Post »

Forced to choose between being a follower or a leader, I would reluctantly choose to be a leader. But I would far rather be neither, because I perform poorly in both roles.

The trouble I have with being a follower is that I am not a passive person. I am quick to express my opinion, and ideas come quickly to me. But under all but the most enlightened management, these are not traits that are appreciated. Consequently, as a follower I either have to suppress my thoughts or risk antagonizing those who are supposed to be in charge.

This situation leads to repression, which either slips out in the form of sly and generally unwelcome remarks, or turns into over-compensation, in which I try so hard to conform that I end up applying to myself some of the wonderfully inventive eighteenth-century terms for underlings, such as toady or lick-spittle. Probably no one else would think those words apply, since I don’t give management insincere compliments or anything like that, but the point is that this is an uncomfortable self-image to have.

Even worse, having managed in my time, I’m always second-guessing those further up the ladder, and thinking – no, knowing – that I could do better. I keep thinking that, if I were left to manage matters, everything would go more smoothly. Telling myself that this belief is probably a delusion does nothing to keep me from holding it.

Like most people, I can deal with being a follower if I get my exercise and rest, and keep up my other interests. But it’s an inherently unstable situation, and sooner or later I crack from the strain.

Being in charge is preferable because of the greater autonomy. I enjoy setting priorities, and being responsible for decisions.

All the same, as a leader I’m only slightly more at ease than when I’m a follower. If nothing else, knowing how I chafe as a follower, I’m constantly wondering how I’m affecting those around me, and what they think of me.

It doesn’t help, either, that I don’t believe in leadership or hierarchies. My observations and personal experience have convinced me that, for all the emphasis on leadership you hear from management gurus, no one – including me – has any clear idea of what leadership is about.

What worries me is that I will start to confuse myself with the role – that, instead of thinking in terms of tactics or strategies, I will start to use my position as a justification for expressing my ego. I worry that I will get used to having people obey me, and actually get to like the power. If I’m not careful, I may start pressuring myself into actions that the leadership role logically demands, but which I would be reluctant to take in my personal life. Losing myself in the role is always all too likely.

Even trying to be an egalitarian leader is only a palliative. Priding myself on an open door policy, talking about how I am against the cult of leadership, claiming that someone could replace me in a couple of years – all these things, I worry, will only hide the rot that is slowly setting in.

Nor, I suspect, does worrying about such things do anything except make me an even less effective leader. In the end, I find being a leader only slightly more endurable than being a follower.

Given a completely open choice, I would far rather work in a group that pools its efforts, and hammers out tactics and strategies in discussion. But this is largely a fantasy born of reading stories about King Arthur and Robin Hood, and I’ve only rarely, and briefly, found such a situation.

Instead, I prefer the role of consultant or freelancer, in which I negotiate an exchange of services as something like an equal, and I’m less likely to twist myself out of shape. Being a freelancer isn’t always easy, but it’s the role that comes far nearer to preserving my self-respect than being either a follower or a leader.

Read Full Post »

Considering how much I dislike authority figures, I have had surprisingly little trouble with them in my working life. Maybe the fact that I am habitually polite in person helps – although it can also give rise to charges of hypocrisy if I criticize someone later in an article. Maybe the fact that my acts of subversion are usually covert has something to do with it, too. But whatever the reason, I only remember a single reprimand – and even then the offense was unintentional on my part.

The incident happened when I was working for a small company that was being slowly ground under by its CEO. He was new and, while he was learning as he went along, he lacked the empathy to understand that repeated purges of the staff might have an effect on morale. I mention this background because worry among the top management might have been responsible for my reprimand.

At the time, I had the habit of entering small jokes into the screensaver banner – wry, mildly amusing one-liners of the sort you often see today on Facebook and Twitter. Most were so trivial that I no longer remember them. One might have been “Common sense isn’t,” and another (borrowed from Doonesbury), “It’s tough being pure. Especially in your underwear.” If I didn’t use either of these, the ones I did use were similarly innocuous.

So, too, I thought, was the one that caused me trouble. It was a T-shirt slogan that I had first heard about at a Garnet Rogers concert: “Does anal-retentive have a hyphen?”

I changed the banner after a morning of editing a manual for publication, when I reflected that I was lingering over changes that probably no one except me would ever notice or care about. To me, the expression was a comment about how overly punctilious I was being, and how close I was to losing my sense of proportion. I posted it, and went for lunch.

When I got back, the fourth highest executive in the company accosted me with a look so grim that I thought another company purge had come. Instead, with lips quivering with disapproval, he insisted that I take down the banner.

“Why?” I asked, genuinely puzzled.

The lip quivering increased. “I shouldn’t have to tell you. Some things are simply unacceptable in the work place.”

“There’s absolutely nothing wrong with that comment,” I said, secure in the knowledge that I had a consulting contract with a kill-clause. “What’s the problem?”

But finally, after the executive made a few vague efforts to talk around the issue without being specific, I relented. All I really understood was that he thought I had overstepped and that, more in sorrow than in anger, he had to correct my behavior.

“No big deal, if that makes you happy,” I said. “But you’re making a fuss over nothing.”

To this day, I am still not sure what he thought I was saying. I doubt that he was suggesting that I was making a comment on micro-management, because, if anything, the company management style was too remote.

The most likely possibility, since he was a fundamentalist Christian who had read little outside the Bible, was that he was unfamiliar with the term “anal-retentive” and jumped to the conclusion that the expression was obscene. Maybe he just felt that a phrase whose meaning he didn’t know should be deleted on principle.

But, whatever the reason, I felt not only that the matter was hardly worth bringing up, but that he had over-reacted. I had no point to make, and would have removed the comment at a simple request.

For a month after the incident, I had little to do with the executive. Technically, I was reporting to him, so matters might have been strained, but since his supervision consisted of approving the task list that I wrote for myself and collecting my time sheet so he could initial it before sending it off to payroll, the main difference was that we talked less.

Finally, he decided he had to discuss the matter with me. He claimed that he was the main reason I was hired as a consultant, and insisted that he had done the right thing, and expected me to agree.

However, I was in no mood to give him much satisfaction. “You over-stepped your authority,” I said. “But that’s in the past, so I’m willing to forget what happened.”

That wasn’t good enough for the executive. He tried to get me to apologize, but I simply continued to insist that we move on until he gave up.

We never did return to the relatively friendly relationship we had before. But, a few weeks later, I put in my notice, and the issue ceased to matter. Since then, I’ve thought more than once that the real sign of how anal-retentive I can be is that I’ve wondered occasionally since exactly what he thought was happening.

Read Full Post »

Having worked freelance for most of my adult life, I’ve set up my desk in countless locales. It took me a while, though, to realize how to set up the desk in relation to the view. That’s not the kind of thing I was ever taught – I had to learn the hard way, through experience.

When I was a sessional instructor in the English Department of Simon Fraser University, my desk was mainly something to lean on while I talked with students about their essay preparation and results. I always counted myself lucky that I had entered grad school in English, because the Communications Department – my other choice – was in the windowless maze of the Classroom Complex. By contrast, the English Department was on the north and west side of the sixth floor of the Academic Quadrangle. Each semester, I would have a variation on one of two views: the inner one, where I could see people passing through the quadrangle and, in summer, lounging on the grass, and the outer, which gave a spectacular view of the mountains to the north. I could, and did, spend hours at my desk staring out at that semester’s view. But I never expected to get much work done anyway, because I was sure to be interrupted just as I became absorbed.

When I became a technical writer, and later a marketing and communications consultant, the view became more important. At one long-term position, my window overlooked the top of the Hudson Bay parkade in downtown Vancouver. Looking down, I could see not only the people coming to and from their cars, but also the car thieves going systematically down the row of cars. I can’t have been the only one watching, because security guards would always come along a few moments before I thought to call them. But I was lucky that the project kept changing directions over the thirty months of its existence, or else I might have been too obsessed with the view below to keep up.

The same was true when I worked in Yaletown. The two-storey building across the road had a flat roof that, over the decades, had accumulated enough topsoil to support meter-high weeds. The weeds made the roof a perfect place for seagulls to nest out of sight of predators. Later, when the chicks came, they would scurry into the grasses to escape the detection of crows. Later still, they made their first stumbling efforts at flight across the roof, crash-landing in the clumps of weeds. I was far more fascinated by the progress of the fledglings than by the work I was doing.

But my real downfall came when I worked on the twenty-third floor of Harbour Centre. I was the fourth person hired, so I more or less had my choice of locations for my desk. I placed it squarely facing the window, looking down at the harbor and beyond it to the mountains. The view was relaxing when I was negotiating ad space and bundling agreements on the phone, but a disaster when I was trying to write a manual or ad copy. I’d find myself staring out the window, and realize guiltily that I had left my thoughts to rove freely for the last ten minutes.

I wasn’t prepared to give up the view entirely, so I moved my desk at right angles to the window. That way, I could focus on my work without the distraction of a seaplane landing or a cruise ship docking, but, when I was on the phone or wanted to take a moment’s break, I only had to swivel in my chair.

I’ve followed the same arrangement ever since. Now, as I write in my townhouse, I am at right angles to a view of the trees beyond my third floor balcony. The view is not as breath-taking as some I’ve had away from home, but with the parliament of crows thirty meters away and the occasional visit from a red-tailed hawk, there is still more than enough to distract me if I let it. But by not looking at it directly, I keep my productivity high, and can still enjoy the sight of the swaying tops of the evergreens when I stand, stiff and in need of a stretch after a long bout of work.

Read Full Post »

You’ve got to sing like you don’t need the money,

Love like you’ll never be hurt,

You’ve got to dance like nobody’s watching,

It’s got to come from the heart if you want it to work.

– Kathy Mattea

Sometimes, I find myself rediscovering the obvious. When that happens, I’ve learned to pay attention, because it always means that I’ve been neglecting something I need to attend to. A few days ago, I made the thirtieth or fortieth of these rediscoveries in my lifetime – this one to do with networking.

Most of my income these days comes from journalism, but I do pick up the occasional tech-writing, communications, or graphic design work on the side – especially since the rise of the Canadian dollar has reduced the converted value of my pay cheques in American funds. Consequently, like any consultant, I am constantly networking to keep my name out there.

The only trouble is, most networking events are at the end of the day. After eight to twelve hours of work, going out is often the last thing on my mind. I often feel like I have to drag myself out to the events, when, instead of meeting a room full of strangers, what I really want to do is sprawl out on a futon with a parrot or two.

Then, when I get there, I have to get into persona. Regardless of how I feel, I have to look and sound outgoing, and bring out my best small talk. I have never been one of those who believe in speed-networking, counting the evening’s success by the number of cards I collect, but I have usually felt that I ought to circulate when I was really more in the mood to find a good conversation with two or three people in some quiet corner.

Yet over the last couple of years, I’ve been thinking more and more that the typical networking event was becoming less and less worth my while. Part of the reason was probably the tight economy, and another part that many of the same people keep attending the local events. But it was only this week that I accepted that most of the problem was my attitude.

The revelation came because I was out at an altogether different gathering. It had nothing to do with work, or even technology – it was just a group of people with a common leisure interest. And there, when I wasn’t even trying, I got the first piece of consulting work I had picked up at a public event in over a year.

If that had just happened once, I would have attributed it to serendipity. But the next night, under the same casual conditions, it happened again, which makes coincidence seem less likely.

The difference, I think, lies in the image I project. I like to think that I talk a good line of piffle, and can make myself likable when I make an effort, and to judge by how people respond, that is not completely my imagination. But when I am going against my inclinations and maybe trying too hard, I suspect that I am projecting – not falseness, exactly, but an impression that is less than completely genuine. Even if most people are unable to explain why, something about me does not seem right.

When I am in the position of needing work, this lack of authenticity is compounded by desperation. Most people, I find, are made uncomfortable by the slightest hint of desperation, and will avoid people who show signs of it. A few will even try to take advantage of it, although that’s another issue.

By contrast, at genuine social events, people are more likely to be relaxed and able to enjoy each other’s company. Our attitudes create an atmosphere in which actual connections can be made. Although the contacts we make may be fewer than those made at a networking event, the ones we do make are more likely to run deeper. Paradoxically, the less we try to connect, the more likely we actually are to connect.

I’m thinking now that much of what we’ve been told about how to network is inefficient, if not a waste of time. When I consider how I react to most of the people at networking events, I suspect that I’m not the only person with authenticity problems in attendance. Many, perhaps most, I suspect, have the same problems I do, to a greater or lesser degree.

Under these circumstances, is it really so surprising that so few of us connect? We all want something from such events – a connection, a lead, a job – and we are all trying so hard that most of us are being less likable than we could be. Moreover, if some of us do have something in common, we may never realize the fact, because we are too busy with our false fronts.

To suggest that we stop worrying about making impressions or collecting business cards may sound counter-intuitive. To go out and simply enjoy ourselves, trusting that we will make connections without really trying, might sound irresponsible, and trusting too much to luck. And almost surely it will result in far fewer connections than a networking event. Yet the connections we do make when not trying too hard are likely to be ones that are meaningful to us. Best of all, they don’t result in hundreds of business cards that we keep in a drawer for a few years before we throw them away, wondering who exactly all these people might be.

Read Full Post »

The local real estate agent has always seemed a decent sort when I’ve talked to him. However, he has one annoying habit: he persists in filling my mailbox with notepads and calendars that I will never use, because I don’t think the little things in my life should be converted into advertising. Today, he left a flier that included a quote containing all the elements I detest in Dale Carnegie and similar business gurus.

The quote was: “Did you ever see an unhappy horse? Did you ever see a bird that had the blues? One reason why birds and horses are not unhappy is because they are not trying to impress other birds and horses.”

Like much of what Carnegie had to say, the banality alone is enough to drive me screaming down the hall, banging my head against the walls in the hopes of driving the quote from my mind (perhaps I exaggerate). But such a quote passes for wisdom because it is short and makes a general statement, the way that aphorisms are expected to do. I suppose you could say that the quote is a triumph of style over substance; people sense the aphorism-like structure, then assume the profundity they expect, even though it isn’t there.

However, what really gets up my nostril about the quote (to use a wonderful Scots or Aussie expression I picked up from a live Eric Bogle album) is how sweepingly and utterly wrong it is on every possible level.

For the record, I have lived with four parrots for over two decades and I have seen my share of horses, considering that I’m a city-dweller. So I am in a position to confound Carnegie by saying that, yes, I have seen unhappy birds and horses. Many times.

I have also seen them trying to impress potential mates, sexual rivals, and the humans in their lives. They are social animals, and all social animals that I am aware of learn to do these things early. Many continue the attempts until their last moments.

Anybody who can assert that birds and horses are not unhappy and never try to impress simply hasn’t been paying attention. Both have enough sense of self that they have no trouble being unhappy (most often because they are being mistreated by humans) or worrying what others think of them. Moreover, they are in no way shy about revealing their feelings. It speaks volumes – if not flashdrives full of ASCII text – that Carnegie never noticed, and this blindness alone disqualifies him from making any general statements about existence. I would sooner trust someone who had never noticed gravity, or was unable to judge an oncoming car’s speed well enough to cross the road safely.

Carnegie further reveals the shallowness of his own perceptions (or perhaps how sheltered a life he lived) in his implication that all unhappiness stems from the wish to impress. Hunger, poverty, violence, envy, unrequited love – you can’t begin to list all the causes of unhappiness without sounding banal yourself. But the point is: how could he have missed the falseness in what he said? Did he simply not care about the truth of what he said, so long as it sounded clever? Or was he so obsessed himself with impressing others that he was trying to elevate his own personality to the status of a universal truth? Either way, he reveals himself as an untrustworthy guide to any part of life, and unfit to dispense advice of any kind.

Personally, I work too hard to evolve a mental map of the complexities around me to accept over-simplistic and inaccurate observations simply because their structure leads me to expect wisdom. Yet that describes every line of Carnegie that I have ever read. Next to him, Ayn Rand is a towering genius of literature; her prose may be tortured, and her world view is that of a failure dreaming of the esteem they would like to believe is their due, but at least her thoughts have some complexity and a relation – however distant – to observable truth. By contrast, Carnegie has only a superficial glibness that cannot hide his inability to say anything that is accurate, let alone profound. It says a lot about the business world (none of it good) that such a shallow thinker continues to be read and admired.

Read Full Post »

One of the ideas that keeps circulating through books and blogs about business is that emails should always be kept short. The suggested rules for keeping them short vary – less than a hundred words, short enough to be read without scrolling, and, most common of all, two to three sentences – but the idea is similar, regardless. It is also an idea that can do as much harm as good.

There are at least two rationales for this idea. The first is that if you are sending a marketing mailout, shorter ones are more likely to be read than longer ones. However, given that mass mailouts are likely to be flagged by spam filters, I suspect that the length doesn’t matter much, and that mailouts of any length are most likely to go to the Trash folder unread. Moreover, most emails are not mailouts, and, in a business context, you can count on them being read regardless of length.

The second is more convincing: Conciseness is more forceful, and more respectful of the recipients’ time. And, certainly, many emails, such as those arranging a time and place to meet, hardly need more than a sentence or two.

However, if issues are being discussed online, a brief email will hardly be enough. You should still make sure that an email is no longer than it needs to be, but if what you are discussing needs eight hundred words, then by all means you should give it eight hundred words.

Moreover, insisting on conciseness can cause problems of its own. For instance, this morning I received the following email: “Any specific request for the Greek food? How many boxes would you like?”

I immediately understood that the first sentence referred to the Greek food that the sender would be bringing around to my townhouse tonight. However, at first I interpreted the second sentence as a question about how much food should be bought. It was only after I replied that I realized that the second sentence was referring to another matter altogether – how many cardboard boxes the sender should bring, since they were helping me pack. This is a trivial matter, but it shows how easily conciseness can create confusion.

Moreover, in many contexts, being concise can create a negative impression. It can sound curt or rude, or even indifferent, even if your intention is not to create such an impression. Communication is always as much about relationships as it is about the ostensible subject, and, when you are being concise, one of the aspects most likely to be left out is the relationship part. If you get too caught up with efficiency and forget this fact, you can easily leave the recipient feeling slighted.

I had an example of this when someone recently sent me a three-sentence email in response to news about my partner’s death: “I am so sorry to hear about your loss. It is so sad. My sympathy to you and your family.”

The sentiments were right, but the shortness of the sentences undermined them, making their expression sound formal and insincere. The email also had a staccato rhythm and regularity of structure that made it sound even more perfunctory. It sounded so indifferent that I found myself wondering why the sender had even bothered.

By contrast, here is another email I received at around the same time: “I am so very sorry to hear this news – please know that you have many friends thinking of you and sending you support and love.”

It is just as short, and just as obviously from someone who does not know me particularly well. Yet it takes the time to be personal in a way that the other email did not, and, as a result, I found some small comfort in it.

Novice writers might be tempted by simple rules to help them write better, but the point is that such rules rarely work. Follow a rule like the three-sentence email, and you can cause yourself more problems than you solve.

Read Full Post »

The Geek Feminist Blog, which is always a source of intelligent reading as I start my daily routine, recently posted an answer to a question about how to maintain self-confidence. The poster responded with suggestions, several of which were about how to boost self-esteem – for instance, talk to supportive friends, celebrate your accomplishments, and “don’t forget to be awesome,” which apparently means to feel good about yourself and what you do. However, what neither the poster nor most of the commenters on the entry ever seemed to consider is that self-doubt might have advantages, or, at the very least, be preferable to self-esteem.

One of the peculiarities of North American culture is that it emphasizes the extrovert. In the popular conception, to be confident and outgoing is to be successful – and not just one end of a personality spectrum.

By contrast, to be diffident and private is nearly synonymous with sociopathy. Geeky high school kids, for example, are widely viewed as the ones most likely to gun down their classmates.

Yet, when you stop to think, both these views fall far short of reality.

Confidence is based on experience, on having gained an understanding of a situation or the ability to handle it. But the problem is that North America favors the appearance of confidence – especially in men – and is careless about whether it is real or not. The result is a culture in which, all too often, criticism is ignored and those who argue risk being branded “not a team player.” The dangers of risk-taking are ignored, because to doubt is to show a lack of confidence and to reveal yourself as being less than leadership material.

Sometimes, the result pays off, because audacity can take people by surprise. But, if you look around business, more often the result is rash, ill-considered, or just plain wrong decisions whose shortcomings a moment’s reflection would have revealed.

For instance, I once worked for a company that brought in a CEO armed with the latest managerial theories. His inevitable response to any company financial crisis was to purge the staff. He would protect his officer team, but otherwise his purges were random. Frequently, he fired key employees who were the only ones who understood major parts of the software that the company was producing. Not that he meant to fire key employees, but the problem was he couldn’t recognize them and was just as likely to fire them as anybody else.

The result? Survivors were demoralized, because not even the jobs of key players were safe. Often, a few months later, the key players were hired back at the more expensive rates of consultants. Other times, the company blundered on alone, trying to recover the lost knowledge instead of doing original development. Four purges and two years later, the company sold its resources and ceased business. What looked like bold and decisive action to the board of directors destroyed the company in the long term, because it was uninformed.

By contrast, self-doubt carried to extremes causes indecision. But what few people seem to consider is that, kept within reasonable limits, self-doubt can be a healthy and creative attitude. Where the artificially confident plunge unthinkingly ahead, the self-doubter looks for information and considers alternatives. Afraid they have left something out, they ask for feedback from other people. Before they act, they double-check, and try to allow some flexibility. While they may miss opportunities that require immediate response, the self-doubters are far less likely than the self-confident to do something wrong – or, if they do, they may have a plan to correct or mitigate the problem.

In other words, doubting yourself can be a source of creativity and painstaking care. In fact, of all the accomplished writers and artists I have known, and of all the entrepreneurs I have known who were successful over a period of years or decades, not one of them fell into the category of the artificially self-confident. They might have a facade of confidence, especially the entrepreneurs and especially the men, yet talk to them in private and you would be in no doubt that they were self-doubters. Some of them were not the most naturally gifted, yet they succeeded because their self-doubts drove them to compensate for their perceived deficiencies.

What I have suggested seems a paradox: those who appear most likely to succeed aren’t. Yet I think this paradox is central to creativity and planning.

Robert Graves expressed the paradox elegantly in his poem, “In Broken Images”:

He is quick, thinking in clear images;
I am slow, thinking in broken images.

He becomes dull, trusting to his clear images;
I become sharp, mistrusting my broken images,

Trusting his images, he assumes their relevance;
Mistrusting my images, I question their relevance.

Assuming their relevance, he assumes the fact,
Questioning their relevance, I question the fact.

When the fact fails him, he questions his senses;
When the fact fails me, I approve my senses.

He continues quick and dull in his clear images;
I continue slow and sharp in my broken images.

He in a new confusion of his understanding;
I in a new understanding of my confusion.

Read Full Post »

I see that some American states are starting to investigate the use of interns as unpaid labor. All I can say is that it’s long overdue.

So far as I’m concerned, most companies that use interns are like John Newton, who wrote the hymn “Amazing Grace” in the mid-1700s. Contrary to a popular misconception, Newton did not become a Christian and write the hymn, then turn against his job in the slave trade; instead, after writing the hymn, he remained both a Christian and a slaver for two decades before coming out against the slave trade.

Too often, companies are like Newton on a smaller scale when they hire interns. They may be environmentally conscious and contribute to charities in their communities, but their labor practices make them hypocrites hiding behind conventional business practices.

Understand, I am not talking about programs like Google’s Summer of Code that give small stipends to students who would otherwise be unpaid volunteers. Still less am I talking about companies who hire co-op students at proper entry level salaries, or about genuine apprentice programs. What I am talking about are companies that hire the young and aspiring for full-time work at far below what they would pay a new employee — if they pay them at all — while pretending that they are giving them something special.

The argument used to justify such internships is that those chosen gain valuable job experience. Moreover, because interns are generally untrained, their employers often argue that they require experienced employees to watch over them and redo their work if necessary.

However, the same arguments could be applied to new employees. In most jurisdictions, the fact that someone is a new employee is not grounds for denying them a living wage, so why should the same argument be considered valid for interns? In entry-level positions, new employees are often no more trained than interns are. New employees may receive a smaller salary while on probation, but even so they generally receive enough to live on.

When I was chief steward for the Teaching Assistants’ union at Simon Fraser University, we had a basic negotiating principle: a fair day’s work for a fair day’s pay. That is not in the least socialistic (not that there’s anything wrong with that so far as I’m concerned; I can belt out “Where the Fraser River Flows,” “Solidarity Forever,” and a lot of the rest of Utah Phillips’ repertoire). Rather, it’s an insistence that our semi-capitalistic system live up to its own principles. Employees who are producing acceptable work for you deserve to be paid the going rate for that work; if their work is not acceptable, you fire them. The exchange of labor is as simple as that, and there is no excuse for making an exception for interns.

The real reason for underpaying interns — as if anyone couldn’t guess — becomes obvious when you notice that many companies delay filling full-time positions until after the interns have left at the end of August, or hire more interns than full-time staff. Such cases make clear that interns are simply a cheaper (or free) pair of hands. When you keep this reason in mind, all the pious claims of helping interns by giving them experience become the modern equivalent of claims that 19th century slaves were housed and fed better than in their homelands, or benefited from exposure to Christianity. All these claims are simply excuses for unethical business practices that conventional morality chooses to ignore because they are convenient.

True, some companies eventually hire the best of their interns. But only a handful of interns are ever so lucky. Besides, companies might as well ask new employees to pay a premium for their position, because, by giving a company cheap labor, that is basically what interns are doing when they are later hired as regular employees. No matter how you look at it, the fact that some interns are hired full-time doesn’t justify internships any more than the fact that diligent slaves were sometimes freed justifies slavery. Interns may be better off than slaves (although, considering what I’ve heard about certain gaming companies, I sometimes wonder), but the scope of the ethical dodginess doesn’t change the basic situation.

Low-paying internships would be objectionable under any circumstance. However, what makes them worse is the pretense that they are anything other than a cost-saver. At least if companies would say, “We hire interns because we save money that way,” an honest discussion could take place. But, instead, they hide what they are doing by claiming that they are the benefactors rather than exploiters.

This claim is an ethical dodge that Newton would have understood. But at least he eventually saw his own contradiction. There are few signs that, left to themselves, companies that exploit interns ever will.

Read Full Post »

If an artist has an apprentice work on a piece, are they dishonest if they sign the piece as though it were their own? By coincidence, two acquaintances have found themselves confronting that question. How each of them answered says something about how we regard art and the definition of authenticity.

In the first case, my acquaintance commissioned a mask from a well-known Northwest Coast artist (I am deliberately not mentioning names or any identifying details, because the issue touches on artists’ integrity). When the artist passed the mask to an apprentice to finish, my acquaintance was furious. I suppose my acquaintance would say that they paid for a work by the artist, not by the apprentice, although the fact that the artist gave the mask to the apprentice in front of them suggests that the artist was not trying to be deceitful.

In the second case, an acquaintance bought a mask, and was contacted by the artist’s former apprentice, who claimed that the mask was theirs. The artist had frequently stolen their work, the apprentice claimed. However, investigation showed the matter was not so simple. The apprentice’s carving style was similar to the artist’s to begin with, and the apprentice had roughed out the mask, but most likely under the artist’s supervision. From the one picture of the apprentice with the mask, the artist seems to have finished the carving, painted the mask, and added many characteristic details. The result was far beyond the apprentice’s usual level of skill, and, according to one rumor, the apprentice had formally sold rights in the mask to the artist.

I suppose that, at some point, an apprentice’s work becomes extensive enough that they deserve credit on a work. Yet, while that should be true, the practice of having apprentices help with an artist’s work without receiving credit is extremely old. With many paintings done in the European Renaissance, the question of how much of a work is an apprentice’s remains a disputed point.

Similarly, in modern Northwest Coast art, it is an open secret that Bill Reid’s “Raven and the First Men” may have been designed by Reid, but was carved by Reg Davidson, Jim Hart, Gary Edenshaw and George Rammell, with Reid doing mostly finishing details. As for “The Black Canoe – The Spirit of Haida Gwaii,” Reid is supposed to have contributed only the design, partly because of his illness and partly because he knew next to nothing about bronze casting. These collaborators were chosen for their expertise, and no one suggests that general credit for these works should not go to Reid, although many (including me) think that their contributions should be more widely known.

So why are the buyer in the first case and the apprentice in the second case angry? Part of the reason may be that, although collaboration is widespread in Northwest Coast cultures, especially on large projects, the idea persists in Euro-North American culture that fine art is done by one person. If the work is not the artist’s, then it must belong to the apprentices who worked on it.

A work on which more than one person deserves credit can easily be seen as inauthentic – and definitely not what the buyer paid for. And, possibly, a collaborative work will be less valuable than a work by a single artist, although that does not seem to have happened with the collaborations between Norman Tait and Lucinda Turner in Northwest Coast art.

Aesthetically, however, does who created the piece matter? I have not seen the mask in the first case, but the mask in the second case is an accomplished and sophisticated work, no matter who deserves credit for it. In this light, arguing over who deserves credit seems almost crass, even though it might be legitimate grounds for a lawsuit.

My own take is that the acquaintance in the first case has no reason to complain, given the traditional relationship between artist and apprentice. Letting an apprentice finish a mask may seem high-handed, yet it was done so openly that the artist probably did not intend to deceive. Nor does it seem likely that an artist would let an inferior piece out of their hands; to do so would affect their reputation. Before my acquaintance received the mask, it was almost certainly finished to the artist’s usual standards, no matter who did the work.

The second case seems just as clear. Although the apprentice’s hands may have been on the tools, the artist seems to have guided the making of the mask at every stage. Just as with the first case, conventional practice would attribute the mask to the artist, and not the apprentice. If the artist was feeling generous, they might acknowledge the apprentice’s contribution, but they are not obliged to.

Of course, in real life, the matter is not so simple. The buyer in the first case and the apprentice in the second might have a lawsuit over the expectations created by their positions. And possibly the apprentice might try to assert ownership and create a miniature nightmare for the buyer of the piece.

However, based on common practice, I doubt that either would get very far. If the facts are anything like those I’ve summarized, then by precedent, the artwork should be attributed to the artist, and not the apprentice.

Read Full Post »
