
Archive for the ‘Personal’ Category

“I don’t think I have ever been quoted as well by a reporter before.”

I like to think that I’m immune to compliments, but this comment from someone I interviewed earlier this week got through my defenses. I interpret it as meaning that I reported what the person said accurately.

Or, to be exact, my reworking of what the person said was a close reflection of their thoughts. Because, of course, no journalist quotes an interview subject word for word – unless, that is, they want to portray the subject as an incoherent half-wit. If you have transcribed as many interviews as I have, you’ll know that even the most fluent speaker can be made to look rambling and dull by quoting every little pause, space-filler, and change of direction in thought. To make the story read better, all journalists routinely edit quotations for continuity. If they are also ethical journalists, they do so while making sure that they preserve the sense of what the subject said.

At any rate, the comment pleased me, because accurately reflecting what someone has to say is a skill on which I pride myself. When I pitch a story to my editors, I rarely have a fixed opinion on the subject, except when I’m writing a commentary. Instead, I want to write the story because I’m interested in learning more about the subject. My opinion emerges as I research the story and talk to different people; on those occasions when I do have an opinion on a subject, I frequently alter it as I develop the story.

This habit does little to soothe the nerves of potential interviewees who ask what my perspective on the subject we’ll discuss happens to be, or ask in advance for the questions I want answered. If I were being completely honest, I’d have to explain that, in most cases, I don’t have the least idea what the story’s perspective will be. Similarly, while I jot down topics I want to cover, I rarely prepare specific questions. When I do, the resulting article is never an example of my best work. Instead, I develop my questions while listening to the interviewee. But these explanations, I suspect, would not be believed by the suspicious. They’d be sure I had a hidden agenda. The more I explained, the more paranoid they would become.

All the same, they’re the truth. While I taught in an English Department when post-colonialism was the prevailing critical theory, I’ve never been a believer in completely subjective truth. At the risk of sounding naive, I believe, if not in objective truth, then in the effort to find it. I’m well aware that my bias creeps into everything I write, regardless of my intentions, but I don’t believe that my perspective is endlessly interesting, so I try to vary it with the opinions of those whom I talk to.

That’s not to say that I don’t have a definite viewpoint by the time I finish an article – although I do try to subdue the expression of it, because I happen to think that a gently-delivered truth that guides readers to the conclusions I want them to reach is more effective than a thundering oration. But when I want to persuade people to accept my outlook, I try to develop my points as accurately as possible, so that they are easier to accept logically.

So, yes, I do try to report the gist of what other people say. It is both part of my code of ethics and part of my style of discussion to do so. No doubt I often fail, both ethically and stylistically, but such are my ideals – and I’m warmed by the thought that someone has noticed.

Read Full Post »

After writing professionally on the web for several years, I’m no stranger to careless readers and wonky comments. I learned long ago that not only can you not count on everyone to read your thoughts carefully, but that some subjects, such as Microsoft’s intentions towards free software, cause many people’s critical faculties to go on holiday. But none of the subjects I’ve written about provoke as much blind reaction as the suggestion that grammar should be descriptive rather than prescriptive.

I was reminded of this fact recently when I noticed that my article “Tech-writers, Grammar, and the Prescriptive Attitude” had not survived a redesign of the Techwr-l site where it was originally posted. Because I still get requests for it, I asked Deb and Eric Ray, who maintain the site, to send me a copy of the published version, and posted it on my blog. The request also prompted them to repost the essay on the original site. In the days since, I’ve been fielding a comment or two per day from both sources, most of them sent privately.

Some people approve of the sentiments in the article, but perhaps half are outraged. They set out to correct my thinking by pointing out that, without consistency, language ceases to communicate. Their assumption seems to be that only official English has consistency. I answer that all forms of English have their own rules – for instance, a double negative, which is considered wrong in official English, is perfectly understandable within the context of some Afro-American dialects (and, I might add, Old English). So far, none of these conversations have continued beyond this point.

Another conversation about the article begins with someone asking if I’m saying that we (by which the reader usually means sophisticated, literary folk like themselves) should go along with mispronunciations or incorrect uses, usually in sarcastic tones that suggest that, of course, I mean no such thing. I reply that I do mean exactly that: when a pronunciation or usage reaches a certain degree of popularity, it becomes standard usage.

To this comment, my correspondent usually replies that it is the duty of the literate to fight against such barbarisms. My response is that you are, of course, free to use any language that you care to, but, if you imagine that your example is going to inspire an outbreak of proper usage, then you think far too much of yourself. The most anyone can do is avoid vague usages in the name of clarity and personal style – and, at that point, another conversation peters out.

And these are just the most common ones. I’ve been accused of advocating complete chaos, of insisting on poetic language at the expense of clarity, and all sorts of other stances that have nothing to do with what I wrote, no matter how you construe my words. Often enough, people insist that I believe something that I frankly state that I do not believe. Apparently, many people – particularly those who work with words – have so much invested in their self-image as initiates into the secrets of proper grammar that any suggestion that their knowledge is as useless as heraldry immediately robs them of their ability to read and analyze text.

Personally, I find the idea that we have not one but dozens of versions of English exciting and challenging. It means that, as a writer, I have more to explore than I can possibly learn in one lifetime. But that, unfortunately, seems to be a minority viewpoint.

A good thing that I thrive on being a contrarian, or I might find the hostile responses disheartening.

Read Full Post »

It is impossible to experience deja-vu for the first time.
I reckon the first time you experience deja-vu must be the second.

– Les Barker

These days, I can’t go to a networking event without meeting at least two or three people who are hoping to start their own high-tech business. Taking “Web 2.0” and “social networking” as their personal mantras, these contacts sound eerily like throwbacks to the dot-com boom. Enough time has passed, I suppose, for people to forget the lessons of that first infatuation with technology. As a survivor of that first era, I could tell them a thing or two, but mostly, I don’t bother. They wouldn’t thank me.

If the old dream were just about quick money, then the whole thing wouldn’t be so painful. Most of the dreamers are going to fail, and that’s a lesson that can hurt, but can be valuable. If you find that your thirty thousand stock options are worthless in one company, you can always do what I did, and get another thirty thousand from your next company, continuing the process until reality sets in. You learn about persistence, and eventually you learn that hard-slogging work pays in smaller but more reliable returns – both useful lessons.

But, just like the dot-commers, the Web 2.0 generation isn’t only concerned about money. Most of its members would happily settle for survival as the owners of their own small business. Still more are attracted by the idea of being involved with something larger than themselves, by the sense of belonging that comes with being part of the biggest trends of the era. And it’s this sense of purpose that is likely to shatter on the pavement when reality sweeps their feet out from underneath them.

Take me, for instance. At my first dot-com startup, the pay was three-quarters of what I had been earning as a consultant. I never did believe – not really – that the company would go public and my stock options would let me retire. What concerned me was that we (and it says something about the spirit of the times that, for a non-team player like me, there was a “we”) were going to change computing by introducing GNU/Linux to the world.

Moreover, as the first non-developer hired by the company, I was playing a leading role (maybe the leading role, in my own mind) in making that dream a reality, cutting bundling deals, hammering out a features list, going over legal contracts and licenses, and discovering all the other thousand and one things needed to bring a product to market.

My second company offered much the same – only better, because this time I was working with big names in the field and being flown across the continent for the sake of my expertise.

Was I self-important to the point of blindness? No question. But other parts of my life were at an absolute nadir, and the dream gave me some desperately needed meaning. It’s because I remember that desperation that I don’t want to spoil things too much for this next generation of dreamers. Let them dream while they can.

Of course, if they did ask, I would warn them that being tipsy with meaning doesn’t mean that they should abandon common sense. Half-intoxicated as I was, I never could see why those around me were working long extra hours when they didn’t need to, or sleeping in the cardboard boxes that file cabinets came in, just so they could have the full experience (in the same spirit, many people line up for hours for tickets or Boxing Day sales – not out of necessity, but because they don’t want to miss the excitement). Nor could I see the point of those who hung on after I left, working for half pay and then deferred pay, or staying loyal right up until they were laid off. Too many dot-commers forgot in their quest for personal meaning that business remains business, and my only personal claim to foresight is that I twice remembered that simple fact and ejected before the crash came.

If asked, I would also tell them about my post-dot-com survival, about how, after feeling yourself part of the avant-garde, laboring to produce dull and sensible things that people actually want to buy seems pointless and bland. And if you once believed that you were not only in the avant-garde, but leading it, then life in an ordinary office under managers and executives who know no more – and sometimes less – than you do becomes simply an exercise in sustained frustration. I would warn them that their experiments with meaning and work will make them unfit for anything except becoming consultants in their own small business.

Not that this role is an unsatisfying one – far from it, I would say. After all, it’s the one that I chose. But unless what you really want is not just purpose, but control of your life, it would be cruel to encourage anyone down this twice-trodden path. You’ll only be disappointed and unhappy, unless you are one of that handful who truly wants that direction in life, one of those for whom the boom-gone-to-bust (and it always goes to bust sooner or later, believe me) means a hard-won chunk of satisfaction.

Like I said, I could tell this new generation of dreamers these things, but they wouldn’t appreciate hearing them. So I try not to intrude on their dreams, and smile fondly as I hear their excited talk of commitment.

Goddamned kids with goddamned stars in their eyes. I hope they enjoy the roller coaster, and appreciate the ride when they stagger away.

Read Full Post »

Railroading on the Great Divide,
Nothing around me but the Rockies and sky,
It’s there you’ll find me as years go by,
Railroading on the Great Divide.

– Traditional

Reading The Globe and Mail a couple of days ago, I learned that folk-singer Bruce “Utah” Phillips had died on May 23. He had been ill since last summer, so the news wasn’t exactly a surprise. And with his unkempt gray beard, he had always looked a decade or two older than he was, so I had been expecting to hear of his death for some years. But I suppose that one of the consequences of growing older is that you have to watch your heroes die off one by one. And Utah Phillips was certainly one of mine.

As we come marching, marching, marching, unnumbered women dead
Go crying through our singing their ancient cry for bread.
Small art and love and beauty their drudging spirits knew.
Yes, it’s bread we fight for — but we fight for roses, too.

I first heard Utah at one of the early Vancouver Folk Festivals. In fact, except for a concert or two, most of the places I heard him play or talked to him (casually, and so little I would have been surprised to hear that he remembered me) were at one Vancouver Folk Festival or another. He was never the greatest guitar player – no one came to hear him for his guitar, Kate Wolf is supposed to have said when she coaxed him out of retirement even though his fingers were stiff – and his voice was never more than adequate. But he was one of those people who know how to put a song across. There was a sincerity and passion in his voice that was infectious. Just hearing it could inspire you.

We have fed you all for a thousand years,
And we hail you still unfed,
Though there’s never a dollar of all your wealth,
But marks the workers’ dead.

Another part of his appeal was his material. Few people today know or perform the old Wobbly songs – the material used by the Industrial Workers of the World in their agitation. But, to the extent that anybody does know or perform them, they do so because of Utah Phillips. Often reworkings of popular hymns (“so they made more sense,” as Utah liked to put it), and inevitably attributed (often dubiously) to Joe Hill, they were a glimpse of the past outside the one provided by official history, and often wickedly humorous. “The Popular Wobbly,” “Where the Fraser River Flows,” “We Have Fed You All for a Thousand Years” — there was a time when those were the songs I played several times a week, whose words I memorized and whose tunes I went around humming.

I spent my whole life making somebody rich,
I busted my back for that son of a bitch,
And he left me to die like a dog in a ditch,
And he told me I’m all used up.

Occasionally, these forgotten songs would be joined by Utah’s own compositions, such as “Enola Gay,” “All Used Up,” and “The Goodnight Loving Trail.” No doubt influenced by the material he was keeping alive, they were examples for me as a young man of the attitudes I needed to survive and think well of myself. For better or worse, I am who I am today in some part because of Utah Phillips’ songs.

What will I say when my children ask me,
Where was I flying up on that day?
With trembling voice I gave the order
To the bombardier of Enola Gay.

Then there were the stories that he came to tell with increasing frequency as he grew older. Some were tall tales with a hilarious, sometimes political point to them, but the best were stories about his life on the road. Early on, I inferred that Utah was not a natural anarchist or pacifist, but that he had done his best to reshape himself into something like the man he wanted to be. Not being naturally those things myself, I was fascinated to hear the bits and pieces of his life story that he had worked into stage material.

I have lead a good life, careful and artistic,
I will have an old age, coarse and anarchistic.

I remember, too, the outrage at one folk festival, when Utah’s children’s concert involved stories about how to get meals for free at a restaurant. Somehow, I’m sure that he delighted in shocking the trendy leftist parents as much as he enjoyed talking to the kids.

Hallelujah, I’m a bum, hallelujah, bum again,
Hallelujah, give us a handout to revive us again.

For about a decade now, whenever we tired of the second- and third-rate poets and dub artists that the Vancouver Folk Festival seems determined to inflict on us in the name of attracting a younger audience, we could always count on one of Utah’s sessions to provide both solid entertainment and inspiration.

Are you poor, forlorn and hungry,
Are there lots of things you lack?
Is your life made up of misery?
Then dump the bosses off your back.

That’s gone now, but that’s not why we’re not going to the festival this year. The reason we’re not going (or one of them) is that we can’t stand the thought of the mythologization of Utah that will undoubtedly be going on. In the process of remembering him, the festival officials are likely to try to turn him into a kindly old eccentric, and, while I can’t say I knew him well (or even at all, really), I know that he was more than that. He was an original, and someone who tried to live what he believed, and he deserves to be remembered with all his human imperfection. I’d like to remember him as he was, so I’ll leave others to the creation of comforting lies about him and remember him by putting on one of his old LPs instead.

I dreamed I saw Joe Hill last night
Alive as you and me,
Says I, “But Joe, you’re ten years dead,”
“I never died,” says he.

Read Full Post »

At 5’9” (175 centimeters), I am on the short side of average for a man born in North America. However, I’ve never felt my lack of height. For one thing, the last few decades’ immigration from countries with a traditionally lower-protein diet makes me closer to average in a crowd. For another, like many shorter men, I have the self-concept of a much larger man.

I don’t mean this in any metaphorical sense. Contrary to stereotype, I feel absolutely no need to compensate for a lack of inches with an aggression that might be mistaken for steroid popping, or with an exaggerated sense of competition.

Nor am I likely to order an invasion of northern Italy or the reoccupation of the Rhineland to compensate for my lack of height. My own insecurities lie elsewhere.

Rather, I’m being literal. You see, until I was fourteen, I was tall for my age – I just didn’t grow more than a few centimeters after that. When I was entering adolescence, I was one of the two or three tallest and stockiest boys in my school, and that left a mark on me.

Objectively, I know I no longer physically dominate my immediate space, but, on some basic level, I seem to believe that I still do. Usually, I don’t think much about my height, but, every once in a while, I notice how tall some other person – usually a man – is in relation to me, and get a surprise when I realize that they are actually taller than me. Even after all these years, I still expect to be one of the taller people around. Nor have I entirely forgotten what being tall is like.

If you’ve never been tall for your age, you might not realize that, whatever other facets your personality has, height gives you a privileged position. Even if you are shy in other ways – and in adolescence, I could be extremely shy – you get used to being the first to catch everyone’s eye when you enter a room, and to being able to see from the back of a crowd. That can be especially pleasing if you’re single, because members of the opposite sex will notice you, if only briefly.

Also, extremely short people seem fragile to you, so much so that you may feel a condescending pity towards them out of all proportion to any real difference in strength. There are still times when having to elbow my way to the front to see seems an assault on my dignity, and I feel odd at times, knowing that a very tall person is looking at me and thinking dismissive thoughts.

Another characteristic of being tall is that you tend to have two distinct ways of using the space around you. If you are feeling anti-social or aggressive, as a tall man you claim as much space as possible, sitting with your legs forever sprawled out in front of you and your arms spread out along the back of your chair or couch. By contrast, if you are more polite, you are careful not to claim more than your share of space, especially around women. Sometimes, you may even claim less space than you need to be comfortable, as a courtesy. Even today, I tend to behave like a polite tall man, although if I’m feeling contrary towards someone, I occasionally find myself claiming all the space I can at their expense.

All these responses are muted in me today, but I remember them, and I am still aware of their remnants. They took a long time to fade, partly because my mental image of myself – like most people’s – is always a few years behind the physical reality, and partly because the endorphins and adrenalin addiction of heavy exercise gave me a separate reason for physical confidence. When I realized around the age of twenty that I would be a short man, I was unsettled, because the thought was so unexpected.

Still, if I try, I can remember being tall, and that occasionally works for me. Because I know what physical confidence looks and feels like, I can summon up the illusion well enough that it has helped me bluff my way through some of the rare occasions of threatened violence in my life. More often, because I’m aware of the way that the tall use personal space when they’re bullying, I have been able to ignore it during negotiations, or nullify it by mirroring it. Like a butterfly whose coloration mimics that of a poisonous species, I am only mimicking in these situations, but bluffing can be a useful skill to have.

Otherwise, contrary to the impression you might get from this post, most of the time I don’t worry about my lack of height. Sometimes, I think I might have been a better runner if my calves were a few centimeters longer and in accordance with classic proportions, but that’s about it.

If anything, these days I take a wry pride in my size. After all, a bullet is a small thing, but with a gun to give it velocity, it becomes deadly. And if someone I’m negotiating with under-estimates me, for my height or any other reason, I’m quite content to let them; their attitude means that I can enjoy triumphing over them without feeling guilty.

But in the end, I can’t do anything about my height – less, really, than my hair color or the shape of my nose, since boots with heels make me clumsy. For the most part, I tend to dismiss the topic as irrelevant, except as a source of amusement at the foibles of both others and me.

Read Full Post »

A correspondent tells me that Boycott Novell’s Free Software Credibility List gave me a rating of three on a six point scale (I could link, but I don’t want to give the site any more hits than I have to). Until hearing this news, I didn’t know about the list, because, so far as free software is concerned, I only read news sites and blogs with either technical knowledge or expert commentary. Usually, too, I make a habit of not commenting negatively in public on anyone with a claim – no matter how remote – to being a journalist. At the very least, I generally don’t mention them by name. However, since my informant seemed to think I should be upset, I’m making an exception here.

To be honest, I am more amused than angry about a list whose silliness is exceeded only by the self-importance of its owners. I mean, how does any journalist, no matter how skilled a word-slinger, get the same rating as Stallman, the founder of the free software movement? Yet several do. And why are authorities like Eben Moglen off the list?

I also notice that, at least in some cases, the list seems a direct reflection of how closely a journalist’s opinion corresponds with Boycott Novell’s, rather than any criteria that might be mistaken for objectivity. Robin Miller, the senior editor at Linux.com, is apparently denigrated because he took a group tour of the Microsoft campus a couple of years ago (I’m sure the fact that he presided over a podcast in which a Boycott Novell writer performed poorly has nothing to do with his ranking). Other writers seem to rate a 4 or 5 largely because they stick to technical matters and, rarely talking about philosophy or politics, say nothing for Boycott Novell to dissect for suspect opinions.

Strangely, the Boycott Novell cadre didn’t rate their own reliability, although whether that is because they are assumed to be the only ones who rate a perfect six or because the ranking doesn’t include negative numbers, I leave as an exercise to the readers.

From the link attached to my name, my own ranking seems based on the fact that I accepted that a comment signed with a Boycott Novell writer’s name really was by him; when he said it wasn’t, I accepted the claim and he suggested that I was owed “some apologies.” Yet, apparently I’m permanently branded as being only marginally trustworthy because of this minor incident. I suspect, though, that the writer’s belief that I lumped the Boycott Novell writers into the category of conspiracy theorists has more to do with my ranking than anything else.

But these foibles don’t disturb me unduly. Far from being upset, I’m glad of the list, because it gives me a goal. If I write consistently hard-hitting articles in which I dig carefully for facts, build a flawless chain of reasoning, and tell the truth no matter how uncomfortable the consequences, then maybe – just maybe – in a few years Boycott Novell will reward me with the ultimate accolade of a zero ranking. Then I’ll know that I have truly arrived.

And that is all that I intend to say on this subject. Ever.

Read Full Post »

When I was running cross-country in high school, my coach was blunt and unpretentious. One boy who briefly tried out for the team kept talking to him about getting his second wind (whether because he hoped to reach that mythical state or for some other reason, I could never figure out). I used to be embarrassed for him, because I knew the coach was too straightforward to talk in such elevated terms. In his view, you just ran – you didn’t talk about it. I must have absorbed some of the coach’s matter-of-factness, because when I see how some people at the gym try to elevate the simple act of exercise, the same feeling of embarrassment on their behalf floods over me.

The self-aggrandizement starts with their clothing. Naturally, exercisers need a pair of shoes that will give them support, and at least a sweat suit for warmth and dryness. However, these needs are simply met. For all the exercise most people do, they can probably find an adequate pair of shoes for under $100. If they find a sale, they might get away with as little as $50. But, to hear people at the gym talk, anything less than a $200 pair of shoes, and they’re risking crippling themselves for life.

The same goes for shorts, T-shirts, and everything else that they’re wearing. Never mind that they are lifting weights, or only spending twenty minutes on the treadmill. They talk as though they’re planning an Arctic expedition, and one false economy will leave them to suffer the fate of Franklin.

In the same way, I notice that nobody can undertake a workout nowadays without a water bottle. I even hear the trainers who give personal sessions at the gym solemnly warn people never to exercise without their water bottles nearby, and to take a sip every ten minutes or so. You’d think they were planning to run a marathon across Death Valley in the middle of a summer afternoon.

All of which leaves me, whose workout lasts an hour and ends with a few sips of water before I jog home, more than a little amused.

But the worst are the grunters. You know the ones I mean: the ones who are unable to lift the lightest weights without providing their own soundtrack of agony. Typically, they stand in front of the mirror, motionless for a minute, then heave their weights towards the ceiling, contorting their faces and grunting or moaning as if they had just pulled a leg muscle. Apparently, they claim that their noises are the equivalent of a war-cry, helping them to focus their energies.

Maybe. But I’d be far less skeptical if they were lifting a hundred kilograms rather than twenty.

What all these behaviors have in common is that they take the very simple act of exercise and try to make it more dramatic. In the process, the people who indulge in these behaviors make themselves and their actions feel more significant.

Personally, I always wonder: Why can’t they just get on with their exercise? They won’t have a better workout for any of these behaviors, and they probably won’t impress anyone who overhears them, either.

Read Full Post »

(Note: I wrote the following article six years ago for the Techwr-l site. For a long while, it was my most-requested article. Recently, I noticed that a site makeover had taken the article down. By now, it may be posted again, but I thought I’d reprint it here, although it’s very long by the standards of blog postings. Although addressed to technical writers, almost everything in it applies to writers and writing in general.)

Most technical writers are confused about grammar. On any day on the TECHWR-L list, basic questions are asked: “Is ‘User’s Guide’ or ‘Users’ Guide’ correct? Maybe ‘Users Guide?'” “Should ‘web’ be capitalized when used to refer to the World Wide Web?” “Which is right: ‘A FAQ’ or ‘an FAQ?'” Many of these questions become the major thread on the list for a day or two, generating far more debate than they’re worth.

The confusion isn’t so much about the grammatical points themselves. It’s about the nature of grammar in general. Apparently, many tech writers do not see grammar as a set of conventions to help them write clearly. Instead, to judge by the wording of the questions and responses, they see grammar as a set of unchanging rules that can provide definitive answers in every situation.

Some are afraid to break the rules of grammar and risk being denounced as incompetent. A handful, smugly sure that they know the rules, use their rote learning of the rules as an ad hominem attack, nitpicking at typos and small errors to discredit writers without disproving their viewpoints. Most sit in the middle, haunted by the ghosts of childhood grammar classes until they can hardly tell on their own authority whether they are writing well or not. But underlying all these reactions is an attitude that rules are rules, and cannot be broken.

This attitude is usually known as a prescriptive approach to grammar. It assumes that grammar exists mainly to tell us how to speak or write properly–not well. It is an attitude that tech writers share with almost everybody in the English-speaking world. It is a form of conditioning that begins in kindergarten and continues through high school and even into college and university. It undermines nearly everyone’s confidence in their ability to communicate, especially on paper. Yet it is especially harmful to professional writers for at least three reasons:

  • It grotesquely exaggerates the importance of grammar. Although competence in grammar is sometimes proof of other writing skills, it stresses presentation over content. Even worse, it stresses correctness over precision, conciseness, or clarity.
  • It binds writers to viewpoints that are not only arbitrary and obsolete, but, in some cases, far from their own opinions.
  • It undermines writers’ confidence and their ability to make decisions about how to communicate effectively.

Why are we burdened by this attitude? How does it affect us? The easiest way to answer these questions is to look at the origins of the prescriptive attitude and the alternatives to it. Only then can we begin to grasp how we can live without it.

The Rise of Prescriptive Grammar

Prescriptive grammars are the products of the Enlightenment. Earlier grammars such as William Bullokar’s in 1586 and Ben Jonson’s posthumous one were also prescriptive, but intended for language students. The first prescriptive pronouncements for native English speakers date to the Seventeenth and Eighteenth Centuries.

This is the start of the great era of describing and recording. In every subject from biology to Egyptology, educated men struggled to write accounts so thorough that no other one would ever be needed. It was also a time that looked both backwards and forwards to Classical Rome. It looked back in the sense that ancient Rome was seen as the height of civilization. It looked forward in two senses: England perceived itself as a second Rome, and Latin was the language of international science.

The first prescriptive comments were very much in the spirit of their times. Their compilers hoped for definitive grammars that would fix the form of English once and for all, and provide a source for settling grammatical disputes. Since Latin was the language of the world’s most sophisticated civilization, the closer this fixed form of English was to Latin, the more sophisticated English would be. Moreover, given the belief that culture had been degenerating since the fall of Rome, most grammarians automatically equated any change in the language with decay and degeneration.

John Dryden, the poet and playwright, was among the first to make prescriptive pronouncements. His main contribution to prescriptive grammar was to suggest that prepositions should never end a sentence. The reasons for this prescription are that Latin sentences rarely end in prepositions, and that the word “preposition” clearly indicates that this part of speech should go before (“pre”) the noun it is associated with. Dryden criticized Shakespeare and Jonson for not following this rule, and scrupulously edited his own writings until they conformed to it.

Dryden also hoped to fix the form of English, which was still rapidly changing. Together with the diarist John Evelyn and other authors, Dryden called for an English version of l’Académie française–the body of scholars and writers that oversees the creation of the official French dictionary and is the arbiter of correct French. Dryden’s plea for an English Academy was echoed later by Daniel Defoe and Jonathan Swift. The idea was being seriously considered by Queen Anne when she died in 1714. But with the accession of the German-speaking George I, the question shifted from the monarch helping to purify English to teaching the monarch English, and the idea was dropped.

Deprived of royal assistance, English men of letters did their best to improve the language on their own. In the mid-Eighteenth Century, a number of grammars were published, as well as Samuel Johnson’s famous dictionary, whose Preface spells out its prescriptive purposes with both succinctness and dry wit. All these works were heavily prescriptive, although Joseph Priestley did include some comments about the importance of common usage in deciding what was proper.

The most influential of these grammars was Robert Lowth’s “Short Introduction to English Grammar,” published in 1761. Criticizing almost every major English writer from Shakespeare to Pope, Lowth made most of the prescriptive statements that people still follow today. His prescriptions include:

  • Two negatives make a positive, except in constructions such as “No, not even if you paid me.”
  • Never split an infinitive.
  • Never end a sentence in a preposition.
  • “Ain’t” is unacceptable in formal English.

These ideas have several sources: An attempt to model English on Latin (and therefore to arrive at a universal grammar that underlay all languages), a desire to be scientific, and Lowth’s personal preferences. For instance, Lowth would not accept split infinitives because Latin infinitives are one word and cannot be split. Similarly, two negatives make a positive because they do so in mathematics. The ban on “ain’t,” though, seems entirely Lowth’s idiosyncrasy. None of these prescriptions, however, took any notice of how people actually spoke or wrote. For example, despite Lowth, “ain’t” continued to be used by Queen Victoria and the upper classes until the start of the Twentieth Century.

Although Lowth later became Bishop of London, his ideas on proper usage would probably have remained obscure if they had not been borrowed for classroom use. In 1781, Charles Coote wrote a textbook grammar based on Lowth’s Grammar, adding his own preference for “he” and “his” as the indefinite personal pronoun (as in “everyone is entitled to his opinion”). A few years later, Lindley Murray borrowed from Lowth to write a series of textbooks for a girls’ school. Murray’s textbooks became so popular that they quickly became the standard English texts in American schools throughout much of the Nineteenth Century. With modifications, Murray’s “English Grammar” and Lowth’s Grammar have been the basis for textbook grammars ever since. With their unyielding rules, these textbooks have given at least eight generations of English-speakers the prescriptive attitude that inhibits people today.

The Biases of Prescription

In their language and their purposes, prescriptive grammars make a strong claim to objectivity. A. Lane was typical of the first grammarians when he wrote in 1700 that the purpose of grammar was to teach people to speak and write “according to the unalterable Rules of right Reason.” Similarly, in writing his dictionary, Samuel Johnson humorously referred to himself as a “slave of Science.” These claims are still echoed when modern defenders of the prescriptive attitude assume that the rules of grammar are value-free. Yet a closer look at prescriptive grammar reveals distinct biases. Some of these were openly admitted by the first grammarians. The problem is that some of these biases are no longer current. Others are demonstrably false.

The early grammarians’ claims to be scientific or precise concealed a strong personal bias. This bias was by no means a conspiracy–it was simply natural self-expression. Grammarians were highly educated, or the subject would hardly come to their attention at all. Since education was a privilege, they were either well-to-do or extremely talented. Since the higher levels of education were barred to women, they were male. And, as might be expected from the subject, they were all intensely literate.

While this background made the first grammarians supremely qualified for literary and scholarly work, it also carried a certain arrogance. The first grammarians showed scant interest in how English was used outside their own class and social circles. All of them assumed an unwarranted authority in their subject, appointing themselves as the arbiters of the language without any justification except their willingness to serve. Robert Lowth, for example, did not hesitate to include his own idiosyncrasies as grammatical rules. In much the same way, Samuel Johnson saw it as his “duty” to purify English usage through his dictionary. This same attitude continues in the prescriptive attitude today.

Moreover, the first grammars were a direct reflection of their authors’ bias and education. The problem is, the education of the Restoration and Enlightenment includes assumptions that we would question today. For example:

  • Rome is the model for all things: In fact, Latin is a poor model for English. Although both languages are Indo-European, the relation is indirect. Even the heavy influence of French, a Latin-derived language, via the Norman Conquest, does not make English’s structure much closer to Latin. At its core, English is a Germanic language, and if the first grammarians had used Dutch, Swedish, or German as a model, they would have had no precedent for objecting to split infinitives or double negatives. However, the Germanic origins of English were poorly understood in Britain during the Seventeenth and Eighteenth Centuries. At any rate, the first grammarians would probably have considered Germanic models too crude to replace the polished perfection of Latin.
  • Writing is the basis for English grammar: Like other grammarians, Samuel Johnson assumes that the principles of grammar should be taken from the written form of the language. Since Johnson was a writer himself, this assumption is understandable. However, modern linguistics regards written usage as simply one of many types of English, none of which is more valid in the abstract than any of the others. If anything, the spoken language is usually given greater priority today, partly because it tends to be the source of innovation, and partly because it reveals how people use the language when they are not trying to write correctly or formally.
  • Change is degeneration: When the narrator of “Gulliver’s Travels” visits Laputa, he is shown a vision of the Senate in Ancient Rome, then the modern English Parliament. The Senators look like demigods and heroes, the Members of Parliament scoundrels and ruffians. In writing this passage, Jonathan Swift reflects a widespread belief in his time that times are getting continually worse. This fallacy is the exact opposite of the modern one of equating all change with progress.

Applied to the English language, Swift’s view has little basis in fact. Admittedly, English has simplified itself over the centuries by dropping most noun declensions and verb conjugations, but that has not made it less useful for communication. Nor, despite the delicate shudder of prescriptive grammarians, does the shift in meaning of “cute” from “clever” to “attractive” in the early Twentieth Century or of “gay” from “happy” to “homosexual” in mid-Century weaken a language with as many synonyms as English. When clumsy or unclear constructions do arise (such as the use of “not” at the end of a sentence), their impracticality generally ensures that they are brief fads.

At any rate, what is degenerate and what is progressive is often a matter of opinion. To J.R.R. Tolkien, the Anglo-Saxon scholar and author of The Lord of the Rings, the hundreds of words added by Shakespeare and his contemporaries are a corruption that ruined English forever. Yet to most scholars, these coinages are an expression of a fertile inventiveness and part of the greatest literary era ever known.

  • London English is standard English: Like most English writers, the grammarians and their publishers centered around London. The language of the upper classes in the Home Counties had already become Standard English by the Fifteenth Century, which is why many people have heard of Chaucer and few people have heard of (much less read) “Sir Gawain and the Green Knight,” a brilliant poem by one of Chaucer’s contemporaries, written in an obscure North Country dialect. That is also why Chaucer and Shakespeare poke fun at other dialects–they already had the idea that some forms of English were better than others. In basing their work on the English spoken around London, the grammarians were simply working in a long-established tradition.

Far from seeing their biases as contradicting their claims to scientific objectivity, the grammarians openly proclaimed their goal of saving English from itself. In fact, by the standards of the time, proclaiming their educational and cultural assumptions was a means of asserting their ability to be objective on the matter.

Of all the early grammarians, the one most keenly aware that prescriptive grammars were biased was Noah Webster, the writer of the first American dictionary. Working on his dictionary between 1801 and 1828, Webster was not content simply to record how words were used. Instead, he was also concerned with producing a distinctly American language. To this end, Webster not only included words such as “skunk” and “squash” in his dictionary, but also introduced American spellings, such as “center” instead of “centre.” In addition, he encouraged the spread of a uniquely American pronunciation by consistently placing the stress on the first syllable of the word. In other words, Webster attempted to deliberately manipulate the use of English in the United States for patriotic reasons. Whatever anyone thinks of those reasons, Webster’s efforts are one of the best proofs that prescriptive grammars are not as value-free as many people imagine.

The same is true today in the debate over whether “they” can be used as the indefinite pronoun instead of “he/his” (“Everyone is entitled to their opinion”). On the one hand, traditionalists who insist on “he/his” are perpetuating the male bias of the first grammarians. On the other hand, reformers who favor “they” are trying to remake the language in their own world view. It is not a question of objectivity on either side. It is simply a question of which world view will prevail.

The Problem of Change

But the major problem with the prescriptive attitude is that it resists the fact that languages are continually changing. If a community newspaper constantly includes editorials urging people to drink less, the amount of concern suggests a local drinking problem. In the same way, the constantly expressed wish to set standards for the language reflects the massive changes in English at the time that the first grammarians worked. Although the rate of change was probably slower than that of the Fifteenth and Sixteenth Centuries, English was still changing much faster between 1650 and 1800 than it does today.

Some of the changes that occurred or were completed in this period include:

  • The disappearance of dozens of Old English words like “bairn,” “kirk,” and “gang” (to go). Many of these words survived in Northern and Scottish dialects for another century, but became non-standard in written English.
  • The addition of dozens of new words. Some were deliberately coined by scientists, such as “atom.” Some were borrowed from the regions that England was conquering, such as “moccasin” or “thug.”
  • The replacement of “thou” and “ye” with “you” in the second person. These forms survived only in poetry.
  • The standard plural became “s” or “es.” Only a few exceptions such as “oxen” survived.
  • The loss of all personal endings except the third person singular in most verbs (“I read,” “he reads”). Some of the older forms ending in “th” survived longer in poetry.
  • The loss of inflection in most adjectives.
  • The regularization of past tenses to “ed” or, occasionally, “t” (“dreamed” or “dreamt”).

Many of these changes are easy to overlook today because popular editions of texts from this period routinely modernize the spelling. However, the sheer number of changes makes clear that the early grammarians were fighting a rear-guard action. If they have helped to slow the rate of change in the last two centuries, sometimes they have also accelerated it; the loss of many Northern words, for example, is probably partly due to the standardization on Home Counties English. Yet, despite these efforts, English continues to change as the need arises. Many of these changes come from the least educated parts of society–those least likely to be influenced by the prescriptive attitude.

Today, all prescriptive grammarians can do is resist changes as long as possible before accepting them. This constant retreat means that most prescriptive grammars are usually a couple of decades behind the way that the language is actually used in speech and contemporary publications.

The Descriptive Alternative

While prescriptive grammars were finding their way into the schools, an alternative approach to the study of language was being developed by linguists. Imitating the naturalists of the Eighteenth Century, linguists began to observe the pronunciation, vocabulary, grammars, and variations of languages, and began cataloging them in ways that suggested how they related to each other. In 1786, Sir William Jones established that most of the languages of India and Europe were related to each other. By 1848, Jacob Grimm, one of the Brothers Grimm of fairy-tale fame, had detailed in his History of the German Language how English, Dutch, German, and the Scandinavian languages had descended from languages like Gothic. Content to observe and speculate, the early linguists developed what is now called the descriptive approach to grammar.

The descriptive approach avoids most of the distractions of prescriptive grammars. Today, most linguists would probably accept the following statements:

  • Change is a given. In fact, a working definition for linguistics is the study of how languages change. Linguists can offer a snapshot of how a language is used, but that snapshot is valid for only a particular place and time.
  • No value judgement should be placed on changes to a language. They happen, regardless of whether anyone approves of them or not.
  • The fact that one language is descended from a second language does not make the first language inferior. Nor does it mean that the first language must be modelled on the second.
  • Linguistics does not claim any special authority, beyond that of accurate observation or verifiable theory. Claims are open to discussion and require validation before being accepted.
  • No form of a language is given special status over another. Regardless of who speaks a variation of a language, where it is spoken, or whether it is oral or written, all variations are simply topics to be observed. For example, when Alan Ross and Nancy Mitford coined the phrases “U” and “non-U” for the differences between upper and middle class vocabularies in Britain, they did not mean to suggest that one should be preferred (although others have suggested that deliberately using a U vocabulary might be a way to be promoted).
  • Variations of a language may be more or less suitable in different contexts, but none are right or wrong. The fact that you might speak more formally in a job interview than at a night club does not mean the language of a job interview is proper English, or that the language of the night club is not.
  • Proper usage is defined by whatever the users of the language generally accept as normal.
  • A language is not neutral. It reflects the concerns and values of its speakers. For example, the fact that every few years North American teenagers develop new synonyms for drinking and sex reflects teenagers’ immense preoccupation with the subjects. The fact that they also develop new synonyms for “slut” reflects their sexual morality as applied to girls. A similar viewpoint is known in psychology as the Sapir-Whorf hypothesis.

To those conditioned by prescriptive grammars, many of these statements are unsettling. For instance, when I suggested on the TECHWR-L list that proper usage was determined by common usage, one list member went so far as to call the view “libertarian.”

Actually, these statements are simply realistic. Despite two centuries of classroom conditioning, average users of English have never been overly concerned about the pronouncements of prescriptive grammar. Instead, people continue to use English in whatever ways are most convenient. If the prescriptive usage is not widely used in practice, it may even sound odd, even to educated people. For example, to an ear attuned to the spoken language, the lack of contractions in an academic essay or business plan may sound stilted. In fact, an entire written vocabulary exists that is almost never used in speaking. The descriptive approach simply acknowledges what has always been the case. In doing so, it frees users from the contortions of prescriptive grammar, allowing them to focus on communication–where their focus should have been all along.

Professional Writers and Grammar

Writing well, as George Orwell observes in “Politics and the English Language,” “has nothing to do with correct grammar and syntax.” If it did, then two centuries of prescriptive grammar in the classroom should have resulted in higher standards of writing. Yet there is no evidence that the language is used more skillfully in 2001 than in 1750. The truth is that prescriptive grammar and effective use of English have almost no connection. A passage can meet the highest prescriptive standards and still convey little if its thoughts are not clearly expressed or organized. Conversely, a passage can have several grammatical mistakes per line and still be comprehensible and informative. Prescriptive grammars are interesting as a first attempt to approach the subject of language, but today they are as useless to writers as they are to linguists. So long as writers have a basic competence in English, prescriptive grammar is largely a distraction that keeps them from focusing on the needs of their work.

By abandoning prescriptive grammar, writers shift the responsibility for their work to themselves. In practice, this shift means making choices that are not right or wrong in the abstract, but, rather, useful in a particular context or purpose. For example, instead of agonizing over whether “User’s Guide” or “Users Guide” is correct, writers can choose whichever suits the situation. They can even flip a coin, if they have no better means of deciding. In such cases, which choice is made is less important than using it consistently throughout the document to avoid confusion. Even then, writers may decide to be inconsistent if they have a good reason for being so.

Similarly, the decision whether to use a particular word or phrase is no longer a matter of referring to a standard dictionary or somebody else’s style guide. Instead, writers have to fall back on the basics: Will the intended audience understand the word? Is it the most exact word for the circumstances? Does it convey the image that the company wants to present? In the same way, while the need for clarity and a factual tone makes complete sentences and unemotional words useful choices in typical manuals, in a product brochure, sentence fragments and words heavy with connotation are more common. Writers may still want to summarize their decisions in a corporate style guide, but the style guide will be based on their own considerations, not the rules that someone else tells them to follow.

None of which is revolutionary–except that, under the prescriptive attitude, irrelevant purposes are often inflated until they become more important than a writer’s practical concerns.

That is not to say that taking a descriptive approach to grammar means writing in the latest slang. Nothing could date a document faster, or be more intrusive in a technical manual. Nor does it mean abandoning technical vocabularies that are known to the audience or that make explanations easier. If anything, a descriptive approach demands a much greater awareness of the language than a prescriptive one. Instead of learning the single correct version of the language, writers who take a descriptive approach need to be aware–probably through constant reading–not only of dozens of different versions, but of how each version is changing.

If necessary, writers can use descriptive grammars such as journalistic style guides to help them. On the whole, however, the descriptive approach leaves writers where they should have been all along, deciding for themselves what helps their documents to achieve their purposes. The only difference is that, under the descriptive approach, they are fully aware of their situation. If they say anything unclear or stupid, they can no longer hide behind tradition.

Prescriptive grammar is useful for teaching English as a second language, but it has little value for the practicing writer. Clinging to it may provide emotional security, but only at the expense of making writing harder than it needs to be. The culture-wide devotion to it will not be changed in a moment. But conscientious writers can at least change their own habits, and make life easier for themselves. And, from time to time, they can even laugh some worn-out, crippling concept — such as not ending a sentence in a preposition, or not splitting an infinitive — into the recycle bin where it belongs.

(With apologies to George Orwell.)

Read Full Post »

“You’re an English major? You must be planning a career in fast food.” Comments like this haunted me from the moment I declared my major in university. But hearing the sentiment recently, I realized that it was far from accurate. The truth is, people who have a way with words can make a comfortable living in all sorts of ways, so long as they don’t limit their possibilities to the obvious.

The worst mistake that anybody with an English degree – or, in fact, any Arts degree – can make is to hang about on the fringes of academia, hoping for a tenure-track position. Ever since my undergraduate days, I’ve been hearing about all the tenured positions that are going to become available as their current incumbents retire, but, between budget cuts and the increasing tendency to hire non-tenured staff or sessionals, the positions are unlikely to materialize. People who were hoping for those positions when I left academia over a decade ago are still waiting. Meanwhile, they endure semester-by-semester contracts, last-minute hiring, and the same work as tenured faculty for half the money. That’s fine for a few years, but it’s no way to live in the long term.

The same is true of editing piecework. Just like academia, the publishing industry depends on having a constant pool of cheap work-for-hire editors. You may be one of the lucky exceptions, but the odds are against you, no matter how talented you are. Those who run the industry are careful not to employ you so much that they become obliged to offer you benefits.

Instead of lingering in limbo, waiting for the academic or literary job they once dreamed of, English majors should explore the possibilities in business. Not only is the power of self-expression in demand there, but the competition is far less fierce than in academia – partly because of the greater need, and partly because many English majors seem to consider that taking a job in business is beneath them. Often, too, they make the mistake of thinking that their writing skills are all they need, and are slow to learn the subject-matter expertise required to do the work properly.

But, if you can get beyond the idea that you are dirtying your hands and are willing to learn what you don’t know, then the jobs are there. As a technical writer, you need to write clearly and organize information for conciseness and accuracy; in many ways, the job is writing stripped to the basics. As a communications and marketing manager, writing news releases or blogs, you take on the responsibility of being the voice of the company. As a product manager, you decide how to present a product or product line, and you’ll find your skills in textual analysis serve you well when you come to deal with end user license agreements and other legal documents. As an instructor, you are reprising the role you played as a teaching assistant in grad school, the only difference being that you are teaching software or policies and procedures rather than literature or criticism.

And these are only the most obvious career paths. Writing and teaching skills aren’t a bad foundation for going on to law school, for example. Best of all, if you’ve been vying for scraps of work around academia, the first thing you’ll notice when you take one of these positions is that your yearly income increases by fifty percent or more.

Admittedly, some of these positions aren’t on the expressway to the top. Technical writers, for instance, may rise to supervise other technical writers at a large company, but they aren’t likely to become CEOs. But they can serve as entry positions, and, if you’re interested in climbing the corporate ladder, you can always expand your skill set later on. Meanwhile, you can reasonably expect a salary that puts you solidly in the upper middle class, to say nothing of responsible and often rewarding work.

Really, the only thing holding you back with an English degree is your own lack of imagination or initiative. Just because those who would have preferred the kind of education a technical college offers choose to belittle your liberal education is no reason for you to believe them.

Read Full Post »

And if you’re looking for me . . .
Hey, if you’re looking for me . . .
The boy’s still running

-OysterBand

“Aren’t you the guy who used to be running all the time?” a man I went to school with asked when we met recently in downtown Vancouver. I’ve been hearing variations of that question all my life, from everyone from clerks in local stores to potential employers. That’s not surprising, really, because, aside from reading and writing, few things have been a part of my life as much as running.

When I was in the early grades of school, we used to play tag in a little patch of woods on the edge of the school ground. I soon found that, while I was only the third or fourth fastest of the boys participating, the longer the game went on, the less likely the faster ones were to catch me. The same thing happened on the soccer field, where, by the last minutes of the game, I could outrun everybody.

However, it was in grade three that I first took up running seriously. Like many decisions in my life, it was driven by a wish to prove someone else wrong. The school’s PE teacher had assigned people from each grade to participate in a track meet with two other schools – and he hadn’t included me. Stung by this unfairness in a way that only the very young and self-righteous can be, I determined to make him regret his decision. While his chosen few practiced each morning on the makeshift track on the playing fields, I started doing laps of the school ground, sure that he would see me and be so impressed that he would have to reconsider.

If he ever noticed, he gave no sign of reconsidering. But the habit lingered, and soon I was running three mornings a week before school with several of my friends. I was reasonably athletic, although in team sports I made my mark by enthusiasm and energy more than skill, and I took running very seriously – so seriously, in fact, that when I discovered that a couple of people had cheated on an after-school training run and saw them a block ahead of me, I charged towards them, yelling “Cheaters!” at the top of my lungs, wild with rage and determined that they weren’t going to get credit for finishing first. I beat them, too – although I was probably helped by the fact that they didn’t care as much as I did.

In high school, I had more than my share of firsts in cross-country and distance running, largely on the strength of having discovered that all you needed to beat most rivals was to train every day. Moreover, since my only strategy was to rush to the front of each race and then hold on, I learned the importance of psychology. I won several races when woozy from ‘flu solely because everyone else expected me to be out in front.

Somehow, though, I didn’t get the victories in the provincial finals that everyone expected from me in grade 12. I was sick at the time, but I wonder now if the illness wasn’t an unconscious rebellion against the increasing seriousness I was finding in sport. By that point, I had spent several years in the Vancouver Olympic Club, training under the legendary Lloyd Swindell, and not only had I found several rivals, but the seriousness of the training seemed to take the fun out of the sport.

In university, the seriousness intensified. Not only that, but, as a team member, I was expected to help paint posters for other athletic events and show up to football and basketball games. Since I was commuting three hours a day to university, I couldn’t have given the time to these things if I had wanted to.

The competition was tougher, too. At eighteen, I didn’t have my full adult strength (such as it is), and I was competing with fully grown men from across North America. Increasingly, I realized that, judging from the success of some of my older peers at the university and at the Vancouver Olympic Club, I might qualify for the world championships or the Olympics in the five or ten thousand meters if I devoted four or five hours a day to training – but I almost certainly would not reach the finals, let alone finish in the medals. Reluctantly, I acknowledged to myself that I was a good runner, but not a great one, and somewhere near the end of my second semester, I ran my last race.

But that didn’t mean I quit running. Even then, it was too much a part of my life to give up. It was a form of meditation, a collection of peak moments of exertion and early morning sights that I could never abandon. I’ve run up hills in Glacier National Park while on holiday, and several thousand meters high at Mount Lassen with my lungs on fire. I’ve run through the streets of Berkeley on glorious summer mornings, along Tacoma’s skid row in the fog, and around the outskirts of Indianapolis in the snow and stabbing cold. I sometimes feel that, until I’ve run an area, I haven’t really experienced it.

Admittedly, my mileage has dropped and my speed is a joke, especially in the last few years, when I’ve started varying my exercise with swimming and cycling to spare my knees some strain. But I don’t expect to quit altogether so long as I can hobble, however slowly. I sometimes joke that I won’t consider any retirement home that doesn’t have an all-weather running track.

So, yeah, if you’re looking for me . . .
Hey, if you’re looking for me . . .
The boy’s still running.

Read Full Post »
