
Archive for the ‘technical writing’ Category

I haven’t written a manual for over seven years, so perhaps my opinions about technical writing don’t count for anything. All the same, I’m disappointed to see that writers are still being steered towards distractions such as writing personae. I can think of little that could do more to waste a writer’s limited time or cause them to be held in lower regard.

A persona is an imaginary profile that represents one segment of your audience. The ones I’ve seen are usually 200-700 words long, and typically include not simply the segment’s subject matter expertise, but also the kind of detail more suitable for a character in a work of fiction. Age, gender, job title, hobbies, dining preferences, personal likes and dislikes – all these and more can find their way into each persona. Since the audience for technical writing can have several segments, creating personae is demanding, fun – but largely a waste of time.

Admittedly, the exercise of creating a persona can help a writer fix audience segments in mind. Should they need to refresh their sense of the audience, they can simply re-read the persona. But for the most part, each of the personae that I have seen has been a nugget or two of information lost in irrelevancies.

The blunt truth is, the main facts that tech-writers need to know about their audience are their technical background and reading ability. Possibly, writers may need to know a few other facts – for instance, the audience’s languages can affect page layout – but not much more.

And even expertise and reading ability are largely irrelevant, because, rather than trying to second-guess the audience, it’s easier to explain fully and write simply. After all, you never know who might use your manuals, or if the information you’ve received about the audience is accurate.

Under these circumstances, better to stick with the basics. After all, even experts appreciate a brief explanation of context, or may need a reminder of some aspects of your subject. Write a short, coherent explanation of the tasks at hand, and you can’t go far wrong. But spend your limited time on writing personae, and you may guess wrong or find that the context changes. In other words, writing personae just isn’t efficient time management.

My impression is that personae are favored by those who stress the writing in their job title at the expense of the technical. Desperate to have developers and managers take them seriously, they champion arcane embellishments like personae in the hope of appearing to be experts and gaining the respect of those around them.

Nine times out of ten, however, such efforts fail, because they are usually made at the expense of actually learning the subject matter, and of writing and editing. The result? You’re left looking pretentious and turn in a finished manual that only reinforces everybody’s impression that you are a lightweight poseur.

If that’s your idea of being a tech-writer, fine – go ahead and fritter away your time on personae. But don’t be surprised when you don’t get the respect that you think you deserve. You’ll only get that by mastering the subject matter and presenting it usefully, not by engaging in pointless intellectual exercises.

Read Full Post »

For eight years, I made most of my income from technical writing. Not the relatively glamorous technical writing involved with writing articles about free and open source software (FOSS) – glamorous, that is, to those who haven’t done it (those of us who have done it are usually considerably less starry-eyed) — but basic how-tos and detailed instructions to accompany hardware and software. Looking back, I must have been reasonably good at the job, since I went from a beginner to a consultant with a sub-contractor in eight months, and kept myself steadily employed most of the time and well-employed much of the time.

Based on that experience, I would like to offer some advice for those who are trying to fill the gaps in FOSS documentation. It’s a thankless job, under-appreciated and laborious, but, if you’re going to attempt it despite all the disincentives, you might as well do it properly. After all, your satisfaction in doing the job properly might easily be your only reward:

  • You must become an expert in what you are writing about: Some professional technical writers pride themselves on being specialists in communication, and feel they don’t need to know the details of what they are writing about. You can always tell manuals done by them, because they are shallow and have large gaps in them. Likewise, you can always tell this type of technical writer, because they’re despised by any developer with whom they work. The truth is that, while you don’t need to be an expert when you start documenting, if you aren’t expert by the time you’re finished, you aren’t doing the job properly.
  • It all comes down to structure: Anybody with average intelligence or better can learn to write a coherent sentence or paragraph. However, structuring several hundred pages is hard work – much harder work than the actual writing. The need to structure is also why you need to become an expert in your subject; if you’re not, how can you know what information to put first, or what’s missing? Don’t be surprised if you spend 50-75% of your time in planning the structure, or if your first outline changes drastically as you work. Both are indications that your work is developing the way that it should.
  • In the majority of cases, the best structure will be a list of tasks, arranged from the most basic or earliest to the most complex or latest: It will almost never be a list of menu items and taskbar icons, except in brief introductions to the interface. This task-orientation is a major reason why you need to be an expert in what you are writing – if you’re not, you won’t have any idea of what users might want to accomplish.
  • Think of your audience as being attention-deficit: Knowing your material is necessary, but it can also make you forget what new users need to hear. The best way to write to the level you need is to project yourself imaginatively into the position of a new user, but, if you can’t manage that, imagine that you are writing for people with low attention spans who are easily bored. The result may spell out the obvious for some readers, but other readers will be glad that you are thorough. Always remember: What is obvious to you isn’t obvious to your readers.
  • Don’t worry about style: In fiction, writers often call attention to their style. By contrast, non-fiction like technical writing is not about you. Your job is to provide simple, clear prose in which you are invisible. And if that sounds boring or unchallenging, you might consider Isaac Asimov’s observation that stained glass windows have been made for over a millennium, while clear glass was a much later development. In many ways, writing simply and clearly is much harder than writing with flourishes and personality. Focus on clarity and content, and let other style considerations take care of themselves. You’ll be surprised how well they work out without you thinking consciously about them.
  • Use structured prose whenever possible: Bullet lists, numbered lists, tables, and callouts on diagrams – all these techniques are more concise and easier to understand than straight prose.
  • Your first draft is probably going to be terrible: But that doesn’t matter, so long as it improves by the end. What matters in the first draft is getting something down that you or others can analyze for gaps and use to estimate the finished documentation. Probably, the physical act of writing will be no more than 25% of your time. Often, it will be much less. If you’ve planned properly, and begin writing with a thorough understanding, it should almost feel like an afterthought.
  • Don’t mix writing and editing: Writing is a creative process, editing a critical one. If you try to mix the two, you will probably do both poorly. You may also find yourself freezing up and being unable to write because your self-criticism is interfering with your ability to write.
  • Make sure editing is part of your schedule: Editing should not be a last-minute effort. Instead, accept it as an important part of your schedule. Expect it to fill 10-20% of your time.
  • Editing is about structure as well as words: Editing is not just about spelling or correcting grammar. It’s just as much about the structure of the work.
  • Get second and third opinions: When you have just finished writing, you are probably unable to judge your work effectively. Get other people to review your work in as much detail as possible. If you can’t get other people to review, put the manuscript aside for several days. If you can’t put it aside, print it out, or take a break before returning to it.
  • Expect revisions: Based on my experience teaching first-year composition at university, I can say that the average person takes 3 to 4 drafts to produce their best work. You may be naturally talented or reduce that number with practice, but don’t count on either until you have some experience. Make sure you budget the time. You’ll know your efforts are succeeding if each draft becomes quicker to write than the last. If that doesn’t happen – especially if you have to keep reinventing the structure or making major additions – then something is probably seriously wrong.

With any given piece of writing, you may not be able to follow each of these pieces of advice. Deadlines in particular may keep you from giving each of these points the attention it needs. But keep all these points in mind, and you will be more likely to write documentation that people actually use, instead of an afterthought to the software that is never used.

Read Full Post »

(Note: I wrote the following article six years ago for the Techwr-l site. For a long while, it was my most-requested article. Recently, I noticed that a site makeover had taken the article down. By now, it may be posted again, but I thought I’d reprint it here, although it’s very long by the standards of blog postings. Although addressed to technical writers, almost everything in it applies to writers and writing in general.)

Most technical writers are confused about grammar. On any day on the TECHWR-L list, basic questions are asked: “Is ‘User’s Guide’ or ‘Users’ Guide’ correct? Maybe ‘Users Guide?'” “Should ‘web’ be capitalized when used to refer to the World Wide Web?” “Which is right: ‘A FAQ’ or ‘an FAQ?'” Many of these questions become the major thread on the list for a day or two, generating far more debate than they’re worth.

The confusion isn’t so much about the grammatical points themselves. It’s about the nature of grammar in general. Apparently, many tech writers do not see grammar as a set of conventions to help them write clearly. Instead, to judge by the wording of the questions and responses, they see grammar as a set of unchanging rules that can provide definitive answers in every situation.

Some are afraid to break the rules of grammar and risk being denounced as incompetent. A handful, smugly sure that they know the rules, use their rote learning of the rules as an ad hominem attack, nitpicking at typos and small errors to discredit writers without disproving their viewpoints. Most sit in the middle, haunted by the ghosts of childhood grammar classes until they can hardly tell on their own authority whether they are writing well or not. But underlying all these reactions is an attitude that rules are rules, and cannot be broken.

This attitude is usually known as a prescriptive approach to grammar. It assumes that grammar exists mainly to tell us how to speak or write properly–not well. It is an attitude that tech writers share with almost everybody in the English-speaking world. It is a form of conditioning that begins in kindergarten and continues through high school and even into college and university. It undermines nearly everyone’s confidence in their ability to communicate, especially on paper. Yet it is especially harmful to professional writers for at least three reasons:

  • It grotesquely exaggerates the importance of grammar. Although competence in grammar is sometimes proof of other writing skills, it stresses presentation over content. Even worse, it stresses correctness over precision, conciseness, or clarity.
  • It binds writers to viewpoints that are not only arbitrary and obsolete, but, in some cases, far from their own opinions.
  • It undermines writers’ confidence and their ability to make decisions about how to communicate effectively.

Why are we burdened by this attitude? How does it affect us? The easiest way to answer these questions is to look at the origins of the prescriptive attitude and the alternatives to it. Only then can we begin to grasp how we can live without it.

The Rise of Prescriptive Grammar

Prescriptive grammars are the products of the Enlightenment. Earlier grammars such as William Bullokar’s in 1586 and Ben Jonson’s posthumous one were also prescriptive, but intended for language students. The first prescriptive pronouncements for native English speakers date to the Seventeenth and Eighteenth Centuries.

This is the start of the great era of describing and recording. In every subject from biology to Egyptology, educated men struggled to write accounts so thorough that no other one would ever be needed. It was also a time that looked both backwards and forwards to Classical Rome. It looked back in the sense that ancient Rome was seen as the height of civilization. It looked forward in two senses: England perceived itself as a second Rome, and Latin was the language of international science.

The first prescriptive comments were very much in the spirit of their times. Their compilers hoped for definitive grammars that would fix the form of English once and for all, and provide a source for settling grammatical disputes. Since Latin was the language of the world’s most sophisticated civilization, the closer this fixed form of English was to Latin, the more sophisticated English would be. Moreover, given the belief that culture had been degenerating since the fall of Rome, most grammarians automatically equated any change in the language with decay and degeneration.

John Dryden, the poet and playwright, was among the first to make prescriptive pronouncements. His main contribution to prescriptive grammar was to suggest that prepositions should never end a sentence. The reasons for this prescription are that Latin sentences rarely end in prepositions, and that the word “preposition” clearly indicates that this part of speech should go before (“pre”) the noun it is associated with. Dryden criticized Shakespeare and Jonson for not following this rule, and scrupulously edited his own writings until they conformed to it.

Dryden also hoped to fix the form of English, which was still rapidly changing. Together with the diarist John Evelyn and other authors, Dryden called for an English version of l’Académie Française–the body of scholars and writers that oversees the creation of the official French dictionary and is the arbiter of correct French. Dryden’s plea for an English Academy was echoed later by Daniel Defoe and Jonathan Swift. The idea was being seriously considered by Queen Anne when she died in 1714. But with the accession of the German-speaking George I, the question shifted from the monarch helping to purify English to teaching the monarch English, and the idea was dropped.

Deprived of royal assistance, English men of letters did their best to improve the language on their own. In the mid-Eighteenth Century, a number of grammars were published, as well as Samuel Johnson’s famous dictionary, whose Preface spells out its prescriptive purposes with both succinctness and dry wit. All these works were heavily prescriptive, although Joseph Priestley did include some comments about the importance of common usage in deciding what was proper.

The most influential of these grammars was Robert Lowth’s “Short Introduction to English Grammar,” published in 1761. Criticizing almost every major English writer from Shakespeare to Pope, Lowth made most of the prescriptive statements that people still follow today. His prescriptions include:

  • Two negatives make a positive, except in constructions such as “No, not even if you paid me.”
  • Never split an infinitive.
  • Never end a sentence in a preposition.
  • “Ain’t” is unacceptable in formal English.

These ideas have several sources: An attempt to model English on Latin (and therefore to arrive at a universal grammar that underlay all languages), a desire to be scientific, and Lowth’s personal preferences. For instance, Lowth would not accept split infinitives because Latin infinitives are one word and cannot be split. Similarly, two negatives make a positive because they do so in mathematics. The ban on “ain’t,” though, seems entirely Lowth’s idiosyncrasy. None of these prescriptions, however, took any notice of how people actually spoke or wrote. For example, despite Lowth, “ain’t” continued to be used by Queen Victoria and the upper classes until the start of the Twentieth Century.

Although Lowth later became Bishop of London, his ideas on proper usage would probably have remained obscure if they had not been borrowed for classroom use. In 1781, Charles Coote wrote a textbook grammar based on Lowth’s Grammar, adding his own preference for “he” and “his” as the indefinite personal pronoun (as in “everyone is entitled to his opinion”). A few years later, Lindley Murray borrowed from Lowth to write a series of textbooks for a girls’ school. Murray’s textbooks became so popular that they quickly became the standard English texts in American schools throughout much of the Nineteenth Century. With modifications, Murray’s “English Grammar” and Lowth’s Grammar have been the basis for textbook grammars ever since. With their unyielding rules, these textbooks have given at least eight generations of English-speakers the prescriptive attitude that inhibits people today.

The Biases of Prescription

In their language and their purposes, prescriptive grammars make a strong claim to objectivity. A. Lane was typical of the first grammarians when he wrote in 1700 that the purpose of grammar was to teach people to speak and write “according to the unalterable Rules of right Reason.” Similarly, in writing his dictionary, Samuel Johnson humorously referred to himself as a “slave of Science.” These claims are still echoed when modern defenders of the prescriptive attitude assume that the rules of grammar are value-free. Yet a closer look at prescriptive grammar reveals distinct biases. Some of these were openly admitted by the first grammarians. The problem is that some of these biases are no longer current. Others are demonstrably false.

The early grammarians’ claims to be scientific or precise concealed a strong personal bias. This bias was by no means a conspiracy–it was simply natural self-expression. Grammarians were highly educated, or the subject would hardly come to their attention at all. Since education was a privilege, they were either well-to-do or extremely talented. Since the higher levels of education were barred to women, they were male. And, as might be expected from the subject, they were all intensely literate.

While this background made the first grammarians supremely qualified for literary and scholarly work, it also carried a certain arrogance. The first grammarians showed scant interest in how English was used outside their own class and social circles. All of them assumed an unwarranted authority in their subject, appointing themselves as the arbiters of the language without any justification except their willingness to serve. Robert Lowth, for example, did not hesitate to include his own idiosyncrasies as grammatical rules. In much the same way, Samuel Johnson saw it as his “duty” to purify English usage through his dictionary. This same attitude continues in the prescriptive attitude today.

Moreover, the first grammars were a direct reflection of their authors’ biases and education. The problem is, the education of the Restoration and Enlightenment included assumptions that we would question today. For example:

  • Rome is the model for all things: In fact, Latin is a poor model for English. Although both languages are Indo-European, the relation is indirect. Even the heavy influence of French, a Latin-derived language, via the Norman Conquest, does not make English’s structure much closer to Latin. At its core, English is a Germanic language, and if the first grammarians had used Dutch, Swedish, or German as a model, they would have had no precedent for objecting to split infinitives or double negatives. However, the Germanic origins of English were poorly understood in Britain during the Seventeenth and Eighteenth Centuries. At any rate, the first grammarians would probably have considered Germanic models too crude to replace the polished perfection of Latin.
  • Writing is the basis for English grammar: Like other grammarians, Samuel Johnson assumes that the principles of grammar should be taken from the written form of the language. Since Johnson was a writer himself, this assumption is understandable. However, modern linguistics regards written usage as simply one of many types of English, none of which is more valid in the abstract than any of the others. If anything, the spoken language is usually given greater priority today, partly because it tends to be the source of innovation, and partly because it reveals how people use the language when they are not trying to write correctly or formally.
  • Change is degeneration: When the narrator of “Gulliver’s Travels” visits Laputa, he is shown a vision of the Senate in Ancient Rome, then the modern English Parliament. The Senators look like demigods and heroes, the Members of Parliament scoundrels and ruffians. In writing this passage, Jonathan Swift reflects a widespread belief in his time that times are getting continually worse. This fallacy is the exact opposite of the modern one of equating all change with progress.

Applied to the English language, Swift’s view has little basis in fact. Admittedly, English has simplified itself over the centuries by dropping most noun declensions and verb conjugations, but that has not made it less useful for communication. Nor, despite the delicate shudder of prescriptive grammarians, does the shift in meaning of “cute” from “clever” to “attractive” in the early Twentieth Century or of “gay” from “happy” to “homosexual” in mid-Century weaken a language with as many synonyms as English. When clumsy or unclear constructions do arise (such as the use of “not” at the end of a sentence), their impracticality generally ensures that they are brief fads.

At any rate, what is degenerate and what is progressive is often a matter of opinion. To J.R.R. Tolkien, the Anglo-Saxon scholar and author of The Lord of the Rings, the hundreds of words added by Shakespeare and his contemporaries are a corruption that ruined English forever. Yet to most scholars, these coinages are an expression of a fertile inventiveness and part of the greatest literary era ever known.

  • London English is standard English: Like most English writers, the grammarians and their publishers centered around London. The language of the upper classes in the Home Counties had already become Standard English by the Fifteenth Century, which is why many people have heard of Chaucer and few people have heard of (much less read) “Sir Gawain and the Green Knight,” a brilliant poem by one of Chaucer’s contemporaries, written in an obscure North Country dialect. That is also why Chaucer and Shakespeare poke fun at other dialects–they already had the idea that some forms of English were better than others. In basing their work on the English spoken around London, the grammarians were simply working in a long-established tradition.

Far from seeing their biases as contradicting their claims to scientific objectivity, the grammarians openly proclaimed their goal of saving English from itself. In fact, by the standards of the time, proclaiming their educational and cultural assumptions was a means of asserting their ability to be objective on the matter.

Of all the early grammarians, the one most keenly aware that prescriptive grammars were biased was Noah Webster, the writer of the first American dictionary. Working on his dictionary between 1801 and 1828, Webster was not content simply to record how words were used. Instead, he was also concerned with producing a distinctly American language. To this end, Webster not only included words such as “skunk” and “squash” in his dictionary, but also introduced American spellings, such as “center” instead of “centre.” In addition, he encouraged the spread of a uniquely American pronunciation by consistently placing the stress on the first syllable of the word. In other words, Webster attempted to deliberately manipulate the use of English in the United States for patriotic reasons. Whatever anyone thinks of those reasons, Webster’s efforts are one of the best proofs that prescriptive grammars are not as value-free as many people imagine.

The same is true today in the debate over whether “they” can be used as the indefinite pronoun instead of “he/his” (“Everyone is entitled to their opinion”). On the one hand, traditionalists who insist on “he/his” are perpetuating the male bias of the first grammarians. On the other hand, reformers who favor “they” are trying to remake the language in their own world view. It is not a question of objectivity on either side. It is simply a question of which world view will prevail.

The Problem of Change

But the major problem with the prescriptive attitude is that it resists the fact that languages are continually changing. If a community newspaper constantly includes editorials urging people to drink less, the amount of concern suggests a local drinking problem. In the same way, the constantly expressed wish to set standards for the language reflects the massive changes in English at the time that the first grammarians worked. Although the rate of change was probably slower than that of the Fifteenth and Sixteenth Centuries, English was still changing much faster between 1650 and 1800 than it does today.

Some of the changes that occurred or were completed in this period include:

  • The disappearance of dozens of Old English words like “bairn,” “kirk,” and “gang” (to go). Many of these words survived in Northern and Scottish dialects for another century, but became non-standard in written English.
  • The addition of dozens of new words. Some were deliberately coined by scientists, such as “atom.” Some were borrowed from the regions that England was conquering, such as “moccasin” or “thug.”
  • The replacement of “thou” and “ye” with “you” in the second person. These older forms survived only in poetry.
  • The standard plural became “s” or “es.” Only a few exceptions such as “oxen” survived.
  • The loss of all personal endings except the third person singular in most verbs (“I read,” “he reads”). Some of the older forms ending in “th” survived longer in poetry.
  • The loss of inflection in most adjectives.
  • The regularization of past tenses to “ed” or, occasionally, “t” (“dreamed” or “dreamt”).

Many of these changes are easy to overlook today because popular editions of texts from this period routinely modernize the spelling. However, the sheer number of changes makes clear that the early grammarians were fighting a rearguard action. If they have helped to slow the rate of change in the last two centuries, sometimes they have also accelerated it; the loss of many Northern words, for example, is probably partly due to the standardization on Home County English. Yet, despite these efforts, English continues to change as the need arises. Many of these changes come from the least educated parts of society–those least likely to be influenced by the prescriptive attitude.

Today, all prescriptive grammarians can do is resist changes as long as possible before accepting them. This constant retreat means that most prescriptive grammars are usually a couple of decades behind the way that the language is actually used in speech and contemporary publications.

The Descriptive Alternative

While prescriptive grammars were finding their way into the schools, an alternative approach to the study of language was being developed by linguists. Imitating the naturalists of the Eighteenth Century, linguists began to observe the pronunciation, vocabulary, grammars, and variations of languages, and began cataloging them in ways that suggested how they related to each other. In 1786, Sir William Jones established that most of the languages of India and Europe were related to each other. By 1848, Jacob Grimm, one of the famous Brothers Grimm of fairy tale fame, had detailed in his History of the German Language how English, Dutch, German, and the Scandinavian languages had descended from languages like Gothic. Content to observe and speculate, the early linguists developed what is now called the descriptive approach to grammar.

The descriptive approach avoids most of the distractions of prescriptive grammars. Today, most linguists would probably accept the following statements:

  • Change is a given. In fact, a working definition for linguistics is the study of how languages change. Linguists can offer a snapshot of how a language is used, but that snapshot is valid for only a particular place and time.
  • No value judgement should be placed on changes to a language. They happen, regardless of whether anyone approves of them or not.
  • The fact that one language is descended from a second language does not make the first language inferior. Nor does it mean that the first language must be modelled on the second.
  • Linguistics does not claim any special authority, beyond that of accurate observation or verifiable theory. Claims are open to discussion and require validation before being accepted.
  • No form of a language is given special status over another. Regardless of who speaks a variation of a language, where it is spoken, or whether it is oral or written, all variations are simply topics to be observed. For example, when Alan Ross and Nancy Mitford coined the phrases “U” and “non-U” for the differences between upper and middle class vocabularies in Britain, they did not mean to suggest that one should be preferred (although others have suggested that deliberately using a U vocabulary might be a way to be promoted).
  • Variations of a language may be more or less suitable in different contexts, but none are right or wrong. The fact that you might speak more formally in a job interview than at a night club does not mean the language of a job interview is proper English, or that the language of the night club is not.
  • Proper usage is defined by whatever the users of the language generally accept as normal.
  • A language is not neutral. It reflects the concerns and values of its speakers. For example, the fact that every few years North American teenagers develop new synonyms for drinking and sex reflects teenagers’ immense preoccupation with the subjects. The fact that they also develop new synonyms for “slut” reflects their sexual morality as applied to girls. A similar viewpoint is known in psychology as the Sapir-Whorf hypothesis.

To those conditioned by prescriptive grammars, many of these statements are unsettling. For instance, when I suggested on the TECHWR-L list that proper usage was determined by common usage, one list member went so far as to call the view “libertarian.”

Actually, these statements are simply realistic. Despite two centuries of classroom conditioning, average users of English have never been overly concerned about the pronouncements of prescriptive grammar. Instead, people continue to use English in whatever ways are most convenient. If the prescriptive usage is not widely used in practice, it may even sound odd, even to educated people. For example, to an ear attuned to the spoken language, the lack of contractions in an academic essay or business plan may sound stilted. In fact, an entire written vocabulary exists that is almost never used in speaking. The descriptive approach simply acknowledges what has always been the case. In doing so, it frees users from the contortions of prescriptive grammar, allowing them to focus on communication–where their focus should have been all along.

Professional Writers and Grammar

Writing well, as George Orwell observes in “Politics and the English Language,” “has nothing to do with correct grammar and syntax.” If it did, then two centuries of prescriptive grammar in the classroom should have resulted in higher standards of writing. Yet there is no evidence that the language is used more skillfully in 2001 than in 1750. The truth is that prescriptive grammar and effective use of English have almost no connection. A passage can meet the highest prescriptive standards and still convey little if its thoughts are not clearly expressed or organized. Conversely, a passage can have several grammatical mistakes per line and still be comprehensible and informative. Prescriptive grammars are interesting as a first attempt to approach the subject of language, but today they are as useless to writers as they are to linguists. So long as writers have a basic competence in English, prescriptive grammar is largely a distraction that keeps them from focusing on the needs of their work.

By abandoning prescriptive grammar, writers shift the responsibility for their work to themselves. In practice, this shift means making choices that are not right or wrong in the abstract, but, rather, useful in a particular context or purpose. For example, instead of agonizing over whether “User’s Guide” or “Users Guide” is correct, writers can choose whichever suits the situation. They can even flip a coin, if they have no better means of deciding. In such cases, which choice is made is less important than using it consistently throughout the document to avoid confusion. Even then, writers may decide to be inconsistent if they have a good reason for being so.

Similarly, the decision whether to use a particular word or phrase is no longer a matter of referring to a standard dictionary or somebody else’s style guide. Instead, writers have to fall back on the basics: Will the intended audience understand the word? Is it the most exact word for the circumstances? Does it convey the image that the company wants to present? In the same way, while the need for clarity and a factual tone makes complete sentences and unemotional words useful choices in typical manuals, in a product brochure, sentence fragments and words heavy with connotation are more common. Writers may still want to summarize their decisions in a corporate style guide, but the style guide will be based on their own considerations, not the rules that someone else tells them to follow.

None of which is revolutionary–except that, under the prescriptive attitude, irrelevant purposes are often inflated until they become more important than a writer’s practical concerns.

That is not to say that taking a descriptive approach to grammar means writing in the latest slang. Nothing could date a document faster, or be more intrusive in a technical manual. Nor does it mean abandoning technical vocabularies that are known to the audience or that make explanations easier. If anything, a descriptive approach demands a much greater awareness of the language than a prescriptive one. Instead of learning the single correct version of the language, writers who take a descriptive approach need to be aware–probably through constant reading–not only of dozens of different versions, but of how each version is changing.

If necessary, writers can use descriptive grammars such as journalistic style guides to help them. On the whole, however, the descriptive approach leaves writers where they should have been all along, deciding for themselves what helps their documents to achieve their purposes. The only difference is that, under the descriptive approach, they are fully aware of their situation. If they say anything unclear or stupid, they can no longer hide behind tradition.

Prescriptive grammar is useful for teaching English as a second language, but it has little value for the practicing writer. Clinging to it may provide emotional security, but only at the expense of making writing harder than it needs to be. The culture-wide devotion to it will not be changed in a moment. But conscientious writers can at least change their own habits, and make life easier for themselves. And, from time to time, they can even laugh some worn-out, crippling concept — such as not ending a sentence in a preposition, or not splitting an infinitive — into the recycle bin where it belongs.

(With apologies to George Orwell.)

Read Full Post »

As a communications and marketing consultant, I worked with over forty companies in eight years. They ranged from multi-nationals like IBM and Intel to startups and single-person operations, and from the reputable (if bureaucratic) to the fly-by-night. Three times, I was stiffed – fortunately, for small amounts. Often, I was frustrated by lack of challenge or mismanagement that I was sure I could put right if I were in charge (and once or twice, I might have been right). But only once have I quit a job after three days of work, and that position remains, indisputably, the worst job I have ever taken.

Even my first job as a busboy, which I quit after the manager accused me of spilling water on a customer when I had been in the kitchen all shift, can’t compare. As for the summer between university semesters that I spent drilling holes in wooden rods, I may never have learned what either holes or rods were for, but that was only for three months, and enduring acute boredom isn’t too bad when you earn union wages.

I took my all-time worst job in the weeks after I left Stormix Technologies, where I had my first experience as management (and awful I was, but that’s another story). I had quit in the sudden realization that the company could fold at any time (and six months later it did), and repented my rashness at leisure. When a woman I knew from a consulting agency I sometimes worked through told me that she was now doing human resources elsewhere, I jumped at the opportunity to start at her new company. Never mind that it was a return to technical writing after playing at management; it was a job.

The company was involved in online gaming, my acquaintance told me, leading me to believe that it was developing a world for role playing. Never mind that gaming seemed frivolous after working with free software; I told myself to be realistic and take the opportunity that was offered.

The first morning, I learned that my acquaintance had misrepresented the company to me. It was not involved in role playing, but in developing casino games. In fact, the company had recently been the victim of a police raid for its activities, and had just changed its name. Otherwise, I would have recognized the name, because, in those days, I kept close track of all the high-tech companies in my area. I was dismayed, but I knew enough about the police to know that a raid didn’t necessarily mean any wrongdoing, so I fought down my misgivings.

The second day, I noticed a door that opened into a room larger than any I had seen in the office. It had well over a dozen desks — not workstations. If all those desks were occupied, it represented over a third of the company, yet no one was there.

“That used to be for our lawyers,” the guy at the next workstation told me when I asked about the room. “But they all moved to the Caymans after the raid.”

My dismay deepened. A company with almost as many lawyers as coders could only be a patent shark or one with serious legal difficulties, especially since they were all in the Caymans now. In fact, the company’s head office had relocated to the Caymans, where online gambling is legal in a way that it isn’t in the United States and Canada.

My resolve to be realistic slipped another notch or two.

I went home and spent a fitful night trying to bat my conscience down. Did I want to be part of such a company? I resolved to wait a week before making a decision. But, from the way I dragged myself from the SkyTrain and lingered in the nearest Starbucks on the third day, I knew that I was half-out the door already.

The final blow descended when my manager gave me a tour of the rest of the office. It was on several levels of an old building in Gastown, and I had only seen one level of it so far.

The tour ended in the kitchen. “If you see the light on next door,” the manager said, pointing to a small red-tinged bulb above the door frame, “keep quiet. They’re filming.”

“Filming?”

“It’s what used to be our adult movie division. They’re a separate company now, but we still share the kitchen space.”

As he spoke, I noticed some long bathrobes draped over the chairs around the table.

Back at my desk, I tried to focus on my work, but I couldn’t. I was never much for gambling or adult movies (which, more often than not, are actually adolescent), but I’m not a prude, either. I’d always taken a more or less feminist view of pornography as exploitation, but did I have a right to judge people who were shooting X-rated movies of their own free will? In the past, I had met all sorts of people who wouldn’t exactly fit into the average suburb, so why was I so depressed by the situation I found myself in?

I thought of a photo that had circulated in the local newspapers when the company had been raided, showing the company’s employees standing around on the sidewalk. I could be one of them, I realized.

Stopping all pretense of work, I decided that part of the problem was that, while I might tolerate the company’s business past and present, I didn’t want to be active in it. But an even larger source of my discomfort was the conviction that I had been duped. If the company misrepresented itself to attract employees, what else might it do in the name of business? I thought I had enough evidence to make a reliable guess.

Anyway, the only thing worse than being manipulated would be an attempt to deny the fact. My situation was too depressing for words.

At 11:30, I marched into the manager’s office and told him that I was suffering from post-traumatic stress syndrome, and had been too optimistic in thinking that I could work regular hours. He was sympathetic, and even told me to come back if I got better, but there was not much chance of that.

A month later, I received a cheque for the days I had worked. I referred to it as “my avails of vice,” and briefly considered not cashing it. By then, I had started at my highest-paid and most interesting job to date, and I didn’t need the money. But, in the end, I told myself that I had earned it, and rationalized that I deserved some compensation for what I had been through.

I still don’t know whether that line of thought was a copout. But I was glad to be out of that job before it made a gap in my resume that I would have to explain. And, in the end, I benefited from my decision to quit, because it raised me in the estimation of my in-laws. Yet, brief as the experience was, I’ll never forget the mixture of anger and chagrin with which I descended into the seamy side of high-tech.

Read Full Post »

Articles about dealing with a bad boss always seem to center on enduring the situation. They tell you to avoid being judgmental, to understand a boss’ situation, to find ways to relieve your stress – and to leave quitting or complaining as last resorts. The assumptions are always that you are powerless, and that you are the one who has to change. However, a few years ago, I discovered that another alternative exists. Instead of finding ways to cope, you can sometimes train a bad boss into better behavior, even if “better” only means leaving you alone.

The setting was a small high-tech company where I was working as a technical and marketing writer, attached – who knows why – to the testing department. The new manager was a small, fussy man, with a great drive to conform to corporate culture to further his ambition. On being hired, he went out to purchase dozens of books on testing and management to decorate his office, all cataloged with stickers according to his own system, and none of which I ever saw him read. His office was soon decorated with motivational posters, corporate toys, and the most elaborately color-coded spreadsheet printouts ever. The result was so stereotypically perfect that, when a film maker wanted the perfect corporate setting, she chose his office.

As I might have predicted from his mannerisms and office, this new manager loved being in control. He was always insisting on progress reports (which he had the right to expect), and trying to change priorities developed in cooperation with other departments (which he had no business doing). Despite the fact that I had defined my position, he started trying to micro-manage me along with the testers. He also showed an alarming tendency to hold meetings, one on one or as a department several times a day, frequently at ten minutes before quitting time.

The company officers were either clueless or frequently absent, so complaining to them was out of the question. Nor did the manager himself have enough self-reflection that he would have welcomed advice or criticism. At the same time, department morale was plummeting, and the manager was seriously getting in the way of meeting deadlines, so something had to be done.

No one else would do more than make jokes at the manager’s expense, and several seemed worried about losing their jobs. As a consultant, I had seen jobs come and go, so I was less worried and had less to lose.

If the department as a whole wouldn’t act, I decided, it was time for me to show some initiative and lead a revolution of one. But it would be a polite revolution, with never a raised voice – just a calm and firm insistence.

Instead of waiting for the manager to assign priorities, I began telling him what the priorities would be, citing interactions with programming leads and other managers. Since he didn’t know my job, or where it fit into the company’s release schedule, he was more than glad to let me take over. For my part, I had been largely setting my own priorities since I started at the company, so I wasn’t taking on any extra work. Once I quickly established that I knew what I was doing, and would meet my self-imposed schedule, he was content to spend his time producing elaborate color-coded spreadsheets of my schedule for his own satisfaction while I returned to being productive.

Avoiding his meetings was a little harder. Fortunately, my work frequently involved making appointments with other members of the company, so I got into the habit of scheduling these appointments around the same time as his meetings, giving me an indisputable excuse to leave. The only meeting that I didn’t try to evade was the weekly departmental planning meeting, which I judged legitimate and occasionally useful for my work.

The meetings just before quitting time were the hardest to get around. But, in the end, I hit upon a compromise of attending them until ten minutes after the end of my work day. Then I would plead an excuse, such as a need to meet my wife or to go grocery shopping, and exit. If necessary, I was prepared to point out that, as a consultant, I got an hourly rate, so he should seriously consider whether he was making good use of company funds by having me bill an extra hour for a meeting that could just as easily be held during normal business hours, but that fallback position proved unnecessary. After three or four weeks, he was conditioned to schedule any meetings with me at other times.

Throughout all these guerrilla tactics, I was careful never to have a direct confrontation with him. I stayed polite, and even joked with him, a tactic that furthered the larger campaign by encouraging him to think of me as an equal rather than a subordinate.

Outwardly, I was a model employee, showing commendable initiative. It was only inwardly that I was undermining his authority.

I do admit that I wished I could tell someone what I was doing. I became fond of whistling “The Black Freighter” from the Threepenny Opera, but, fortunately, no one else in the department was a Bertolt Brecht fan.

In the end, I gained what I had wanted all along: The ability to work unmolested by meaningless interruptions. And when the manager was fired after a few months for incompetence, I felt my subversiveness fully vindicated.

Some people are horrified when I tell this story. In effect, they say that I stepped over the line and didn’t show myself a loyal employee. But, to say the least, I would beg to differ.

No employee is being paid to obey orders. They’re paid for results, and this manager was seriously interfering with those results. While I admit that a large part of my motivation was my own peace of mind, what I did allowed me to better accomplish what I was paid to do. Besides, no job is worth unnecessary stress. For these reasons, I would have no hesitation in doing the same again.

Of course, being a consultant rather than a regular employee, I had the advantage of greater independence. Also, I knew my job far better than the new manager did. But my experience convinces me that most so-called job experts are leaving out some important advice for dealing with problem bosses.

Sometimes, you don’t have to cope. Sometimes, so long as you stay polite and show some initiative, you can survive bad bosses by training them out of their bad behavior.

Read Full Post »

A couple of weeks ago, I wrote an article about the closure of Progeny, a company for which I used to work. By temperament, I lack the patience for nostalgia. Besides, experiences such as my high school reunion last fall have convinced me that such efforts are largely pointless. However, writing the article required contacting old acquaintances, and I became introspective, writing a reminiscence on my Linux Journal blog about working for Progeny and a sequel about Stormix Technologies, the company I worked at before Progeny.

Both, I realized, were part of the path to my current work as a journalist. After writing those articles, it strikes me more strongly than ever that my work history amounts to a circling around my core skills until I actually had the courage to use them.

In school and university, I never worried much about what I would do with my education. I have always been a firm believer in the value of learning for its own sake, and I was too interested in my classes to think much about where I was going. I always had vague plans about being a writer, but I can’t say that I took many concrete steps towards that goal – certainly not any consistent ones, anyway. As a result, I have a work history that might politely be described as chequered. Less polite observers would call it a thing of rags and patches that they wouldn’t even wear to wash the car on Sundays.

After I finished my bachelor’s degree, I literally had no idea what I wanted to do. I had taken a double major, straight out of high school with no more than a semester’s break, and I was mentally exhausted. I took a minimum wage job in a bookstore to pay my share of the expenses, but it was a nightmare – not just because I endured a Christmas in which every second album played in the mall was the Smurfs’, but because I had never previously been exposed to the idea of books as commodities. Nor was I cut out for dealing with customers arriving one minute after closing or saying things like, “I can’t remember the title or the author, and I can’t explain what the book is about, but it’s got a green cover.”

Seeking escape, I entered graduate school. After all, what else does someone with an English degree do except teach? Besides, it was another few years in which I could put off the decision, and, meanwhile, I was at least talking about books as books.

I had given poetry readings and lectures at my old high school, thanks to the courtesy of Allan Chalmers, my creative writing teacher. However, it wasn’t until I started working as a teaching assistant during my degree that I realized that I had a talent for teaching. I enjoyed sharing my enthusiasms, and genuinely felt that I was helping students, if only by teaching them enough about writing to survive the rest of their university years.

That belief was enough to propel me through my years as a grad student and for another seven as a sessional instructor. I might not have been doing much writing myself, but at least I was talking about books.

But university English departments aren’t about enjoying literature. They’re about dissecting it, and, as I settled into semi-regular work, I realized that my distaste for straitjacketing literature into the latest trendy theory was counting against me in my struggle for full-time employment. My thesis supervisor and I described ourselves as the department’s “token humanists.” And I started thinking a lot about Robert Graves’ story of his oral defence, in which one professor said that he seemed like a well-read young man, but that he had a serious problem: he liked some works better than others.

After seven years, I had squeezed the last hopes out of academia like an old lemon. I attended a presentation about technical writing, and jumped careers. It wasn’t the sort of writing that I wanted to do, but getting paid for writing sounded fine to me. No one cared, either, if I was a post-modernist or post-colonialist. The only priorities were accuracy and meeting deadlines, and since I’ve always been a first rate researcher and well organized, neither was a serious problem. Within six months, I had so much work that I was hiring sub-contractors, and was organizing major projects for international companies.

Unfortunately, I was also mind-thumpingly bored. In desperation, I learned typography so that my working life would have an element of creativity. I branched out into marketing and PR. I became ferociously devoted to learning the technologies I was documenting – not just enough to get by, the way that other writers did, but becoming an expert in them. Boredom receded, but it was still lapping at the outer edges of my mind.

That changed when I worked at Stormix and Progeny and discovered the joys of both free and open source software (FOSS) and of management responsibility. FOSS triggered my latent idealism, as well as my growing impatience with the average executive I met. When those jobs ended and I returned to being a technical writing consultant, boredom came flooding back. Only now, it was combined with an impatience at the mediocrity of those giving me orders.

With growing desperation, I realized that I didn’t care if the projects I was working on finished on time. Nor did anyone else, really. Without a prospect in sight, I quit my last consulting job and started doing FOSS journalism. And, to my surprise, it seemed to be what I had wanted to do all along. It’s already lasted three years – about two years longer than any full-time employment I’d had since I went back to grad school – and I don’t think I’ll be tiring of it in a hurry.

That revelation may have seemed like a blinding flash of the obvious. However, I don’t think I could have done it sooner. I didn’t have the experience to know what I wanted or could do. I needed the experience of teaching to understand that I needed meaning in my work, and the experience of technical writing to believe that I could make a sustained effort in my writing. And, when I discovered FOSS, I also found a cause that I wanted to write about.

Looking back, I wish that I could have hurried the experience. However, I doubt that I could. Clarity and experience take time, and, if I got to where I want to be late, at least I got there. How many people can say that?

Read Full Post »

I’m a journalist, so in an interview I’m usually the one asking the questions. Today, I experienced a role reversal when Samartha Vashishtha published an interview with me in Indus, the online magazine for the India Chapter of the Society for Technical Communication. I’ve been interviewed several times before, a couple of times for podcasts and a few more for mainstream publications needing some quick expertise about free software, but for some reason, this time I felt queasier than most.

It’s not that Vashishtha misrepresented me, or anything like that. He doesn’t make me say anything I wouldn’t have said. He conducted the interview via email, and, so far as I can see without going to the trouble of a detailed comparison, his largest change was to add exclamation marks. I don’t use many exclamation marks myself – nor many initial capitals, being a lower case sort of person – but no great matter if he did.

On the whole, he did a very professional job. The sense is there, and I even got in a plug for free software, explaining how it would help the developing world in general and India in particular to meet the industrialized world on an equal footing.

I also mentioned that free software projects are a great place for technical writers to gain experience, since most developers have little interest in documentation and generally welcome anyone willing to undertake the task.

Still, I was a little reluctant to do the interview at first. It’s been three years since I last did a technical writing assignment. In fact, it was the monotony of my last major contract that made me desperate enough to take the leap into journalism.

Besides, how could I be sure that anything I was saying was relevant? I occasionally drop by the Techwr-l mailing list to see what old acquaintances are saying, but I can’t pretend to know the trends in the industry anymore.

Even more importantly, I’ve moved a long way down my road since my last technical writing contract. A large part of the time I was a technical writer, I was facing some of the worst times of my life, so on the whole it’s an era that I’m not eager to revisit.

Maybe this uncertainty explains why I sound so stuffy and so full of opinions in the interview. I don’t normally sound that way (or, if I do, my family and friends are polite enough to pretend otherwise). But, even talking in generalities, perhaps I was circling too close to the bad years, and knew it.

Most importantly, the whole time I was writing my responses, I kept feeling distinctly unsettled. Being a journalist yet being the one interviewed was disorienting, like standing between two mirrors and seeing one of the infinite reflections starting to move independently. I don’t feel important or proud to be interviewed — just dazed at the role reversal involved.

Read Full Post »