
Archive for the ‘computers’ Category

Ordinarily, I don’t refer to commercial products when I blog. Still less do I ridicule people who appear earnest and are apparently engaged in a labor of love. However, in the case of Analogon Book, whose publicity reached me at Linux.com, I have to make an exception on both counts.

Analogon Book was invented by Byron and Jorita Lockwood, a couple who, from their picture, seem as middle-aged and middle-American as they come. Let them explain the origin of their brain-child in their own words. When they discovered computers:

The biggest complaint we had was managing the different logons and passwords. Byron does a lot of research online and used scraps of paper to manage his logons and passwords. Jorita would come along and throw them away!

During an argument over Jorita throwing away the scraps of paper she yelled out “why doesn’t someone create a book to write these down in.”

We thought it was good idea then and we still do! We looked in stores to see if this had been marketed, We could not find a book to meet this specific need, and never found the item.

So we created AnaLogon book!

Yes, that’s right: It’s a book where you can write down your passwords.

At this point, if you are not sitting back in dumbfounded amazement or giggling helplessly, let me explain a few basics about passwords and computer security.

Contrary to the Lockwoods’ apparent belief, passwords are not just a nuisance. They’re intended to ensure your privacy and to prevent the vicious-minded from stealing information like your credit card numbers.

For this reason, one thing you should never do is write down your passwords, except perhaps on a piece of paper in your safety deposit box, so that your heirs can access your files when you’re dead.

You should certainly not write them down anywhere near the computer. Most computer break-ins are not remote, but committed by people with physical access to the computer. If you keep your passwords in a clearly labeled book, you might as well send out invitations to likely crackers to have your computer raided.

What makes this quote and product so sad and so blackly humorous at the same time is that the Lockwoods seem never to have researched what they were doing. Instead, they plunged ahead with their business plan – apparently with partners who are equally ignorant – and developed their web site, in all its exclamation-point-studded glory. Apparently, they never once dreamed, as they babbled about the convenience of their brain-wave, that what they are really doing is selling an invitation to thieves.

That’s why, when I see them saying such things as, “Get one for yourself, your spouse, student, parents, and friends” or recommending the book as a gift to a teacher (especially a computer teacher, a diabolical voice inside me whispers), I find it hard to keep a straight face.

And when their site talks about their product as a solution to hard drive crashes as though they have never heard of backups, or as useful for infrequently visited sites as though they have never found the password manager in their web browser, my initial perception is reinforced once again. These good people simply have no idea.

At first, the consensus of those on the Linux.com IRC channel was that the Analogon Book must be a joke. In fact, I wrote to the publicist to ask, and was assured it was seriously meant. Nor, in the weeks since, have I seen any of the distributors treat it as a joke. Thirty years of the personal computer, and a product like this can still not only be conceived, but brought to market without any second thoughts.

But at this point, it would be cruel to explain things to the Lockwoods, I think, shaking my head sadly — and then, I’m ashamed to say, I think of the website’s solemn description of how the product works, and I’m giggling again, and hating myself for doing so.


I’m almost getting afraid to look at a newspaper or any other traditional print media. Every time I do, some writer or other seems to be belittling an Internet phenomenon such as blogging, Facebook, or Second Life. These days, such complaints seem a requirement of being a middle-aged writer, especially if you have literary aspirations. But, if so, this is one middle-aged, literary-minded writer who is sitting out the trend.

The Globe and Mail seems especially prone to this belittling. Recently, its columnists have given us the shocking revelations that most bloggers are amateurs, that Facebook friendships are shallow, and that, when people are interacting through their avatars on Second Life, they’re really at their keyboards pressing keys. Where a decade ago, traditional media seemed to have a tireless fascination with computer viruses, now they can’t stop criticizing the social aspects of the Internet.

I suppose that these writers are only playing to their audiences. After all, newspaper readers tend to be over forty, and Internet trends are generally picked up by those under thirty-five. I guess that, when you’re not supposed to understand things, putting them down makes you feel better if you’re a certain kind of person.

Also, of course, many columnists, especially those who aspire to be among the literati, see the rise of the Internet as eroding both their audiences and their chances of making a living. So, very likely, there’s not only incomprehension but a primal dose of fear behind the criticism, and that fear deserves some sympathy.

At first glance, I should sympathize with them. I’m in their age group, share something of their aspirations, and I’m cool to much of the social networking that has sprung up in recent years. Yet somehow, I don’t.

For one thing, having been on the Internet several years longer than most people, I learned long ago that communities exist for almost everyone. If you don’t care for Facebook, you can find another site where you’re comfortable. If you dislike IRC, you can find a mail forum. If you can’t find a blog that is insightful and meaningful, you probably haven’t been looking around enough, but surely the Pepys’ Diary page will satisfy the most intellectual and literary-minded person out there. So I suspect that many of those complaining are still unfamiliar enough with the technology that they don’t really know all that’s available via the Internet.

Moreover, although I ignore large chunks of the Internet, my only regret is that it didn’t develop ten or fifteen years earlier, so that I could have been a young adult when it became popular.

Despite my age, the Internet has been the making of me. It’s helped to make the fantasy and science fiction milieu that I discovered as a boy become mainstream – and if that means people are watching pseudo-profundities like Battlestar Galactica, it also means that a few are watching movies like Neil Gaiman’s Stardust or Beowulf and moving on to discover the stories and novels that really fuel the fields. It’s given me a cause worth focusing on in free software, and a job as an online journalist that has already been one of the longest-lasting of my life, and that still doesn’t bore me. Without the Internet, I just wouldn’t be the person I am today.

Nor, I suspect, would I like that alternate-universe me very much.

Having absorbed the toleration that underlies much of the Internet, I can’t help feeling that criticizing other people’s browsing habits shows a lack of manners and graciousness that is grounds for shame rather than self-righteousness. But, in my case, it would show a lack of gratitude as well.


Like many people who spend their working hours with computers, I’m often asked by friends and neighbors for help. I’m an ex-teacher, and I volunteer at the free clinics held weekly by Free Geek Vancouver, so I don’t mind; teaching is close to a reflex with me. But one thing I do mind – very much – is when I ask the person I’m helping for some information about their computers or what caused the problem and they reply, “I don’t know. I’m just a techno-peasant” or say that they leave such technical matters to their pre-teens.

What irks me is not just the little giggle or the helpless shrug that accompanies such statements, regardless of whether a man or a woman is making them. Nor is it the fact that the term is at least twenty years out of date. Instead, it’s the fact that the people who make these responses seem more proud than ashamed of their ignorance.

Why anyone would choose to boast about their ignorance is beyond me. Of course, nobody can be an all-round expert. Moreover, if you don’t mentally bark your shins against your own ignorance from time to time, you’re probably leading too shallow a life. But why boast about your shortcomings? Personally, I consider the fact that I am not fluent in another language, and know little about wines or Central European history, to be defects, and hope to correct them some day. Meanwhile, if I have to admit to my ignorance, I do so shamefacedly, and quickly change the subject.

As for computer skills, surely computers have been around so long that an average middle-class North American should know their way around one. I don’t expect them to be able to write a “Hello, world” script if threatened at gunpoint, but how could they have avoided picking up some basic system administration and hardware care?

I mean, I’m an English major with no formal background in computing whatsoever. If I can learn enough to write about computers, then surely most people can learn basic maintenance. After thirty years of the personal computer, defragging a hard drive or plugging in the cords to your computer should be as much a part of everybody’s basic skill set as cooking a meal or changing the oil in their car. Yet, as I continually find when asked for help, most people still haven’t learned these skills.

What’s worse, the implication of these reactions is that those who make them have no intention of correcting their ignorance. It doesn’t seem to be a reflection of class, an implication that they’re too important to bother themselves with details, as though they’re a high-powered CEO and I’m the janitor. Rather, it’s as if, having reached some landmark of adulthood – turning 21, perhaps, or receiving their master’s degree – they’ve decided they’ve done all the learning they need for this lifetime, and nobody can trick them into doing any more.

As someone who’s always believed in learning, I find this attitude horrifying. So far as I’m concerned, the only time you stop learning is when you die. The idea that anyone would want to anticipate this end to learning is hard for me to understand. If nothing else, what are they going to do with the next fifty or sixty years?

Just as importantly, this refusal to learn undermines the whole idea of teaching. To me, the point of teaching is to give students the skills they need to function on their own. But when people describe themselves as techno-peasants, what they’re telling me is that they have no intention of learning to function independently. They’re calling me in, not to help them learn to cope for themselves, but as a convenience that allows them to keep from learning.

And, considering that they’re asking me to do in my spare time, for free, the sort of things I do in my working hours, the request is a high-handed imposition. They’re asking me to waste my time for their convenience – frequently not just once, but over and over for the same problem.

Despite these lines of thought, I almost never turn down the requests for help. Some people are making a genuine effort to learn, and there’s always a chance that the rest will learn despite themselves. Yet I wonder if any of them guess that I think less of them once I understand that the only thing they’re willing to learn is how to excuse their own helplessness.


Today, I received the following e-mail. At the sender’s request, I have removed any personal details:

I was wondering if you had any advice for me about how to perform some marketing/pr for my Linux [project]. I’ve started doing interviews with developers and I have created a community news site.

But is there any way I could possibly get [my project] mentioned in a magazine like Linux Journal? Is there any free advertising I could take advantage of on certain web sites? I thought you may have some ideas for me because you have experience with this kind of thing. Any help you could provide me would be appreciated.

I generally receive about 3-4 requests of this sort a year, so I decided to post my reply here, so I can refer others to it:

You’re not likely to find free advertising on sites that will do you any good, so your best bet is to try to get on the various sites as a contributor. Linux.com only takes original material for its main features, but it does have the NewsVac items, the three or four line link summaries on the right of the page that are very popular. And, of course, sites like Slashdot, Digg, and Linux Today are all about links to already published material.

If you have a solid piece of news — which for a piece of free software usually means new releases and unique features — at Linux.com you can pitch a story and write it yourself. However, you’ll be asked to include a disclaimer that explains your connection with your subject matter, and the article will be rejected if you are being a fanboy. That means you can’t review your own distro, but you might be able to do a tutorial on a distribution’s packaging system, for instance.

Alternatively, you can send news releases in the hopes of convincing either an editor or a writer to cover your news. However, don’t be pushy. Submitting a news release once is enough, and popping back several times to ask if it was received or whether anyone is interested will probably only guarantee that you’ll annoy people so that they won’t cover your news no matter how big it is.

The ideal is to build up an ongoing relationship with a few writers, in which you give them stories to write about — we’re always looking — and they give you the coverage you want when you have news that readers might want to hear.

Of course, you open yourself up to negative comments if the software deserves them, but that’s the chance you have to take. However, for the most part, both commercial companies and large community projects find the risk well worth taking. It’s not as though any of the regular writers deliberately sits down to review with a determination to be negative (although, conversely, they don’t set out to praise, either: we’re not just fans).

This process doesn’t happen overnight, so be patient. But, in the long run, you should get some of the publicity you seek.

I don’t know whether this information is useful to others. To me, it seems that I’m saying the obvious, but part of that reaction is undoubtedly due to the fact that I deal with these things daily. Perhaps to others, these thoughts aren’t obvious, so I’m hoping that someone will find them useful.


Long ago, I lost any queasiness about the command line. I’m not one of those who think it’s the only way to interact with their computers, but it’s a rare day that I don’t use it three or four times on my GNU/Linux system. No big deal – it’s just the easiest way to do some administration tasks. Yet I’m very much aware that my nonchalance is a minority reaction. To average users, the suggestion that they use the command line – or the shell, or the terminal, or whatever else you want to call it — is only slightly less welcome than the suggestion that they go out and deliberately contract AIDS. It’s a reaction that seems compounded of equal parts fear of the unknown, poor previous experiences, a terror of the arcane, and a wish for instant gratification.

Those of us who regularly try two or three operating systems every month can easily forget how habit-bound most computer users are. The early days of the personal computers, when users were explorers of new territory, are long gone. Now, the permanent settlers have moved in. The average computer user is no longer interested in exploration, but in getting their daily tasks done with as little effort as possible. For many, changing word processors is a large step, let alone changing interfaces. And both Windows and OS X encourage this over-cautious clinging to the familiar by hiding the command line away and promoting the idea that everything you need to do you can do from the desktop. The truth, of course, is that you can almost always do less from a desktop application than its command line equivalent, but the average user has no experience that would help them understand that.

Moreover, those who have taken the step of entering cmd into the Run command on the Windows menu have not found the experience a pleasant one. DOS, which remains the command line that is most familiar to people, is an extremely poor example of its kind. Unlike BASH, the most common GNU/Linux command line, DOS has only a limited set of commands and options. It has no history that lasts between sessions. Even the act of navigating from one directory to the next is complicated by the fact that it views each partition and drive as a separate entity, rather than as part of a general structure. Add such shortcomings to the ugly, mostly unconfigurable window allotted to DOS in recent versions of Windows, and it’s no wonder that DOS causes something close to post-traumatic stress syndrome in average users. And, not having seen a better command line interface, most people naturally assume that BASH or any other alternative is just as stressful.
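For anyone who has only ever seen DOS, a small BASH session may show what the fuss is about. The file names below are invented for the example, and nothing here touches the system:

```shell
# One unified filesystem tree: a mounted drive is just another
# directory, so there are no drive letters to juggle.
cd /tmp && pwd                  # prints /tmp

# Brace expansion generates a family of related names in one stroke.
echo report_{jan,feb,mar}.txt   # report_jan.txt report_feb.txt report_mar.txt

# And history persists between sessions: BASH saves past commands to
# a file (~/.bash_history by default) instead of forgetting them the
# moment the window closes, so the up-arrow recalls them days later.
```

None of these conveniences has a real DOS equivalent, which is part of why the two experiences feel so different.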

Yet I sometimes wonder if the main reason for nervousness about the command line isn’t that it’s seen as the area of the expert. In recent years, many people’s experience of the command line is of a sysadmin coming to their workstation, opening a previously unsuspected window, and solving problems by typing something too fast for them to see from the corner into which they’ve edged. From these encounters, many people seem to have taken away the idea that the command line is powerful and efficient. That, to their minds, makes it dangerous – certainly far too dangerous for them to dare trying it (assuming they could find the icon for it by themselves).

And in a sense, of course, they’re right. In GNU/Linux, a command line remains the only interface that gives complete access to a system. Nor are the man or info pages much help; they are often cryptically concise, and some of the man pages must have come down to us almost unchanged from the 1960s.

The fact that they are also wrong is beside the point. Many users aren’t clear on the concept of root accounts, file permissions, or any of the other safeguards that help to minimize the trouble uninformed users can blunder into.
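For what it’s worth, those safeguards are simpler than they sound. Here is a brief sketch of file permissions at work; the temporary file exists only for the demonstration:

```shell
# Every file carries permission bits recording who may read, write,
# or execute it; ordinary accounts cannot override another user's bits.
tmpfile=$(mktemp)            # scratch file, created just for this example
chmod 600 "$tmpfile"         # owner may read and write; everyone else is shut out
stat -c '%a' "$tmpfile"      # prints 600

# The same mechanism keeps ordinary users away from system files:
ls -l /etc/shadow            # the password hashes, readable only by root

rm -f "$tmpfile"             # clean up
```

It is exactly this kind of fence that limits how much damage a slip at the command line can actually do.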

The trouble is, understanding these safeguards takes time, and investing time in learning is something that fits poorly with our demand for instant gratification. By contrast, using a mouse to select from menus and dialogs is something that people can pick up in a matter of minutes. Just as importantly, the eye-candy provided by desktops makes them look sophisticated and advanced. Surely these signs of modishness must be preferable to the starkness of the command line? From this attitude, insisting on the usefulness of the command line is an anachronism, like insisting on driving a Model T when you could have a Lexus.

The truth is, learning the command line is like learning to touch-type: in return for enduring the slowness and repetitiousness of learning, you gain expertise and efficiency. By contrast, using a graphical desktop is like two-fingered typing: you can learn it quickly, but you don’t progress very fast or far. To someone interested in results, the superiority of the command line seems obvious, but, when instant gratification and fashion is your priority, the desktop’s superiority seems equally obvious.

And guess which one our culture (to say nothing of proprietary software) teaches us to value? As a colleague used to say, people like to view computers as appliances, not as something they have to sit down and learn about. And, what’s more, the distinction only becomes apparent to most people after they start to know their way around the command line.

Whatever the reasons, fear and loathing of the command line is so strong that the claim that GNU/Linux still requires its frequent use is enough to convince many people to stick with their current operating system. The claim is no longer true, but you can’t expect people to understand that when the claim plays on so many of their basic fears about computing.


Tomorrow, the last computer in the house with a floppy drive goes to Free Geek for recycling. An era in computing is officially over for me.

Actually, the era was over several years ago. Even four years ago, when I bought my last computer, I thought twice about bothering with a floppy drive. Nor do I think that I’ve used the drive any time in the last two years, or more than once or twice in the year before that. I’d already converted to flash drives, and only the free-spending, why-not attitude that comes when you’re making a large purchase made me get a floppy drive in the first place, on the remote chance that I might need it.

I didn’t, really. When I looked through the nearly two decades’ worth of floppies that I’d accumulated, I found all of them working — unsurprisingly, since I take good care of my storage media. But I hadn’t used them for anything except a quick means of transferring files with older computers for eight or nine years, and they had nothing that I couldn’t do without.

Back when I got my first computer, getting three-and-a-half-inch floppies had seemed like a cutting-edge idea. Even the person from whom I bought it thought that five-and-a-quarter-inch floppies would be more sensible. But I figured that disks that were not only smaller and more rugged but boasted twice the capacity — a whole 720k! — were the wave of the future.

I was right, of course, and smug about it. At first, I did have difficulties when buying programs (this was back when free software consisted of emacs and not much else). At least once, I carried disks to Kwantlen College where I was a sessional instructor, so I could take advantage of the different size drives in my office to copy programs into a format that my computer at home could use. Yet, before I’d had the computer a year, the larger sized floppies started to disappear.

Then for years, floppies were my main source of backup. I remember how strange it seemed when floppies started coming in black, and then even colors. And, while at first the differences in quality between name brands like Sony and cheaper brands were obvious, they soon disappeared.

After a few years, too, 720k no longer seemed as large. In rapid succession, I switched to SyQuest drives, then CDs. Eventually, I moved to DVDs and an external hard drive for backup. The prices on floppies started falling, and so did the amount of shelf space they took up. The last time I happened to notice, floppies were selling at ten for six dollars. Yet I remember a time when thirty dollars seemed a good price for a name-brand box of ten.

In a way, I suppose the fact that you can buy floppies at all is a testimony to the force of habit. Even my smallest flash drive has over three hundred times the capacity of a standard 1.44 megabyte floppy — the 2.88 megabyte ones never having really caught on. They’ve been yesterday’s technology for a lot of yesterdays.

I don’t get nostalgic for hardware, although it’s a good piece of historical trivia for fiction to recall that a single floppy was once considered storage enough for the average popular novel. Even when I name our cars, it’s more a joke than any sign of affection. Still, the end of my personal floppy era is another milestone in the passage of time, just like the moment when I realized that the IBM Selectric I bought with a small inheritance from my grandfather was obsolete.

Come to think of it, I still have that squirreled away on the top shelf of the closet in the spare room. My reasoning, I think, was that I’d have a backup if the computer failed. Of course, exactly how I thought an electric typewriter would be of any use when I couldn’t use a computer is a mystery, considering that most of those circumstances would involve a loss of power. So, I suppose the next bit of housecleaning is to haul that piece of scrap iron away.


My review of the latest release of Ubuntu was picked up by Slashdot this week, releasing a flood of criticism.

Although the article praised Ubuntu, it was also one of the first to mention some of its shortcomings, so it probably provoked more reaction than the average review. Much of the criticism was by people who didn’t know as much about the subject as they thought they did, and even more was by people who had either misread the article or not read it at all. But the comments I found most interesting were from those who criticized me for suggesting that in some cases Ubuntu made things too simple, and didn’t provide any means for people to learn more about what they were doing. Didn’t I realize, the commenters asked, that the average person just wanted to get things done? That few people wanted to learn more about their computers?

Well, maybe. But as a former teacher, I can’t help thinking that people deserve the chance to learn if they want. More – if you know more than somebody, as Ubuntu’s developers obviously do, you have an obligation to give them the opportunity. To do otherwise is to dismiss the average person as willfully ignorant. Possibly, I’m naive, but I’m not quite ready to regard others that way.

Anyway, which came first: operating systems like Windows that prevent people from learning about their computers, or users who were fixated on accomplishing immediate tasks? If computer users are task-oriented, at least some of the time, the reason could be that they’re conditioned to be so. Perhaps they’ve learned from Windows that prying into the inner workings of their computer is awkward and difficult. We don’t really know how many users will want to learn more, given the opportunity.

Nor will we, until we design graphical interfaces that give users the chance to learn when they want to. Contrary to one or two commenters, I’m not suggesting that every user will always want to do things the hard way and use the command line – I don’t always want to myself, although I gladly do so when typing commands is the most efficient way to do the task at hand.

But where did so many people get the assumption that there’s such a contradiction between ease of use and complexity, that choosing one means that you forgo the other? It’s mostly a matter of tidying advanced features into a separate tab, or perhaps a pane that opens to reveal features that a basic user doesn’t want.

However, when so many people believe in the contradiction, we’re not likely to see graphical interfaces that are as useful to demanding users as basic ones.

Even more importantly, I suggest that giving users the chance to educate themselves is a corollary of free software principles. If free software is only going to empower users theoretically, then it might as well not do so at all. To help that empowerment along, free software has to provide the opportunity for users to learn, even though few may take the opportunity. Yet, so long as the chance exists that any users want the opportunity, it needs to be offered.

Moreover, I believe that, given the chance, many people will eventually embrace that opportunity. The first time that they use a free software interface, they may be focusing mainly on adjusting to so much that’s new.

However, eventually, many of them will learn that they can do things their own way and take more control. And eventually, surrounded by such choice, many may take advantage of it. If they don’t know the choices are available because their desktop has been simplified until the choices are obscured, then the developers are doing them a disservice.

Some might say that simplification is needed to attract people to GNU/Linux. Personally, though, I doubt that offering exactly the same thing they can get on Windows is likely to attract anyone. If free operating systems are going to get a larger market share, then it will most likely be by providing a new perspective on computing. I like to think that new perspective should be attempting to accommodate everyone, not just beginners.


Setting up a new workstation is the easiest time to choose a new GNU/Linux distribution. Having just installed Fedora 7 on my laptop so I’d have an RPM-based system available for my work, I seriously considered ending my five-year endorsement of Debian on my workstation. Perhaps I should follow the crowd and go to Ubuntu? Some other DEB-based distribution? Maybe Slackware or Gentoo to grab a bit of geek-cred? But after debating my choices for a couple of days, I decided to stick with Debian for both technical and philosophical reasons.

Oh, a small part of my decision was convenience. Over the years, I’ve built up three pages of notes of exactly what I need to install, configure, and modify to customize my workstation exactly as I prefer. Probably, I could port most of these notes to another distribution, but I would have to change some of the configuration notes, as well as the names of some of the packages. For better or worse, I’m comfortable with Debian — sometimes, I think, too comfortable.

However, a larger part of my decision is practical. Not too many years ago, Debian held a decided advantage because its DEB packages, if properly prepared, were among the few that automatically resolved dependencies when you added software. That’s no longer true, of course, but Debian’s policy of packaging everything from kernels to drivers means that many installation tasks are far easier than in most distributions.

Moreover, I appreciate Debian’s policy of including recommended and related packages in the descriptions of packages. These suggestions help me to discover software that I might otherwise miss, and often help the packages I originally wanted to run better.

Another advantage of Debian is its repository system. As many probably know, Debian has three main repositories: the rock-solid, often less-than-cutting-edge stable repository, the reasonably safe testing, and the riskier unstable. For those who really want the cutting edge, there is also the experimental repository. When a new package is uploaded, it moves through these repositories, eventually slipping into stable when it has been thoroughly tested. Few distributions, if any, are more reliable than Debian stable, and even Debian unstable is generally about as safe as the average distribution.

What this system means for users is that they can choose their preferred level of risk, either for a particular package or for their system as a whole. For instance, by looking at the online package descriptions, you can see what dependencies a package in unstable has, and decide whether installing it is worth the risk of possible damage to your system, or else judge how easily you could recover from any problems. This system means that most experienced Debian users have a mixed system, with packages from more than one repository — an arrangement that is far preferable to blindly updating because an icon in the notification tray tells you that updates are available. It also means that official releases don’t mean very much; usually, by the time one arrives, you have everything that it has to offer anyway.

In much the same way, each individual repository is arranged according to the degree of software freedom you desire. If you want, you can set up your system to install only from the main section, which includes only free software. Alternatively, you can also use the contrib section, and install software that is free in itself but which relies on unfree software to run, such as Java applications (at least until Java finishes becoming free). Similarly, in the non-free section, you can choose software that is free for the download but is released under restrictive licenses, such as Adobe’s Acrobat and Flash players. Although my own preference is to stay with main, I appreciate that Debian arranges its repositories so that I can make my own choice.
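For readers who haven’t seen it, the whole arrangement comes down to a few plain-text lines. The following is only a sketch — the mirror shown is the standard Debian one, and the package name is a placeholder:

```shell
# /etc/apt/sources.list: each line names a repository, a release,
# and the sections to draw from. A free-software-only system might use:
#
#   deb http://deb.debian.org/debian stable main
#
# while adding contrib and non-free, or a second release, is one edit:
#
#   deb http://deb.debian.org/debian stable main contrib non-free
#   deb http://deb.debian.org/debian testing main

# With APT::Default-Release "stable"; set in /etc/apt/apt.conf, the
# system stays on stable, but a single newer package can be pulled
# from testing on request:
apt-get -t testing install some-package   # "some-package" is a placeholder
```

That one `-t` switch is how the mixed systems described above come about, a package at a time rather than by wholesale upgrade.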

Almost as important as Debian's technical excellence is the community around the distribution. This community is one of the most outspoken and free-thinking in free and open source software. That outspokenness is a source of irritation to many, including Ian Murdock, the founder of the distribution and my former boss, who thinks that the distribution would run more smoothly if its organization were more corporate. And, admittedly, reaching consensus or, in some cases, voting on a policy can be slow, and has problems scaling — problems that Debian members are well aware of and are gradually developing mechanisms to correct without changing the basic nature of the community.

Yet it seems to me that Debian is, in many ways, the logical outcome of free software principles. If you empower users, then of course they are going to want a say in what is happening. And, despite the problems, Debian works, even if it seems somewhat punctilious and quarrelsome at times, insisting on a standard of purity that, once or twice, has even been greater than the Free Software Foundation’s. The community is really a daring social experiment, and its independence deserves far more admiration than criticism.

Of course, I could get many of the same advantages, especially the technical ones, from Ubuntu, Debian’s most successful descendant. But Debian has had longer to perfect its technical practices, and, if the Ubuntu community is politer, its model of democracy is further removed from the town meeting than Debian’s. Certainly, nobody can demand a recall of Mark Shuttleworth, Ubuntu’s founder.

Which brings up another point: I'm reluctant to trust my computer to an eccentric millionaire, no matter how benevolent. This feeling has nothing to do with Mark Shuttleworth himself, whom I've never met, and who, from his writing, seems a sincere advocate of free software. But one of the reasons I was first attracted to free software was because, in the past, my computing had been affected by the whims of corporations, notably IBM's handling of OS/2 and Adobe's neglect of FrameMaker. Trusting my computing to an individual, no matter how decent, seems no better. I'd rather trust it to a community.

And Debian, for all its endless squabbles and the posturing of some of its developers, has overall proven itself a community I can trust. So, at least for the time being, I’ll be sticking with Debian.

Read Full Post »

Having barely recovered from getting my new laptop set up, I spent this weekend setting up my new workstation. Since I only buy a new computer every three or four years, it’s a labor-intensive job – a real busman’s holiday, since I do a dozen or more installations of operating systems each year as a reviewer. It’s also a chance to learn first hand the recent changes to hardware.

Because I’ve used alternative operating systems as long as I’ve had a computer, I always buy my workstation from a shop that does custom work. That way, I can be sure that I buy both quality parts and ones that will work with my preferred operating system. The shop I’ve dealt with for my last purchases is Sprite Computers, a Surrey store that I recommend unreservedly to anyone in the Lower Mainland.

This year, buying a custom machine backfired unexpectedly: My Debian GNU/Linux system worked perfectly because I had checked everything I bought, but I had to download drivers for the ethernet, sound, and video cards for Windows. Apparently, GNU/Linux hardware support may have finally surpassed that on Windows, as some pundits have been saying. But it’s been ten months since I’ve had a Windows installation about the house, and the added bother makes me feel that I haven’t been missing anything (aside from some games, which I never have time to play any more, anyway). I keep a small Windows partition because I sometimes need to check a reference to the operating system in a review, but for personal use, I wouldn’t miss it (nor the twinge of guilt I feel as a free software advocate for having a copy of Windows in the first place).

Another advantage of getting a custom computer is that, in placing my order, I always hear the latest trends in the business. Talking over my order with a sales rep, I learned that Windows XP was outselling Vista by a ratio of fifty to one. Furthermore, Windows XP is expected to go off sale next February, but computer businesses are already stockpiling copies. So much for claims about Vista's sales.

I also learned that LightScribe, the DVD-etching technology I tried for the first time on my new laptop, is in no great demand, either. The drives and DVDs cost more for LightScribe, and it's a slow, currently monochromatic technology that isn't essential.

Similarly, the store sells more video cards from NVIDIA than from ATI. That trend was already obvious the last time I bought, but it seems to have accelerated, perhaps because NVIDIA's aggressive marketing of other hardware products makes a bundle deal attractive. ATI's sale to AMD may also make a difference, since manufacturers might be waiting to see what happens.

Of course, those who order custom computers are a small percentage of the public, but the comments I heard are interesting, all the same, since they are some of the few available from an unbiased source (that is, not from the manufacturer or a fan-boy review).

I infer other buying trends from the point at which increases in size or functionality suddenly take a jump in price. Sometimes, this point is obvious from the sales flyers that come to the door, but not always. For video cards, that point is 256 megabytes of RAM. For hard drives, it's 500 gigabytes. For flat screen monitors, it's 22 inches. Total system RAM is stalled at two gigabytes, apparently because Windows, which is the largest market, can't handle more without an adjustment that most lay users don't know about. Generally, I find that ordering a system at this point means that, three or four years in the future, I still have an adequate system, if no longer a cutting-edge one.

For now, I appreciate a number of features in my new workstation. The increased speed is welcome, especially on GNU/Linux, which now zips along quite nicely. The dual-core processor, now standard on all new machines, makes multi-tasking smoother, too.

As for the wide screen monitor, which barely fits on the desk, that’s a practical change that I took to at once.

Yet I think the most welcome innovation is the cube case. Its dimensions of 9 x 10 x 14 inches are small enough that I plan to put both my main and test computers under the same desk and use a KVM switch to move between them. Its blue light, although garish, means that I can crawl around under the desk chasing wires without carrying a flashlight. But, best of all, both sides are so well-ventilated that the overheating problems I've had in hot weather may be a thing of the past.

These aren’t dramatic changes. Their relative modesty compared to changes in previous buying cycles suggests that the computer market is largely saturated and likely to remain so unless a breakthrough technology emerges. So, probably sooner rather than later, I will take the changes for granted. Just now, I shake my head when I realize that I now have flash drives with more memory than my first computer, but, on the whole, I don’t have a hardware fetish. Model numbers and stats seep out of my head faster than they enter, and, so long as hardware works as advertised, I’m content. And I’m happier still to stop thinking of hardware, and get back to the business of writing.

Read Full Post »

In Jungian psychology, the Shadow is a figure who is everything that you are not. Often, it is seen as evil. The Shadow can be helpful in establishing a sense of self, but a personal identity based only on the Shadow is dependent and reactive, and can easily become unhealthy. In fact, if you define yourself only in terms of the Shadow, you risk taking on characteristics of the Shadow, partly because you are refusing to deal with the aspects of your personality that you have invested in the Shadow, and partly because anything seems justified in order to fend off the Shadow.

When people in the free software community solemnly tell me that “Eternal vigilance is the price of freedom,” and draw obsessive diagrams of all the ways that Microsoft is undermining the community, that’s what I see: People on the brink of assuming some of the traits they claim to despise in their Shadow.

Fighting the Shadow can be dramatic and lend purpose to people’s lives, but it doesn’t make for sound thinking, even in their own terms. It lures them into thinking in dichotomies, believing that everyone must be either a vigilant soldier or an optimist too naive to see a threat. With no middle ground, they can lose allies. Similarly, in focusing on one Shadowy figure, they risk overlooking other concerns.

And let’s say they’re right: Microsoft is the Great Satan, and an apocalyptic battle is just a matter of time. What happens once the Shadow is defeated? Inescapably, a good part of their purpose in life has gone, because they have lost all that they measured themselves against.

You can’t completely ignore Microsoft’s actions, even those that are not directly concerned with free software (in previous posts, I was exaggerating for rhetorical effect). Microsoft’s influence is simply too great. But I can keep an eye out for possible concerns without ignoring everything else.

The free software community has a lot to be proud of. Collectively, its members have built an alternative that, overall, is comparable to its proprietary rivals. It’s done so by developing collaborative work methods and taking principled stands that give ordinary people control over important parts of their lives, and that help the poor and those handicapped by a lack of national development meet the privileged on a more equal footing. It’s changed how business is done. It’s helped to preserve minority languages. It’s green. All these are important accomplishments.

That’s how you overcome the Shadow – by building a self-contained identity that robs it of its power over you.

I don’t know about anyone else, but, at the end of my life, I’d rather look back and remember that I played a small role in those accomplishments than admit I spent my life hating a corporation. It’s not as exciting as imagining yourself locked in battle with a Dark Lord, but it’s certainly more constructive and longer lasting – to say nothing of more interesting.

Read Full Post »
