The myth of "the user"
By Geoff Hart
Previously published as: Hart, G. 2007. Viewpoint: The myth of "the user". Indus, newsletter of STC India, Vol. XI(1), Feb.-March 2007. http://www.stc-india.org/indus/
Most of us have heard the warning that we shouldn't refer to our audience as "users" because, at least in the North American idiom, the word is also a synonym for drug abusers. Despite images of hung-over computer users with bags under their eyes, playing "just one more" game of Solitaire before heading home, I've always found that logic a bit of a stretch. But I do have considerable sympathy for the notion that this label dehumanizes people by turning them into just one more faceless part of the already anonymous technology that governs our destiny far more strongly than "the stars" or "Fate" ever did in pre-technological times.
I sometimes succumb to a bit of cynicism, and increasingly, I find it appropriate to wonder whether computers are truly our allies. Computerization and modern software have perpetuated an unfortunate situation in which our main role is increasingly to do things for computers rather than relying on them to ease our lives by doing things for us. Increasingly, in those rare moments when it doesn't seem that computers are actively out to get us, interactions with our digital nemeses revolve around efforts to make us feel useful to the computer. There's a long, not very proud, history behind this problem.
In the 1970s, personal computers didn't yet exist; instead, you typed programs onto cardboard cards that were punched like the instructions for a Jacquard loom, entered them into the computer using a card reader that resembled the machines that banks still use to count money, and displayed their results on enormous teletype terminals that resembled typewriters from Hell. These devices connected to distant mainframe computers via acoustic couplers that clamped onto a telephone handset like a starving octopus prying open an abalone. The teletype's clattering shook the room so hard that it regularly dislodged the coupler and lost the connection. But back then, we knew we were pioneers at the mercy of hostile technology and expected to meet the fate of Hollywood-style settlers trying to colonize the American West in the 1800s. We didn't even have wagons to draw into a circle to defend us against hostile natives; Microsoft Wagon® didn't arrive until many years later. But we'd seen Star Trek, and we knew computers would improve. Eventually. Possibly even before we were too old to understand them.
Fast forward a decade. By this time, the punchcard machines used to type and store programs had been replaced by keyboards attached to video terminals—still attached to distant mainframe computers, mind you, but at least the connection was hardwired and thus reliable, and you didn't emerge from the computer lab deafened. We still had to type cryptic, arcane, misleading commands at a command line, and suffer the consequences of sloppy typing, but progress was imminent; word out of Xerox PARC was that these cryptic, arcane, misleading snippets of incomprehensible text would soon be replaced by cryptic, arcane, misleading mouse gestures. And so it came to pass, and now we have Windows XP and sooner than I'd like to think, Windows Vista.
I eventually came to understand that the problem with all this progress lay in a flaw in the fundamental paradigm of computers: that of empowerment. Star Trek's vision of liberation from dull, repetitive work had been subverted by a philosophy rooted in our pre-technological past, one that insisted on the pre-eminent role of humans and human thought: the goal of empowerment was to preserve human dignity in an increasingly machine-centric age by keeping humans actively involved in computing. We'd tolerate none of this usurpation of our authority by dumb silicon! Unfortunately, this also subtly shifted responsibility for the difficult parts of computing onto us and off of the computer.
Don't believe me? Consider everyone's favorite bête noire, Microsoft Word, whose automatic numbering ("it's not a bug, it's a feature") keeps our brains working, albeit at the cost of a dramatically increased incidence of baldness as we users tear out our hair trying to keep the numbering straight in procedural steps. (In a heartwarming example of how Microsoft has increased equality between the sexes, the hair-loss syndrome now afflicts women in equal numbers.) Eventually, even the most masochistic user resorts to numbering steps manually, patiently retyping the numbers whenever the steps change. That's oddly satisfying, like working the stickshift in a car with a manual transmission instead of letting an automatic transmission shift for us. And as a bonus, we're constantly reminded that the ability to manipulate numbers—the main thing that elevates us above the beasts of the field—is something we can still do better than our computers.
Of course, computer use isn't only rote number-crunching work. Our higher cognitive functions remain in a keen state of readiness to ensure, for example, that we don't accidentally delete critical files. Sure, a well-designed computer could protect these files from our fumbling, but where's the empowerment in that? Better by far to leave such critical thought-intensive tasks to us. Then there are the legions of "temporary" files (most infamously, those accumulated by Internet Explorer) that clog hard disks the way cholesterol clogs arteries—files, incidentally, that are significantly more permanent, more difficult to find, and more difficult to eliminate than our own documents. Should we forget to manually delete them, these "temporary" files eventually crash our computers or cause their operations to grind to a halt. But worrying about this problem and dealing with it keeps our wits sharp, which is surely a good thing. And I can't say enough about those installation programs that force us to hover over the computer, answering inane questions, until the installation is complete. We could simply tell the installer what to do right at the start, then go get a cup of coffee and finally make time to read The Da Vinci Code while the installation finishes. But that kind of automation would make us feel disempowered.
Ironically, while I was writing an early draft of this article, my employer was busily migrating us to new server software. Our computer staff had tripled temporarily so that the formerly solitary network guru could keep up with all the ennobling manual labor that kept him gainfully employed when we weren't engaged in disruptive projects like server upgrades. As I watched over the shoulder of the worker who was reconfiguring my computer, I experienced what the Japanese call satori and the English call epiphany: a sudden moment of revelation. The title of his printed checklist, in immaculate 24-point French, was Configuration des usagés (configuration of the used). Optimists will attribute this to a simple phonetic misspelling of usagers (the French word for users). But me? I fear there's a much deeper principle at work here. Instead of becoming computer users, like the cheery protagonists of Star Trek, we've become the computer used, like the gloomy inhabitants of Dilbert's world.
I am sure that I feel empowered, but I'm not sure that this comforts me.
©2004–2017 Geoffrey Hart. All rights reserved