
Fly me to the moon

July 9, 1998
by Zac Belado

Those of us who work in the computer industry have become rather blasé about the frightening rate at which the technology we use changes. I think we've stopped questioning the true utility of the equipment and technologies we've embraced.

Many of us have amusing stories we like to tell that give a personal dimension to this technological advancement. Mine revolves around my utter disbelief when a friend told me he was going to upgrade his Apple II+ to 128K of RAM. "What," I remember saying, "are you ever going to need 128K of RAM for?"

A few months ago, when I bought a machine to run Linux on, I immediately got extra RAM to bring the system up to 40MB.

The latest (September '98) issue of PC Computing offers a very interesting counterpoint to this:

"And these days, 64MB of memory is the starting point for a Pentium II portable - at least by our standards."

Should a computer really need that much RAM? Should we even need, for routine tasks, a processor that runs at over 200 MHz?

Now before you immediately say "Yes", take a moment to think about what the average computer really needs to do: process text, manage a network connection and display pictures on screen. Those three tasks probably account for 75% of the work that the average machine will be asked to do. So unless you have a hell of a big Word file, you're not going to need 64 MB of RAM or a Pentium II 400.

True, some tasks require an exorbitant amount of RAM and processing power: 3D rendering, digital manipulation in Photoshop, compiling large apps with a C++ compiler. But for every machine that requires added resources to do its job, there are probably 100 that don't.

Steve Wozniak is quoted in the September '98 issue of Wired as saying that the average person's computer needs were satisfied with the Commodore 64 and Apple II+. He's right.

There are only two problems with this. Most people really do need a GUI in order to use their computers (something that is hard to run on an Apple II+... and yes, I did actually own and use GeoWorks), and it's really hard to make a billion dollars in the computer industry without making people feel that their hardware and software are inadequate.

The first problem was effectively solved by the release of the Macintosh in 1984. I think that most people would be more than happy with a Mac Classic. In fact I spent several years happily using one for all my email and writing. It was responsive, ran Eudora and WordPerfect and didn't crash. And it had the added benefit of being able to follow me into the kitchen so I could read my email while I had my morning Froot Loops.

The second problem is far more complicated, and for an answer to it, I simply point towards the Open Source software movement and ask the reader to make the obvious conclusions.

This entire matter is further complicated by the rather incestuous circle-jerk that software and hardware manufacturers seem to be engaged in.

Our operating systems have become so bloated that they almost require 200+ MHz systems in order to operate. It is to be expected that, as the functions of an OS expand, its memory and system requirements will expand as well. But there is also something to be said for the lack of desire to make code as economical as possible.

When Windows 95 was first released I was quite shocked by the rather hefty requirements it had. I compared it to System 7.5, which would still run on low-end '040 machines, and even to System 7, which I had running on quite a few Mac Pluses. Even Linux, with all the "modern" bells and whistles it offers, will still run on a 486. For a more shocking condemnation of the coding practices of some select software producers, have a look at the required hardware for DR-DOS.

I have a 133 MHz Pentium system that I run Linux on. I have, on several occasions, been tempted to try and run NT 4 Server on it. Just to see if it would actually function properly. At times I think I entertain this thought purely for the entertainment it affords me.

So why don't we notice this? Is it really not a problem or has this upgrade cycle and habit of using shoddy products become a part of our experience of using computers?

RAM is cheap. Hardware is cheap. So it seems we don't complain.

But maybe we should.

Why should Photoshop run as fast (with some exceptions) on my Mac as it does on my PC, while Netscape takes almost a minute to load and redraws pages in a fashion that makes you suspect it's rendering the damn things by hand?

Why should my word processor take up so much space, use so much RAM and require such a fast machine to run?

Why is Claris (or I guess I should say Apple) able to supply a word processor, spreadsheet, database and communications package with a memory footprint that is less than a single component of Office?

Why, after all the years of talk about components, objects and new programming technologies, are we still stuck with apps and operating systems that expand like Mr. Creosote in Monty Python's "The Meaning of Life"?

I think we all need to stop, take a long breath and ask ourselves what the hell we're doing.

I have more computer processing power on my desk than NASA needed to send the first missions to the moon. And what does it do?

Not much because Word keeps crashing.

Zac Belado is a programmer, web developer and rehabilitated ex-designer based in Vancouver, British Columbia. He currently works as an Application Developer for a Vancouver software company. His primary focus is web applications built using ColdFusion. He has been involved in multimedia and web-based development, producing work for clients such as Levi Strauss, Motorola and Adobe Systems. As well, he has written for the Macromedia Users Journal and been a featured speaker at the Macromedia Users Convention.

Copyright 1997-2017, Director Online. Article content copyright by respective authors.