Friday, November 5

Technology - we all use it, right?

But how many of us can actually keep up with the damn thing? I remember the time of simple comparisons: there were two numbers we all understood - processor family and clock speed. Say, an 80386 running at 33MHz, or an 80486DX2 @ 66MHz, and so on. Life was simple. There were distinct successors in graphics adapters - Hercules was replaced by a CGA adapter, then EGA, which expanded into VGA (wow, look at the colors!!), and then there was SVGA and life ceased to be simple.

Enter the Pentium with its different bus speeds, multipliers, and overclocking, and the whole industry became a cruel race. Most of us mortals just couldn't keep up with the changes unless we devoted ourselves to it full time. Different standards started to compete for market dominance (remember VLB and PCI? I do. I had bought a VLB motherboard along with an S3 graphics adapter, which I was sooo proud of, and then PCI won.)

Recently (the last 5 years) there are so many chipsets, clocks, buses, whatevers to consider when you are looking to buy a new piece of hardware, that it becomes an arduous task of researching sites like AnandTech and Tom's Hardware (thank you for your existence) and asking your guru friends for recommendations.

Just now I witnessed two of my colleagues engaged in a discussion about which digital camera one should buy. The abundance of features even in such a simple item as a camera has become so vast that you can't simply go out and buy one without fearing that within a few months it will become obsolete. OK, such speedy progress is good, right? I suppose it means that we get better and more interesting gadgets sooner, but in a few years we will need to incorporate some sort of tech class into elementary education in order to have consumers competent to operate (let alone understand) those new toys.

And I don't even want to begin talking about Asimov.