
Enter the Pentium, with its different bus speeds, multipliers, and overclocking, and the whole industry became a cruel race. Most of us mortals just couldn't keep up with the changes unless we devoted ourselves to it full time. Different standards started competing for market dominance (remember VLB and PCI? I do. I had bought a VLB motherboard along with an S3 graphics adapter which I was sooo proud of, and then PCI won.)
Recently (in the last 5 years) there are so many chipsets, clocks, buses, and whatevers to consider when you're looking to buy a new piece of hardware that it becomes an arduous task of researching sites like AnandTech and Tom's Hardware (thank you for your existence) and asking your guru friends for recommendations.
Just now I witnessed two of my colleagues engaged in a discussion about which digital camera one should buy. The abundance of features, even in such a simple item as a camera, has become so vast that you can't simply go out and buy one without fearing that within a few months it will be obsolete. OK, such speedy progress is good, right? I suppose it means we get better and more interesting gadgets sooner, but in a few years we will need to incorporate some sort of tech class into elementary education in order to have consumers competent to operate (or even understand) those new toys.
And I don't even want to begin talking about Asimov.