Quote Originally Posted by azjoe View Post
Well, I for one completely disagree with the notion of forced upgrades. Why should people, as you say, be forced to spend $200+ for an OS upgrade?
Maybe I should clarify my point. I definitely disagree with forced regular upgrades. My point isn't that people should be forced to upgrade their OS every time a new one comes out; it's that the new OS shouldn't be forced to support hardware and software that only a handful of people are still using. That's not to say it shouldn't support more recent stuff, though.

Quote Originally Posted by azjoe View Post
I don't think you'll find that legacy drivers are the major cause of bloat... rather I believe it's new software feature bloat that is requiring us to buy new PCs with faster CPUs and more memory and disk... and the majority of the bloat is in the OS. Think what all this is costing businesses and consumers who must constantly upgrade to new hardware and software, spend time and money on training, etc. just to support the overhead of new software features most don't use. For example, .NET has probably cost the civilized world far more than it can ever be expected to save.
I agree with you, for the most part. The drivers are not the major source of bloat, but I can't help but feel that support for hardware and software that's around a decade old not only bloats the OS with the extra libraries and code needed to support it, but also leaves LOTS of room for error (read: bugs), and probably stifles innovation to some degree, if for no other reason than it would cost too much to throw the new features in, keep them from interfering with the older stuff, debug the whole thing, and still end up with a cohesive, user-friendly, stable package.

Quote Originally Posted by azjoe View Post
There was a time (in the dark ages of computer history) when operating systems were fairly sleek. Software required to support specific hardware devices and/or run specific applications was purchased with the hardware/application. Somehow almost all that "stuff" has migrated into the OS... and created the bloated things we have now. The reasons are complex and rooted mostly in business trying to maintain proprietary rights to the interfaces to their operating systems, hardware manufacturers trying to reduce programming costs, etc., etc. But if you step back and look at it objectively today, one has to ask if the pendulum hasn't swung too far... and perhaps now the tail is wagging the dog!
I think I know what you're saying here. If I'm getting you correctly, Microsoft's DirectX would be an example of this. I remember when graphics cards each supported a specific graphics language, usually proprietary to the vendor, or pretty close to it. Now it seems they pretty much just support DirectX and OpenGL. It has its ups and downs. If all the hardware vendors are on the same page, it makes it easier for game programmers and the like to create their goodies without having to work in support for several different graphics languages. I'm sure you can give a better example than I can. I didn't really get into the computer world until late 1996, and I'd wager that things had already changed for the most part by then, or were at least well on their way.
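
Just to illustrate what I mean about everyone targeting a common API (this is only a quick sketch, and it assumes you have GLUT and a working OpenGL driver installed; nothing here is specific to any particular vendor's hardware): the same few lines of old-style OpenGL draw a colored triangle on any card whose driver exposes OpenGL, because the vendor-specific code lives behind the driver instead of inside the game.

[CODE]
#include <GL/glut.h>

/* Draw one colored triangle using the vendor-neutral OpenGL API. */
void display(void)
{
    glClear(GL_COLOR_BUFFER_BIT);
    glBegin(GL_TRIANGLES);
        glColor3f(1.0f, 0.0f, 0.0f); glVertex2f(-0.5f, -0.5f);
        glColor3f(0.0f, 1.0f, 0.0f); glVertex2f( 0.5f, -0.5f);
        glColor3f(0.0f, 0.0f, 1.0f); glVertex2f( 0.0f,  0.5f);
    glEnd();
    glFlush();
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);                       /* window and context setup handled by GLUT */
    glutInitDisplayMode(GLUT_SINGLE | GLUT_RGB);
    glutCreateWindow("same code, any vendor's card");
    glutDisplayFunc(display);
    glutMainLoop();                              /* the installed driver does the vendor-specific work */
    return 0;
}
[/CODE]

Back in the proprietary-API days you'd have needed a separate code path (or a separate build) for each vendor's graphics language; with DirectX or OpenGL, the driver takes care of that.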

Quote Originally Posted by azjoe View Post
As a software vendor I greatly resent operating system vendors (such as Microsoft, Apple, and even the open systems guys) that drop support for internal features that then require me to buy new development tools and rewrite my software products... just so they'll run on a newly purchased PC or operating system. This increases my costs which, in turn, increases the costs to my customers... all with no value added.

I guess constantly rewriting software keeps a lot of us employed, but in the bigger picture that might actually be considered a form of lost productivity. IMHO, consumer acceptance of planned product obsolescence, coupled with its throw-away mentality, will turn out to be key factors in economic woes which will become problematic for highly industrialized nations in the future.
This is why I said that they should support features for a reasonable amount of time. If they're dropping all of these features with every new release and forcing everyone who needs them to completely uproot themselves, their business, and their employees by deploying and learning new software, yeah, that's not right. But I'm assuming you're not writing your latest and greatest software specifically to avoid taking advantage of the improvements that computers from the past five years or so have to offer.

Joe (also from AZ)