NT/2K/XP – A Brief History

March 8, 2008 – 1:58 PM


What’s the difference? Well, at the very core of them, not much. When you see talk of Windows 98/ME being more stable than Windows 2000 or XP, don’t buy into it. Not because 98SE isn’t one of Microsoft’s better efforts, but because of the disaster of an OS architecture that it still relies upon. By the end of the road, Windows 9x had become such a massive quagmire of heaped-on patches and features that it ultimately led to Windows ME… arguably the worst operating system release ever to come from Microsoft. I’d rather use Windows 95 than ME.

Windows NT started as a joint venture between IBM and Microsoft that you might have heard of… OS/2. A long-running spat between the two eventually forced a split, and each walked away with a copy of the code and the intention to take it in its own direction. The intent was to design a completely modular operating system in which entire portions of the system could be replaced at will by third parties or by Microsoft in order to support varying hardware platforms, security models or interface elements. Memory management and process separation were paramount design goals. This is the single most important factor in system and application stability, because one failed driver or program should not be capable of taking down the entire operating system.


Then Microsoft shot themselves in the foot: enter the magical Blue Screen of Death (BSOD). Instead of containing a failure, a single driver or application can cause the operating system to detect the problem but not understand how to handle it. The result is a wealth of dump information that is still of little help to this day. It’s better than a generic VxD error that can be caused by something totally unrelated, but the trend of STOP error codes having many different meanings isn’t much of an improvement, and early versions of NT would gak on even the most trivial of error conditions. The upside is that somebody on the planet can take that information and turn it into something useful… just not anyone that you or I can easily reach. But I digress.

When NT 3.1 was released, it brought Windows to the server for real… well, sort of. Narrow hardware support, half-hearted feature sets and stability nightmares galore earned 3.1 the label of a retail Beta release, but if all of the ducks were lined up in a neat little row, it ran quite well. Then came 3.5, which improved things quite a bit, but still left a sour taste in the mouths of customers Microsoft was trying to win over from Novell and Banyan, mainly due to a complete lack of maturity in its features and little in the way of real interoperability with existing networks.

Windows NT 3.51 was when this new entrant really made its debut. It wasn’t great, but it was good enough to put into production for certain things. File and print serving were adequate, and it made for an acceptable SQL and Exchange platform. The workstation version was rough, however. Compatibility was difficult, at best. 16-bit Windows application support made DOS look like a step forward. This would get better, though.

NT 4.0 is generally considered to be when the Windows NT core came into its own, and I would have to agree. I would use NT 4.0 whenever possible, dual-booting with Windows 95/98 only when absolutely necessary because of the relative stability that NT offered and the more advanced security and file system features included in the OS. Finally, a workstation OS that would let me take full advantage of extra RAM and multiple CPUs, while still letting me limp along with some games and “legacy” applications if tweaked properly. I was working for an insurance company at the time of the NT 4.0 release and we decided to make NT 4.0 Workstation the standard for desktops company-wide. I considered this to be a relative success, having migrated and rolled out some 800+ PCs based on NT 4.0 and aside from a funky token ring card driver, everything ran quite well. I truly liked NT 4.0, but was eagerly awaiting improvements in Windows 2000.

As hardware advanced, and the vision of ditching the 9x code base crept closer, Microsoft had to do something about the constant install-and-reboot issues, power management, networking services and multimedia support. These were the highlights of Windows 2000, and they couldn’t come soon enough. Driver development is probably the one thing that kept Windows 2000 from replacing Windows 9x outright rather than waiting for Windows XP. For whatever reason, manufacturers just didn’t get the hint. They still saw that the majority were using 9x, so why play to that geek crowd that just likes to be different? They eventually learned the hard way, but at the expense of many frustrated customers who otherwise wanted what Windows 2000 could give them.

Now we’re at Windows XP. Still based on the very beginnings of Windows NT, though obviously quite a bit different these days. No longer is Microsoft bent on the modular rip-and-replace design of years past. Some of that capability is still around, but it’s mainly used in server systems that require specific Hardware Abstraction Layer components in order for the OS to speak to the hardware properly. Aside from that, you don’t see many entire shell replacements, network authentication modules or other allowances for non-Microsoft modifications (e.g., the entire antitrust debacle).

The overall stability of Windows XP has been quite good. Even fewer reboots are required today, though I’m still waiting for the day when all but kernel-level changes lack the requirement of a system restart. Reboots are annoying. Hot-docking, suspend, hibernate and other dynamic features within Windows XP are admittedly better, but still problematic. And don’t even get me started on XP Service Pack 1. I’m still avoiding it like the plague, in case you’re wondering.

So, here we are in late 2002, with the next major version of Windows due somewhere around 2005-2006 and that doesn’t really bother me. Windows ME showed what can happen if things are rushed in an effort to have another source of revenue. There is talk of an interim release of some sort, but the details are sketchy at this point. Microsoft is moving in the right direction, having finally shoved all operating systems over to the NT code base. It’s far more stable, efficient, secure and usable than 98 or ME ever were. Gamers still have a few gripes, but those should be thrown at the game developers for not waking up long ago. DirectX is DirectX, and if a developer isn’t following along with what Microsoft is doing, then they will fall behind the times. I play a few games under XP and have few problems. In fact, it’s better in most cases because of how memory and system resources are utilized.

At the end of this little lesson in history and perspective, should you upgrade to Windows 2000 or XP? I still say it depends. If all you do is surf, read email and watch a video or two, 98 or ME will still suffice. Should you consider yourself dependent on your computer for much more, I would say that you should seriously consider the move. However, let me qualify that with a plea for how you go about it. Please, please, do not “upgrade”. Do a clean installation. It’s worth every bit of time and effort to wipe the slate clean rather than trying to move to an entirely different architecture while preserving your settings and applications from the past several years. Things are bound to go wrong, no matter how much time and effort Microsoft puts into smooth transitions. They can never account for every nasty application, suspect driver and legacy device among the billions of combinations. Start over. You’ll be a much happier camper. I’ve been using NT-based operating systems since before NT was NT, and ask that you trust me on this one. You would still qualify for the upgrade version of the OS, but this does not require you to perform an actual upgrade of your existing installation. You simply have to have the media from your existing OS handy for verification that you do have an upgradeable product.

Also, don’t even think about Windows XP with less than 256MB of RAM. I know the book answer is 128MB, but forget it. You’re kidding yourself. XP is a pig, and the more RAM you can give it, the better. I run with 512MB in my notebook, and when I get rolling, I can chew that up quite easily. Bloated code, useless services enabled by default, and a prefetch engine that loves RAM… welcome to Windows. For what it’s worth, I have the same recommendation for Mac OS X. Anything less than 256MB of RAM, and it can get painful (though you can do more with 256MB under OS X than under XP). I’m afraid it’s the way of the world from here on out. Remember the days when a 1GB disk (heck, a 5MB disk) was considered huge? Today, 1GB of RAM is more common than you might think. I have 2GB in my Gateway 6400 server and plan to move to 4GB in my Dell PowerEdge 1400SC server down the road. I’d have 1GB in my notebook already if it supported such capacities.