Now that CPUs contain over 50 million transistors and process information at clock rates exceeding 3,000 megahertz (3 gigahertz), raw performance no longer carries the importance it once did. Certainly, speed will always have its place. But it's no longer the primary focus. Rather, today's PC enthusiast is shifting a critical eye toward system stability. I contend that it doesn't matter how fast a Ferrari can go .. if it keeps stalling every few miles.

Background

When a tech site reviews a particular hardware component, they generally do so on a stripped-down, bare-bones benchmarking rig .. with nothing installed except a fresh copy of an operating system and a few savory benchmarking programs. They'll likely open one program at a time, run the benchmark, and close it before opening the next. If they're meticulous, they'll even reboot before running the next benchmark. But is this how you and I use our systems? Not hardly. This minimalist approach to system configuration works best for benchmarking hardware components, because it minimizes the chance that incidental operations (such as virus scanning) will interfere with the benchmark and distort the results.
You and I, on the other hand, like to fill our systems with lots of software. The problem is that each new software program, and each new component, increases the potential for a quirky compatibility glitch. Everyone's system is different. We have different hardware, and we install different software. And even if we have the same software, we're likely to configure it differently than someone else does. All of which means our systems are unique, and therefore our problems, too, are likely to be unique. I've spent many hours on the phone with numerous tech support groups, whose only solution to my problem was to "uninstall the other guy's software". Which seems like no solution to me. I mean, my grandmother could have figured that out.

If you have only a basic system configuration, with a few basic programs installed for things such as email, word processing and surfing the 'Net, you're unlikely to be plagued by the quirky compatibility glitches I'm talking about.

...which brings us to the inspiration for this article. I've learned a few tricks along the way to building and configuring stable systems. I don't know everything. But I've made enough mistakes that I've learned what *doesn't* work well. I've found that, if you eliminate the major sources of these problems, you have a better chance of running a stable system. It's much more difficult to build and configure a stable system than it is to build a fast one. So let's get busy and take a look at the major factors that affect PC stability.
1. The Operating System

The single biggest factor affecting system stability is the operating system. Both Windows XP and Windows 2000 are based on code from Windows NT. They are far more stable than the 'legacy' versions of Windows (9X, Me), which are built atop 16-bit DOS. To make a long and complicated story short and simple, DOS is a 16-bit operating system. The consumer versions of Windows (Win9X, WinME) are 32-bit operating systems built on top of that 16-bit DOS foundation, and it is this legacy 16-bit underpinning that adversely affects their stability. Neither Windows 2000 nor Windows XP is built on DOS code. They are therefore not affected by these associated stability problems.

If you absolutely must use one of the consumer versions of Windows because you still run 16-bit programs that are not supported by either Windows 2000 or Windows XP, then you should consider dual-booting two operating systems. You can use one OS for all your 16-bit programs, and Windows 2000/XP for everything else. If you've never used Windows XP, you'll think you died and went to operating system heaven.

2. The Chipset

Volumes have been written on this subject, but suffice it to say that Intel chipsets are the most stable. I do not know whether this is because Intel does a better job of manufacturing its chipsets than other companies, or because software manufacturers test their software more thoroughly on Intel-based systems (since they are more popular) than they do on systems based upon non-Intel chipsets. Or a combination of these factors.
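(A quick aside on the dual-boot suggestion from Section 1: on an NT-family system such as Windows 2000/XP, the boot menu is controlled by a boot.ini file in the root of the boot drive. Here's a minimal sketch of a dual-boot configuration; the partition numbers, Windows folder name, and OS labels are illustrative, and will vary with your own layout:)

```ini
[boot loader]
timeout=30
default=multi(0)disk(0)rdisk(0)partition(2)\WINDOWS
[operating systems]
multi(0)disk(0)rdisk(0)partition(2)\WINDOWS="Microsoft Windows XP Professional" /fastdetect
C:\="Microsoft Windows 98"
```

With a file like this, NTLDR presents a menu at boot time: the first entry loads Windows XP from the second partition, while the second entry hands control to the DOS-based OS on C:, where your 16-bit programs live.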
Chris at GamePC says (middle of first paragraph):
I can fill this page with comments like these, but I'll simply include one more for good measure and be done with it. Kyle at [H]ard|OCP echoes these sentiments when he says (next-to-last paragraph):
I'm not saying that other chipsets suck. I'm merely saying that you have the greatest chance of configuring a rock-solid system if you use a motherboard based upon an Intel chipset.

As far as motherboards go (chipsets come installed on motherboards), I prefer Asus boards. You will generally pay a little more for an Asus board, but this is money well spent. Asus is the largest manufacturer of mobos in the world.

Note that I never said that non-Asus boards are crap. They're not. There are many good motherboard manufacturers. I used to be a big Abit fan, beginning with the legendary BH6 (Cel300a@464MHz). I simply feel that Asus makes the most stable motherboards on the market. And more than a few folks agree.