How do you know when to upgrade?
by HeavyG, 07-15-10 at 09:56 PM
So I got to thinking the other day. Upgrading my gaming computer used to be a ritual that happened every two years, with a yearly graphics card upgrade in between. Every two years, from top to bottom, I would have a new gaming machine. This was partly due to game development always being one or two steps ahead of hardware development. Those $500-$1000 upgrades every two years were always worth it, as they allowed me to play the latest and greatest games, such as Quake, Max Payne, Command & Conquer: Generals, Half-Life 2, and Crysis.
Things are different now. They are different, and I am not sure if it is a "good" different, or if it is a "bad" different. My current gaming rig might have one hell of an overclock on it, but I am using technology that is 3 years old, and it eats everything I try to feed it. I mean, you would think that the latest and greatest games would demand the latest and greatest hardware, right? Before I continue, here are the basic specs of my system:
CPU: Intel E8400 @ 4.0 GHz
Motherboard: ASUS P5K Premium Black Pearl Edition (P35 Chipset)
Memory: 6GB OCZ Reaper @ 960 MHz
GPU: 2 x ATI Radeon HD 4850 in CrossFireX
Not an entry-level system by any means. In fact, it was near the top of the line when I originally built it back in September 2007. The only real performance upgrades I have made in almost three years were swapping out the E6850 for the E8400 and going with twin 4850 graphics cards to replace my continuously overheating HD 2900 XT.
So here I am with the aging P35 chipset, still running DDR2, running two budget-level graphics cards in CrossFireX, and I am able to handle every game I throw at it. But why? With all of the multi-core and multi-threading technology, quad SLI/CrossFire setups, and SSDs out there, are the hardware developers just getting that good? Or better yet, are the game developers just getting lazy? With the release of the latest Xbox 360 revision, I realized that the game developers are not being lazy; they are just doing good business. The Xbox 360 has had a long life for a console, and it isn't over yet.
Nowadays, most games are developed for consoles and then "ported" to the PC. This isn't true for all games, but it is true for most of the hottest titles, such as Call of Duty: Modern Warfare 2. These games are being built for systems with a fraction of the graphics and CPU processing power of a current gaming PC. Unfortunately, this means PC gamers get versions that are buggy, badly ported, or nearly identical to the console release, perhaps with a few extra graphics settings. So that brings me to my question: why should we upgrade?
What benefit do we get if the game developers are not going to give us anything worth throwing at our new rigs? Don't get me wrong: I love the excitement of the build, the overclocking, and the benchmarking. But what is the point if every game already runs on my current rig? I know that with the down economy not everyone can afford to upgrade. But I can, damnit, and I want to... but for what? Nothing. There is no "good" reason to upgrade. Hell, PC gamers got screwed over AGAIN by Alan Wake: a game that was originally announced for the PC ended up as a console exclusive. At this point, I don't know what to expect from future games. At least Valve continues to put out good stuff, even if their games run on an aging graphics engine. But hell, they are selling console games now too.
Am I bitter? I think so...
HeavyG