Thursday, January 3, 2013

Hitting the Wall

For years, it seems, the enthusiast PC market has been concerned with going ever faster.  Faster processors, faster memory and of course faster video cards.  We're not even happy with our storage anymore.  These days if it isn't a fast SSD you may as well be saving your data on backup tapes.

On the CPU side of things, at some point both AMD and Intel decided that raw speed was irrelevant.  Clock speeds were quickly climbing toward 4GHz but the awful truth was that the drag race couldn't continue.  Silicon isn't very stable above 4GHz unless you happen to own a liquid nitrogen plant.  Since we're still years out from quantum computing at the consumer level, something had to change.

The fix? Both Intel and AMD ramped up their marketing departments and started selling the idea that speed wasn't that important anymore.  Instead they pushed the idea of having more processing power that could best their predecessors in benchmark tests regardless of the rated speed. 

Multi-core designs (physical or virtual) soon replaced single-core CPUs and it wasn't long until software caught up to take advantage of them.  To date there's still not much software out there that can simultaneously load more than 2 CPU cores for the average user, but it makes for a good marketing hook at least.  I mean, who wouldn't be impressed by having 8 CPU cores running at 3GHz even if 7 aren't doing anything...
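Amdahl's law puts numbers on that 8-cores-with-7-idle picture: if only part of a workload can run in parallel, extra cores stop helping very quickly.  A minimal sketch (the 50% parallel fraction below is an invented example, not a measured figure):

```python
def amdahl_speedup(parallel_fraction, cores):
    """Amdahl's law: theoretical speedup when `parallel_fraction`
    of the work can be split evenly across `cores` processors."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# Hypothetical desktop app where only half the work parallelizes:
for cores in (1, 2, 4, 8):
    print(cores, "cores ->", round(amdahl_speedup(0.5, cores), 2), "x")
# Even 8 cores top out below 1.8x when half the work stays serial.
```

Which is why a fast single core often beat a fistful of idle ones in everyday use.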

It seems we're now hitting the same wall with video cards.  This time, however, it's not about the speed of the graphics processor.  GPUs have taken a cue from their CPU cousins and morphed from simple display translators into multi-chip, high-bandwidth monsters capable of fully synthetic, photorealistic gaming.
That's all well and good, but as the saying goes, beauty is indeed in the eye of the beholder, and comparing one video card to another is a very subjective process.  Frames per second (FPS) is the standard metric and it's been around since the days of the first 3D gaming cards.  The problem is that two cards with the same FPS ratings at any given resolution can deliver wildly different experiences.  Driver tricks, games optimized for one brand or the other, and plain personal bias can make the numbers moot.

We've finally arrived at the same point with high-end video cards that we were at with CPUs not so long ago.  A cheap 3D card can be clocked higher than its enthusiast-level sibling yet still pale in comparison.  We're more concerned with streaming processors, shaders and memory bandwidth than with how fast it runs.
Ahhh, all those geeky bits warm the heart, but they still mean nothing when you're trying to compare two equivalent cards.  Performance is subjective without some kind of metric, but does it really matter?  Remember that each new generation of video card brings only a fractional improvement in performance.  We usually have to settle for no more than a 30% improvement at best, and that's under ideal conditions.  Factors like power draw, heat dissipation and even physical size are becoming more prominent in reviews now.  PCPer is currently evaluating new testing methods that leave FPS behind and instead evaluate the frames themselves at fixed points in time.  I've linked to Ryan Shrout's article on it below.
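The idea behind inspecting individual frames rather than an FPS average is easy to sketch: two cards can post identical average FPS while one stutters badly.  A toy illustration with invented frame times (`steady` and `stutter` are hypothetical cards, not real hardware):

```python
import statistics

def fps_and_p99(frame_times_ms):
    """Average FPS plus the 99th-percentile frame time in ms."""
    avg_ms = statistics.mean(frame_times_ms)
    p99 = sorted(frame_times_ms)[int(len(frame_times_ms) * 0.99) - 1]
    return 1000.0 / avg_ms, p99

# Made-up numbers: same mean frame time, very different consistency.
steady = [16.7] * 100                # every frame arrives on schedule
stutter = [10.0] * 50 + [23.4] * 50  # same average, half the frames late

print(fps_and_p99(steady))   # same average FPS...
print(fps_and_p99(stutter))  # ...much worse worst-case frame time
```

An FPS average hides exactly the difference the percentile exposes, which is the gap the frame-by-frame methods are trying to measure.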

It's admirable to seek the holy grail of a truly objective video benchmark, but in the end it seems more an exercise in futility.  Video cards in the same price range generally differ very little in the real world.  Adding more granular data points into the mix only serves the marketing departments of AMD and Nvidia.  If you can't discern any real difference between cards outside of a benchmark, does it really matter?

We know the basics: given a good driver (which isn't a given), a good GPU design and sufficient memory bandwidth, we should be happy.  A frame's difference here and there, regardless of the metric cited, is irrelevant so long as our games look good and our PCs don't become space heaters in the process.
Another case of much ado about nothing.
