ATI Radeon HD 4890 vs. NVIDIA GeForce GTX 275
by Anand Lal Shimpi & Derek Wilson on April 2, 2009 12:00 AM EST
The Widespread Support Fallacy
NVIDIA acquired Ageia, the company that wanted to sell you another card to put in your system to accelerate game physics: the PPU. That idea didn't go over too well. For starters, no one wanted another *PU in their machine. And second, there were no compelling titles that required it. At best we saw mediocre games with mildly interesting physics support, or decent games with uninteresting physics enhancements.
Ageia's true strength wasn't its PPU chip design; many companies could have done that. What Ageia did that was quite smart was acquire an up-and-coming game physics API, polish it up, and give it away to developers for free. That physics engine was called PhysX.
Developers can use PhysX in their games for free. There are no strings attached, no licensing fees, nothing. If a developer wants support there are fees, of course, but it's still a great way of cutting down development costs. The physics engine in a game is responsible for modeling all Newtonian forces within the game; the engine determines how objects collide, how gravity works, and so on.
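To make that concrete, here is a minimal sketch of the kind of per-frame work a physics engine does: integrate gravity into each body's velocity, step positions forward, and resolve a simple collision. The types and names here are purely illustrative, not PhysX's actual API.

```cpp
#include <vector>

// Illustrative rigid-body state; a real engine also tracks orientation,
// angular velocity, mass properties, etc. (names here are hypothetical).
struct Body {
    float px, py, pz;   // position (m)
    float vx, vy, vz;   // velocity (m/s)
};

// One semi-implicit Euler step: apply gravity, advance positions,
// then resolve collisions against a ground plane at y = 0.
void stepWorld(std::vector<Body>& bodies, float dt) {
    const float g = -9.81f;          // gravitational acceleration (m/s^2)
    const float restitution = 0.5f;  // fraction of speed kept after a bounce
    for (Body& b : bodies) {
        b.vy += g * dt;              // integrate acceleration into velocity
        b.px += b.vx * dt;           // integrate velocity into position
        b.py += b.vy * dt;
        b.pz += b.vz * dt;
        if (b.py < 0.0f) {           // body penetrated the ground?
            b.py = 0.0f;             // push it back to the surface
            b.vy = -b.vy * restitution; // reflect and damp the bounce
        }
    }
}
```

A shipping engine layers broad-phase culling, constraint solvers, and much more on top of this loop, but the core job is the same: step Newton's laws forward once per frame.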
If developers wanted to, they could enable PPU accelerated physics in their games and do some cool effects. Very few developers wanted to because there was no real install base of Ageia cards and Ageia wasn’t large enough to convince the major players to do anything.
PhysX, being free, was of course widely adopted. When NVIDIA purchased Ageia, what it really bought was the PhysX business.
NVIDIA continued offering PhysX for free, but it killed off the PPU business. Instead, NVIDIA worked to port PhysX to CUDA so that it could run on its GPUs. The same catch-22 from before remained: developers don't have to include GPU accelerated physics, and most don't because they don't like alienating their non-NVIDIA users. It's all about hitting the largest audience, and not everyone can run GPU accelerated PhysX, so most developers don't use that aspect of the engine.
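For a sense of what "porting physics to CUDA" means, here is a toy sketch of the data-parallel pattern involved; this is an assumption-laden illustration, not NVIDIA's actual code. Each GPU thread integrates one particle, so thousands of effects particles can be stepped in parallel every frame.

```cpp
#include <cuda_runtime.h>

// Toy effects-physics kernel: one thread integrates one particle.
// Positions and velocities live in device memory as flat arrays.
__global__ void integrateParticles(float* posY, float* velY,
                                   int n, float dt) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    velY[i] += -9.81f * dt;   // gravity into velocity
    posY[i] += velY[i] * dt;  // velocity into position
}

// Host-side launch; d_posY and d_velY are hypothetical device pointers.
void stepOnGpu(float* d_posY, float* d_velY, int n, float dt) {
    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    integrateParticles<<<blocks, threads>>>(d_posY, d_velY, n, dt);
}
```

When no CUDA-capable GPU is present, the same work has to run serially on the CPU, which is exactly why the heavy "GPU-friendly" effects crawl on non-NVIDIA hardware.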
Then we have NVIDIA publishing slides like this:
Indeed, PhysX is one of the world’s most popular physics APIs - but that does not mean that developers choose to accelerate PhysX on the GPU. Most don’t. The next slide paints a clearer picture:
These are the biggest titles NVIDIA has with GPU accelerated PhysX support today. That's 12 titles, three of which are big ones; as for most of the rest, well, I won't go there.
A free physics API is great, and all indicators point to PhysX being liked by developers.
The next several slides in NVIDIA’s presentation go into detail about how GPU accelerated PhysX is used in these titles and how poorly ATI performs when GPU accelerated PhysX is enabled (because ATI can’t run CUDA code on its GPUs, the GPU-friendly code must run on the CPU instead).
We normally hold manufacturers accountable for their performance claims; it was about time we did something about these other claims as well. Shall we?
Our goal was simple: we wanted to know if the GPU accelerated PhysX effects in these titles were useful, and if they were, whether they would be enough to make us pick an NVIDIA GPU over an ATI one even if the ATI GPU were faster.
To accomplish this I had to bring in an outsider. Someone who hadn’t been subjected to the same NVIDIA marketing that Derek and I had. I wanted someone impartial.
Meet Ben:
I met Ben in middle school and we’ve been friends ever since. He’s a gamer of the truest form. He generally just wants to come over to my office and game while I work. The relationship is rarely harmful; I have access to lots of hardware (both PC and console) and games, and he likes to play them. He plays while I work and isn't very distracting (except when he's hungry).
These past few weeks I’ve been far too busy for even Ben’s quiet gaming in the office. First there were SSDs, then GDC and then this article. But when I needed someone to play a bunch of games and tell me if he noticed GPU accelerated PhysX, Ben was the right guy for the job.
I grabbed a Dell Studio XPS I’d been working on for a while. It’s a good little system, the first sub-$1000 Core i7 machine in fact ($799 gets you a Core i7-920 and 3GB of memory). It performs similarly to my Core i7 testbeds so if you’re looking to jump on the i7 bandwagon but don’t feel like building a machine, the Dell is an alternative.
I also set up its bigger brother, the Studio XPS 435. Personally I prefer this machine; it's larger than the regular Studio XPS, albeit more expensive. The larger chassis makes working inside the case and upgrading the graphics card a bit more pleasant.
My machine of choice; I couldn't let Ben have the faster computer.
Both of these systems shipped with ATI graphics; obviously that wasn't going to work. I decided to pick midrange cards to work with: a GeForce GTS 250 and a GeForce GTX 260.
294 Comments
johnjames - Monday, May 18, 2009 - link
I don't get it, I started reading this review and decided to get a 4890, then I read the following reviews:
http://www.driverheaven.net/reviews.php?reviewid=7...
http://www.bit-tech.net/hardware/graphics/2009/04/...
http://www.bjorn3d.com/read.php?cID=1539&pageI...
http://www.dailytech.com/422009+Daily+Hardware+Rev...
http://www.guru3d.com/article/geforce-gtx-275-revi...
http://www.legitreviews.com/article/944/15/
http://www.overclockersclub.com/reviews/nvidia_3d_...
http://www.hardwarecanucks.com/forum/hardware-canu...
http://hothardware.com/Articles/NVIDIA-GeForce-GTX...
http://www.engadget.com/2009/04/02/nvidia-gtx-275-...
http://www.overclockersclub.com/reviews/nvidia_gtx...
http://www.pcper.com/article.php?aid=684&type=...
And they all state the GTX 275 gives a lot more fps in all games bar Grid.
genetix - Wednesday, September 23, 2009 - link
It's actually really funny you mention multiple sites, since it's pretty hard to find a site these days that reviews or previews without sponsors. Leaning to one side or the other is simple: just review the right games and voila, either card can win. Lol, looking at the ATI videos, those are so well selected. We are definitely getting back to the 80s, where games were made for a specific GPU rather than for all of them. The funny thing is that even our trusted benchmarks, like any Futuremark production, skew the results between GPUs. Their so-called ORB is pretty far from what the hardware is really capable of.
Asianman - Tuesday, June 16, 2009 - link
Most of those use either NV-biased games or most likely didn't upgrade the 4890's drivers. All reviews show that the 4890 loses its initial advantage at higher resolutions, but it is also now much cheaper. Take your pick; you'd get good value either way.
Patrick Wolf - Sunday, August 2, 2009 - link
Well, the 4890 isn't exactly kicking the 275's butt here. Let me break it down:
Age of Conan: 0-3 fps difference. It's a wash.
CoD: WaW: 275 is at or above 60 fps on all resolutions, beats 4890 at 2560. 275 wins.
Crysis Warhead: 0-2 fps difference. It's a wash.
Fallout 3: 4890 wins.
Far Cry 2: 0-2 fps difference. It's a wash.
Left 4 Dead: Again, 275 is at or above 60 fps on all resolutions, beats 4890 at 2560. 275 wins.
Grid: 4890 wins.
That's 2 for nvidia, 2 for ATI. And on COD, Crysis, Far Cry, and L4D the 4890 wins at 1680 and 1920, then at 2560 the 275 suddenly pulls ahead? That's supposed to make sense? Not to mention both drivers used were beta. And the 185.65 drivers have been pulled from nvidia's archives.
pinguw - Friday, April 17, 2009 - link
Yes, you said the ones getting the benefit are the end users, but I think you have short vision, because when things get cheaper it means we have more chance of getting lower quality products. For example, with the GTX 260 that I bought several months ago, I could see that the image was worse than on the 8800GTS I'd had for two years. At the beginning I thought it was a defect, so I exchanged it for another one from another brand and got the same result. So I say: instead of fighting on price, why don't they just make a better product? Lowering the price just makes our products worse and worse, like how most of the products sold in the US are now made in China... and then everybody complains that the products are bad, that they're poisoned, etc. What a joke; what do you expect when the price goes down? The answer is easy to get, right? So I would suggest you stop saying the ones getting the benefit are the users. What a brainless comment.
joeysfb - Tuesday, April 28, 2009 - link
Something is not right here. Are you linking the lowering of product quality to fierce competition??? That's why people read reviews and comments on Newegg, Amazon... to find out about the user experience before buying a desired product...
Almost everything is made in China now... like it or not.
8KCABrett - Thursday, April 16, 2009 - link
Those of us that buy the latest hardware to fly our flight sims have been pretty much left to using the outdated Tom's Hardware charts (which still show the 8800GTS being the fastest card around). I would love to know how the Q9650s and i7s are doing in FSX since the service packs, and it would be great to learn if the GTX 260/280s and now the refreshes are still slower than an 8800GTS in those sims... not to mention the abysmal performance of ATI cards! Has anyone found such a review anywhere?
joeysfb - Friday, April 17, 2009 - link
Just stick with the 8800GTS then (money saved)... besides, there aren't many sim titles these days.
BikeDude - Friday, April 17, 2009 - link
Stick with the 8800GTS? I do not think you realize the problem. A year ago, FSX ate all the hardware you could throw at it.
FSX is a very difficult animal to feed.
It loves fast CPUs, but it also needs a fast GPU. Unfortunately, as was pointed out, there exist few recent comparisons. It is not easy figuring out the correct hardware balance for FSX, since few reviews include it.
Comparing dozens of FPS games is pointless. They perform similarly. There are some small differences, but to evaluate a given card, you don't have to review that many games. FSX, however, poses some unique challenges and deserves some attention.
Oh... I'd also like to know which of these cards will play nicely with HD video.
8KCABrett - Tuesday, April 21, 2009 - link
Well, every now and then I like to have a little shooter fun, and the GTS is certainly lagging behind in the new titles.
I'm currently beta testing a new sim that really utilizes the GPU and is also nicely multi-threaded, which is great to see, but my 8800GTS limits me quite a lot. I decided it's time to update my system, and I really have nothing to guide me. Is ATI still really weak in sims? Have the GTX 280s gotten any better with the recent drivers? What about SP2 in FSX? I just don't have any source for this info, and I've looked everywhere for a legit one.
I've got a GTX 285 on the way and will just end up doing my own testing since that's apparently the only way to get the info.
There are hundreds of review sites out there posting these same four or five titles in their benchmarks, and not a single one includes any of the flight sims, even the new releases. I know sims are a niche market, but flight simmers are left to test for themselves, and they use what is perhaps one of the more demanding titles out there! My complaint isn't directed at Anandtech per se; I favor this site and have seen and appreciated the helpfulness of Gary Key time and again, especially over at the Abit forums. I just wish that Anandtech could apply its testing discipline to titles that really do need a legit place to evaluate them. It could really benefit many people who aren't catered to at all currently.
OK... back to lurking.