ATI Radeon HD 4890 vs. NVIDIA GeForce GTX 275
by Anand Lal Shimpi & Derek Wilson on April 2, 2009 12:00 AM EST - Posted in
- GPUs
Mirror’s Edge: Do we have a winner?
And now we get to the final test. Something truly different: Mirror’s Edge.
This is an EA game. Ben had to leave before we got to this part of the test (he does have a wife and kid, after all), so I went at this one alone.
I’d never played Mirror’s Edge. I’d seen the videos, and it looked interesting. You play as Faith, a runner: you sprint across rooftops and through buildings, all very parkour-like, often pursued by “blues” (police officers) as you run through the game. I won’t give away any plot details here, but this game, I liked.
The GPU accelerated PhysX impacted things like how glass shatters and the presence of destructible cloth. We posted a video of what the game looks like with NVIDIA GPU accelerated PhysX enabled late last year:
"Here is the side by side video showing better what DICE has added to Mirror's Edge for the PC with PhysX. Please note that the makers of the video (not us) slowed down the game during some effects to better show them off. The slow downs are not performance related issues. Also, the video is best viewed in full screen mode (the button in the bottom right corner)."
In Derek’s blog about the game he said the following:
“We still want to really get our hands on the game to see if it feels worth it, but from this video, we can at least say that there is more positive visual impact in Mirror's Edge than any major title that has used PhysX to date. NVIDIA is really trying to get developers to build something compelling out of PhysX, and Mirror's Edge has potential. We are anxious to see if the follow through is there.”
Well, we have had our hands on the game and I’ve played it quite a bit. I started with PhysX enabled. I was looking for the SSD effect: play with it on, then take it away and see if I missed it. I played through the first couple of chapters with PhysX enabled, fell in lust with the game, and then turned off PhysX.
I missed it.
I actually missed it. What did it for me was the way the glass shattered. When I was being pursued by blues and they were firing at me as I ran through a hallway full of windows, the hardware accelerated PhysX version was more believable. I felt more like I was in a movie than in a video game. Don’t get me wrong, it wasn’t hyper realistic, but the effect was noticeable.
I replayed a couple of chapters, and played some new ones, with PhysX disabled before turning it back on and repeating the test.
The impact of GPU accelerated PhysX was noticeable. EA had done it right.
The Verdict?
So am I sold? Would I gladly choose a slower NVIDIA part because of PhysX support? Of course not.
The reason why I enjoyed GPU accelerated PhysX in Mirror’s Edge was because it’s a good game to begin with. The implementation is subtle, but it augments an already visually interesting title. It makes the gameplay experience slightly more engrossing.
It’s a nice bonus if I already own an NVIDIA GPU, but it’s not a reason for buying one.
The fact of the matter is that Mirror’s Edge should be the bare minimum requirement for GPU accelerated PhysX in games. The game has to be good to begin with and the effects should be the cherry on top. Crappy titles and gimmicky physics aren’t going to convince anyone. Aggressive marketing on top of that is merely going to push people like us to call GPU accelerated PhysX out for what it is. I can’t even call the overall implementations I’ve seen in games half baked, the oven isn’t even preheated yet. Mirror’s Edge so far is an outlier. You can pick a string of cheese off of a casserole and like it, but without some serious time in the oven it’s not going to be a good meal.
Then there’s the OpenCL argument. NVIDIA won’t port PhysX to OpenCL, at least not anytime soon. But Havok is being ported to OpenCL, which means that by the end of this year any game using OpenCL Havok could get GPU accelerated physics on any OpenCL compliant video card (NVIDIA, ATI, and Intel once Larrabee arrives).
While I do believe that NVIDIA and EA were on to something with the implementation of PhysX in Mirror’s Edge, I do not believe NVIDIA is strong enough to drive the entire market on its own. Cross platform APIs like OpenCL will be the future of GPU accelerated physics; they have to be, simply because NVIDIA isn’t the only game in town. The majority of PhysX titles aren’t even GPU accelerated, and I suspect it won’t take long for OpenCL accelerated Havok titles to equal that number once the port is ready.
Until we get a standard for GPU accelerated physics that all GPU vendors can use or until NVIDIA can somehow convince every major game developer to include compelling features that will only be accelerated on NVIDIA hardware, hardware PhysX will be nothing more than fancy lettering on a cake.
You wanted us to look at PhysX in a review of an ATI GPU, and there you have it.
294 Comments
johnjames - Monday, May 18, 2009 - link
I don't get it. I started reading this review and decided to get a 4890, then I read the following reviews:
[url]http://www.driverheaven.net/reviews.php?reviewid=7...[/url]
[url]http://www.bit-tech.net/hardware/graphics/2009/04/...[/url]
[url]http://www.bjorn3d.com/read.php?cID=1539&pageI...[/url]
[url]http://www.dailytech.com/422009+Daily+Hardware+Rev...[/url]
[url]http://www.guru3d.com/article/geforce-gtx-275-revi...[/url]
[url]http://www.legitreviews.com/article/944/15/[/url]
[url]http://www.overclockersclub.com/reviews/nvidia_3d_...[/url]
[url]http://www.hardwarecanucks.com/forum/hardware-canu...[/url]
[url]http://hothardware.com/Articles/NVIDIA-GeForce-GTX...[/url]
[url]http://www.engadget.com/2009/04/02/nvidia-gtx-275-...[/url]
[url]http://www.overclockersclub.com/reviews/nvidia_gtx...[/url]
[url]http://www.pcper.com/article.php?aid=684&type=...[/url]
And they all state the GTX 275 gives a lot more fps in all games bar Grid.
genetix - Wednesday, September 23, 2009 - link
It's actually really funny that you mention multiple sites, since it's pretty hard to find a site these days that reviews or previews without sponsors. Leaning one way or the other is simple: just benchmark the right games and voila, either card can win. Look at how carefully selected ATI's videos are. We're definitely getting back to the '80s, when games were made for one GPU, not for all of them. The funny thing is that even our trusted benchmarks, like any Futuremark production, skew the GPU results; their so-called ORB is pretty far from what the hardware is really capable of.
Asianman - Tuesday, June 16, 2009 - link
Most of those reviews use either NVIDIA-biased games or, most likely, didn't update the 4890's drivers. All the reviews show the 4890 losing its initial advantage at higher resolutions, but it's also now much cheaper. Take your pick; you'd get good value either way.
Patrick Wolf - Sunday, August 2, 2009 - link
Well, the 4890 isn't exactly kicking the 275's butt here. Let me break it down:
Age of Conan: 0-3 fps difference. It's a wash.
CoD: WaW: 275 is at or above 60 fps on all resolutions, beats 4890 at 2560. 275 wins.
Crysis Warhead: 0-2 fps difference. It's a wash.
Fallout 3: 4890 wins.
Far Cry 2: 0-2 fps difference. It's a wash.
Left 4 Dead: Again, 275 is at or above 60 fps on all resolutions, beats 4890 at 2560. 275 wins.
Grid: 4890 wins.
That's 2 for NVIDIA, 2 for ATI. And in CoD, Crysis, Far Cry, and L4D the 4890 wins at 1680 and 1920, then at 2560 the 275 suddenly pulls ahead? That's supposed to make sense? Not to mention both drivers used were betas, and the 185.65 drivers have since been pulled from NVIDIA's archives.
pinguw - Friday, April 17, 2009 - link
Yes, you said the ones getting the benefit are the end users, but I think that's short-sighted, because when things get cheaper we have a greater chance of getting lower quality products. For example, on the GTX 260 I bought several months ago, the image quality was worse than on the 8800GTS I'd had for two years. At first I thought it was a defect, so I exchanged it for another card from another brand and got the same result. So instead of fighting over price, why don't they just make a better product? Lowering prices will just make our products worse and worse, like most products sold in the US now being made in China, and then everybody complains that the products are bad, poisoned, etc. What a joke; what do you expect when the price goes down? The answer is easy to see, right? So I'd suggest you stop saying the users are the ones who benefit. What a brainless comment.
joeysfb - Tuesday, April 28, 2009 - link
Something is not right here. Are you linking lower product quality to fierce competition? That's why people read reviews and comments on Newegg, Amazon, etc.: to learn about the user experience before buying a desired product.
Almost everything is made in China now, like it or not.
8KCABrett - Thursday, April 16, 2009 - link
Those of us who buy the latest hardware to fly our flight sims have been pretty much left with the outdated Tom's Hardware charts (which still show the 8800GTS as the fastest card around). I would love to know how the Q9650s and i7s are doing in FSX since the service packs, and whether the GTX 260/280s, and now the refreshes, are still slower than an 8800GTS in those sims... not to mention the abysmal performance of ATI cards! Has anyone found such a review anywhere?
joeysfb - Friday, April 17, 2009 - link
Just stick with the 8800GTS then (money saved)... besides, there aren't many sim titles these days.
BikeDude - Friday, April 17, 2009 - link
Stick with the 8800GTS? I do not think you realize the problem. A year ago, FSX ate all the hardware you could throw at it.
FSX is a very difficult animal to feed.
It loves fast CPUs, but it also needs a fast GPU. Unfortunately, as was pointed out, few recent comparisons exist. It is not easy figuring out the correct hardware balance for FSX when so few reviews include it.
Comparing dozens of FPS games is pointless; they all perform similarly. There are some small differences, but you don't have to review that many games to evaluate a given card. FSX, however, poses some unique challenges and deserves some attention.
Oh... I'd also like to know which of these cards will play nicely with HD video.
8KCABrett - Tuesday, April 21, 2009 - link
Well, every now and then I like to have a little shooter fun, and the GTS is certainly lagging behind in the new titles. I'm currently beta testing a new sim that really utilizes the GPU, which is nice to see, and it's also nicely multi-threaded, but my 8800GTS limits me quite a lot. I decided it's time to update my system, and I really have nothing to guide me. Is ATI still really weak in sims? Have the GTX 280s gotten any better with the recent drivers? What about SP2 in FSX? I just don't have any source for this info, and I've looked everywhere for a legit one.
I've got a GTX 285 on the way and will just end up doing my own testing since that's apparently the only way to get the info.
There are hundreds of review sites out there posting these same four or five titles in their benchmarks, and not a single one includes any of the flight sims, even the new releases. I know sims are a niche market, but flight simmers are left to test for themselves, and they use what is perhaps one of the most demanding titles out there! My complaint isn't directed at AnandTech per se; I favor this site and have seen and appreciated the helpfulness of Gary Key time and again, especially over at the Abit forums. I just wish AnandTech would apply its testing discipline to titles that really do need a legit place to evaluate them. It could be a real benefit to many people who aren't catered to at all currently.
OK. . .back to lurking.