Final Words
On a final note, we’ll end with a quick look at Supersonic Sled, NVIDIA’s big “kitchen sink” demo for GF100. Supersonic Sled is a comically-themed simulation of a sled with a rocket attached (or perhaps the other way around), based on some real 1950s US Air Force tests. It uses tessellation, DirectCompute, PhysX – every new thing NVIDIA could throw into a demo and still have it run. We had a chance to see this in action on a 3D Vision Surround setup at CES, and we have to give NVIDIA’s demo team credit here: they rarely disappoint.
NVIDIA did give us a small (7MB) recording of it in action that we’ve posted here, in case you haven’t had a chance to see any of the recordings from the CES showfloor.
With that out of the way, there’s only so much we can say about NVIDIA’s new architecture without having the hardware on-hand for testing. NVIDIA certainly built a GPU compute monster in GF100, and based on what we now know about its graphics abilities, it looks like it’s an equally capable GPU gaming monster.
But the big question is just how much of a monster it will be, and what kind of monster price tag it will come with. Let’s make no mistake: at 3 billion transistors, GF100 is going to be big, and from NVIDIA’s hints it’s probably going to be the hottest single-GPU card we’ve seen yet. Barring any glaring flaws, NVIDIA has what looks to be a solid design, but at the end of the day it almost always boils down to “how fast?” and “how much?”
NVIDIA has taken a series of big risks on GF100: first with its compute abilities for GPGPU use, then with its geometry abilities for gaming, and now with time. Being 6 months late has hurt NVIDIA, and it has hurt consumers through uncompetitive pricing from AMD. By no means is the situation dire, but we can quickly come up with scenarios where it would be if NVIDIA can’t convincingly beat AMD in gaming performance.
NVIDIA has shown their cards, and they’re all in. Now in the next couple of months we’ll see if they’re bluffing or if they really have what it takes to win. Stay tuned.
115 Comments
DanNeely - Monday, January 18, 2010 - link
For the benefit of myself and everyone else who doesn't follow gaming politics closely, what is "the infamous Batman: Arkham Asylum anti-aliasing situation"?

sc3252 - Monday, January 18, 2010 - link
Nvidia helped get AA working in Batman, which also works on ATI cards. If the game detects anything besides an Nvidia card, it disables AA. The reason some people are angry is that when ATI helps out with games it doesn't limit who can use the feature — at least that's what they (AMD) claim.

san1s - Monday, January 18, 2010 - link
The problem was that Nvidia did not do QA testing on ATI hardware.

Meghan54 - Monday, January 18, 2010 - link
And nvidia shouldn't have, since nvidia didn't develop the game. On the other hand, you can be quite certain that the devs did run the game on ATI hardware, but locked out the "preferred" AA design because of the money nvidia invested in the game.
And that can be plainly seen by the fact that when the game is "hacked" to trick it into seeing an nvidia card installed despite an ATI card being used, AA works flawlessly — and the ATI cards end up faster than current nvidia cards. The game is exposed for what it is: purposely crippled to favor one brand of video card over another.
But the nvidiots seem not to mind this at all. Yet this is akin to Intel writing their compiler to make AMD CPUs run slower or worse on programs compiled with the Intel compiler.
Read about the debacle Intel's now suffering through; the outrage is fairly universal. Now, you'd think nvidia would suffer the same nearly universal outrage for intentionally crippling a game's function to favor one brand of card over another, yet nvidiots make apologies and say "ATI cards weren't tested." I'd like to see that as a fact instead of conjecture.
So, one company cripples the function of another company's product and the world's up in arms, screaming "Monopolistic tactics!!!" and "Fine them to hell and back!"; another company does essentially the same thing and it gets a pass.
Talk about bias.
Stas - Tuesday, January 19, 2010 - link
If nV continues like this, it will turn around on them. It took MANY years for the market guards to finally say, "Intel, quit your sh*t!" and actually do something about it. Don't expect immediate retaliation in a multibillion-dollar worldwide industry.

san1s - Monday, January 18, 2010 - link
"yet nvidiots make apologies and say "Ati cards weren't tested." I'd like to see that as a fact instead of conjecture."

Here you go:
http://www.legitreviews.com/news/6570/
"On the other hand, you can be quite certain that the devs. did run the game on Ati hardware but only lock out the "preferred" AA design because of nvidia's money nvidia invested in the game. "
Proof? That looks like conjecture to me. Nvidia says otherwise.
AMD doesn't deny it either.
http://www.bit-tech.net/bits/interviews/2010/01/06...
they just don't like it
And please refrain from calling people names such as "nvidiot," it doesn't help portray your image as unbiased.
MadMan007 - Monday, January 18, 2010 - link
Oh for gosh sakes, this is the 'launch' and we can't even have a paper launch where at least reviewers get hardware? This is just more details for the same crap that was 'announced' when the 5800s came out. Poor show NV, poor show.

bigboxes - Monday, January 18, 2010 - link
This is as close to a paper launch as I've seen in a while, except that there is not even an unattainable card. Gawd, they are gonna drag this out a lonnnnngg time. Better start saving up for that 1500W PSU!

Adul - Monday, January 18, 2010 - link
I suppose this is a vaporlaunch then.