NVIDIA's GeForce 7800 GTX Hits The Ground Running
by Derek Wilson on June 22, 2005 9:00 AM EST - Posted in GPUs
Introduction
A vast expanse of destruction lies before you. Billowing blue smoke rises from the ashes of the destroyed city, and flames continue to lick towards the sky. The horizon shimmers from the heat waves and smoke emanating from the rubble. As you proceed into the wreckage, your boots splash through puddles, sending out ripples and churning up the ashes. One of the buildings appears to have escaped most of the force of the blast, so you head towards it, hoping to find some shelter and a place to relax for a moment.

A glint of light reflects off of the cracked windows, and you instinctively dive to the ground. A split second later, the glass shatters and fragments rain down around you as the bullet misses its intended mark. You roll to the side and watch as dirt and rubble plume into the air from the spot you so recently occupied. As you marvel at the small particles of dirt scattering into the air, you realize it's already too late; you're too far from cover and the sniper is skilled. As your body slams into the ground and the scene fades to black, you're glad to know that this was only a game, regardless of how lifelike it appears...
That's not a description of any actual game, but it could be in the very near future judging by the progress we continue to see on the graphics front. The attempt to bring such visions to life is reason enough for us to encourage and revere continued excellence in the field of computer graphics. The ongoing struggle between ATI and NVIDIA to bring forth the most parallel and powerful GPUs at reasonable prices opens new possibilities to developers, pushing them to create content beyond the realm of dreams and onto ground where angels fear to tread: reality. With each successive generation, we work our way closer and closer to blurring the line between reality and rendering, while every step leaves us wanting more. Once again, it is time to check in on our progress down the infinite road to graphical perfection.
Unlike the past few generations, the latest offering from NVIDIA does not bring a host of new features or support for an upgraded shader model version. The NV4x architecture remains a solid base for this product, as the entire DirectX 9 feature set was already fully supported in hardware. Though the G70 (yes, the name change was just to reconcile code and marketing names) is directly based on the NV4x architecture, there are quite a few changes to the internals of the pipelines, as well as an overall increase in the width and clock speed of the part. This update much resembles what we saw when ATI moved from R300 to R420, in that most of the features and block diagrams are the same as last year's part, with a few revisions here and there to improve efficiency.
One of the most impressive aspects of this launch is that the part is available now. I mean right now. Order it today and plug it in tomorrow. That's right, not only has NVIDIA gotten the part to vendors, but vendors have gotten their product all the way to retailers. This is unprecedented for any graphics hardware launch in recent memory. In the midst of all the recent paper launches in the computer hardware industry, this move is a challenge to all other hardware design houses.
ATI is particularly on the spot after today. Their recent history of announcing products that don't see any significant volume in the retail market for months is disruptive in and of itself. Now that NVIDIA has made this move, ATI absolutely must follow suit. Over the past year, the public has been getting quite tired of failed assurances that product will be available "next week". This very refreshing blast of availability is long overdue. ATI cannot afford to have R520 availability "soon" after launch; ATI must have products available for retail purchase at launch.
We do commend NVIDIA for getting product onto shelves before launching it. But now we move on to the least pleasant side of this launch: price. The GeForce 7800 GTX will cost a solid $600, and we expect retailers to charge early adopters a premium on top of that; prices we are seeing at launch are on the order of $650. This means that those who want to build an SLI system based on the GeForce 7800 GTX will be paying between $1200 and $1300 just for the graphics component of their system.
So, what exactly is bigger, better, and faster this time around? And more importantly, what does that mean for game performance and quality, and is it worth the price? This is the right place to find the answers. As developers continue to grow in shader prowess, we expect to see hardware of this generation stretch its legs even more; NVIDIA believes this is the point where pure math and shader processing power become the most important factors in graphics hardware.
127 Comments
multiblitz - Sunday, June 26, 2005 - link
It would be great if you could do a comparison between the 6800 and the 7800 in video/DVD playback quality, similar to the comparison between the X800 and the 6800 you did last year.

at80eighty - Saturday, June 25, 2005 - link
OMG! I've never seen so many bitching whiners come outta the woodwork like this!! You A-holes oughta remember that this site has been kept free
F
R
E
E
The editors owe YOU nothing. At all.
AT team - accidents happen. Keep up the great work!
#121: well said. Amazing how these turds don't realise that the knife cuts both ways...
mrdeez - Friday, June 24, 2005 - link
#124 You can stfu too...j/k..point taken.
I guess the real issue for me is that this card is a beast but ill never have it in my sli rig......i want all settings maxed at playable resolutions thats just me.........and i will not go back to crt...lol crt that was lame dude
Momental - Friday, June 24, 2005 - link
#122 The problem with your solution regarding "all of us just getting two 6800U's" works perfectly for those with an SLI-capable board, yes? Some of us, like myself, anticipated the next generation of GPUs like the 7800 series and opted to simply upgrade to one of those when the dust settled and prices slid back a bit.

Additionally, telling someone to "STFU" isn't necessary. We can't hold a conversation if we're all silent. Knowhuddamean, jellybean? Hand gestures don't work well over the internet, but here's one for you..........
SDA - Friday, June 24, 2005 - link
LCD gamers shouldn't be bothering with new graphics cards, they should get new monitors.

Kidding, I have nothing against LCDs. The real advantage of showing the card run at 2048x1536 is that it lets you see how well the card scales to more stressful scenarios. A card that suddenly gets swamped at higher resolutions will probably not fare well in future games that need more memory bandwidth.
On a side note, you can get a CRT that will run 2048x1536 @ a reasonable refresh for about $200 shipped (any Sony G520 variant, such as the Dell P1130). The only things that would actually be small in games are the 2D objects that have set pixel sizes; everything else looks beautiful.
mrdeez - Friday, June 24, 2005 - link
#121 lol ty for your insight....anyway like i said this card is not for lcd gamers as most have a 12x10 or 16x12.....so what purpose does this card have?? Answer me this, batman, and you have the group that should buy this card - otherwise, the rest of us should just get 2 6800u....this card is geared more for workstation graphics, not gaming.....unless you game on a hi def crt, and even then max res would be 1920 by 1080i..or something like that.....
SDA - Friday, June 24, 2005 - link
#116, if people in the comments thread are allowed to give their opinions, why shouldn't #114 give his too? Surely even an illiterate like you should realize that arguing that everyone is entitled to his or her own argument means that the person you're arguing with is too.

#119, some people have different requirements than others. Some just want no visible blur, others want the best contrast ratio and color reproduction they can get.
bob661 - Thursday, June 23, 2005 - link
#188 Oh yeah. The monitor goes up to 16x12.
bob661 - Thursday, June 23, 2005 - link
#118 I play BF2 on a Viewsonic VP201b (20.1") at work and it's very good. No streaking or ghosting. Video card is a 6800GT. I play at 1280x960.
Icehawk - Thursday, June 23, 2005 - link
Well, I for one think 1280x1024 is pretty valid as that is what a 19" LCD can do. I'd just want to see a maxed out 12x10 chart to see how it does - I know a 6800 can't do it for every game with full AA and AF. Otherwise I agree - a 12x10 with no options isn't going to show much with current games.

See, I'm considering swapping my two 21" CRTs for two 19" LCDs - and they won't do more than 12x10. I'd love to do two 20-21" LCDs but the cost is too high and fast panels aren't to be found. 19" is the sweet spot right now IMO - perhaps I'm wrong?
Thanks AT for a nice article - accidents happen.