Overclocking Extravaganza: Radeon HD 4890 To The Max
by Derek Wilson on April 29, 2009 12:01 AM EST - Posted in GPUs
Combined Memory and Core Overclocking: The Sweet Spot
In this round of tests, we combine our previous maximum overclocks. This is a compromise: we show the maximum potential of combined core and memory overclocking rather than the effect of memory overclocking at each core clock speed we tested. While the latter would be more complete, our tests show enough for readers to find the sweet spot themselves.
We theorized that, at an extreme core clock speed, memory might become a performance bottleneck at some point. Even though increasing the memory clock without increasing the core clock didn't do much on its own, we could see a benefit beyond what our initial memory overclocking results would suggest.
Previously, we looked at a varied memory clock with a stock core clock and a varied core clock with a stock memory clock. Let's revisit both of those, but with a twist: we will also look at the percent increase in performance when overclocking memory with a 1GHz core clock, and the percent increase in performance when overclocking the core with a 1.2GHz memory clock.
[Interactive performance graphs at 1680x1050, 1920x1200, and 2560x1600: percent performance increase from memory overclocking at a 1GHz core clock and from core overclocking at a 1.2GHz memory clock]
Note that in both cases we see a much bigger boost in performance. This means that while applications tend to be very heavily compute limited, at higher core clock speeds on AMD hardware memory bandwidth increasingly becomes a bottleneck. Now let's take a look at what we get when going from a completely stock part to a maximally overclocked part at 1GHz/1.2GHz (core/mem).
[Interactive performance graphs at 1680x1050, 1920x1200, and 2560x1600: percent gain going from stock clocks to 1GHz/1.2GHz (core/mem)]
To get a basic idea of what's going on, here's an example with two hypothetical programs. Remember that this isn't real-world data; it's just to illustrate the concept.
The first application is completely compute bound, and the second is 50% compute bound and 50% memory bandwidth bound. Both generate 100 frames per second on a stock Radeon HD 4890. If we increase core clock speed 10%, the first application will generate 110 frames per second, while the second would only generate 105, because we only see the 10% benefit on half of the work. If we instead boost only memory performance 10%, the first program still delivers 100 fps while the second hits 105 again. Pushing both memory and core clock speed up 10% gives us 110 frames per second from both applications.
Real applications aren't that cleanly divided and don't scale that neatly, but the important thing to remember is that different applications make varying use of different resources, and balancing those resources is important to getting the best performance in the most efficient package.
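To make the arithmetic in that example concrete, here is a minimal sketch of the frame-time model behind those numbers. This is our own illustration rather than anything from a real benchmark: it simply assumes that only the fraction of the frame bound by a given resource speeds up when that resource's clock goes up.

```python
# Toy model: frame time is split between a compute-bound portion and a
# memory-bandwidth-bound portion. Raising a clock only shrinks the portion
# of the frame that depends on it.

def fps_after_overclock(base_fps, compute_fraction, core_boost, mem_boost):
    """Estimate FPS after scaling core and memory clocks.

    base_fps         -- frame rate at stock clocks
    compute_fraction -- share of frame time that is compute bound
    core_boost       -- core clock multiplier, e.g. 1.10 for +10%
    mem_boost        -- memory clock multiplier, e.g. 1.10 for +10%
    """
    mem_fraction = 1.0 - compute_fraction
    base_frame_time = 1.0 / base_fps
    new_frame_time = base_frame_time * (compute_fraction / core_boost +
                                        mem_fraction / mem_boost)
    return 1.0 / new_frame_time

# App 1 is fully compute bound; App 2 is half compute, half bandwidth bound.
for name, comp in [("compute-bound app", 1.0), ("50/50 app", 0.5)]:
    print(name,
          "core +10%%: %.0f fps" % fps_after_overclock(100, comp, 1.10, 1.00),
          "mem +10%%: %.0f fps" % fps_after_overclock(100, comp, 1.00, 1.10),
          "both +10%%: %.0f fps" % fps_after_overclock(100, comp, 1.10, 1.10))
```

Running it reproduces the numbers above: 110/100/110 fps for the compute-bound app and 105/105/110 fps for the 50/50 app.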
So, to find the sweet spot for your overclock, first increase the core clock speed as much as you can. Then bump up the memory clock and see how high you can take it while remaining stable. Use a real-world application to test performance at each point, and use a binary-search-like algorithm to find the sweet spot in a small number of tests. And there you have it. We didn't do this for you, but what's better practice than a little hands-on experience, right? Besides, it gives readers the opportunity to compare notes in the comments on what the optimal memory clock for a 1GHz core clock on the 4890 would be. Have fun!
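For readers who want a concrete picture of that search, here is a rough sketch of such a binary-search tuning loop. Everything in it is hypothetical: `run_benchmark` and `is_stable` stand in for whatever real-world game benchmark and stability test you choose, and they are assumed to set the memory clock (for example, through your overclocking tool) before measuring.

```python
# Sketch of a binary-search-style sweet-spot hunt over the memory clock.
# run_benchmark(mhz) returns average FPS at that memory clock;
# is_stable(mhz) returns True if the card survives a stress test there.
# Both are assumed to apply the clock before testing.

def find_memory_sweet_spot(low_mhz, high_mhz, tolerance_mhz,
                           run_benchmark, is_stable):
    """Return the highest memory clock that is stable and still improves FPS."""
    best_clock, best_fps = low_mhz, run_benchmark(low_mhz)
    while high_mhz - low_mhz > tolerance_mhz:
        mid = (low_mhz + high_mhz) // 2
        if is_stable(mid):
            fps = run_benchmark(mid)
            if fps > best_fps:
                best_clock, best_fps = mid, fps
            low_mhz = mid        # stable: try higher clocks
        else:
            high_mhz = mid       # unstable: back off
    return best_clock
```

With a 25MHz tolerance, a 975-1300MHz memory range is covered in about four benchmark runs instead of a dozen, which is the whole point of searching this way.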
61 Comments
PC Reviewer - Monday, June 22, 2009 - link
I can vouch for this card... http://pcreviewer.org/new-radeon-hd-4890-video-car...
I prefer the XFX.. but either way, any single one of those cards is outstanding...
fausto412 - Tuesday, May 5, 2009 - link
All this talk of tuning video cards for max flexibility and performance brings to mind a great idea: why not have a write-up on all the neat things you can do with Riva Tuner? I only know how to do 2 things: set up overclocking and fan profiles, but I know there is more neat stuff in there. And can you undervolt an nvidia card with software? How?
ValiumMm - Tuesday, May 5, 2009 - link
Sapphire and Powercolor have both announced a 1GHz GPU for the 4890. If this is what they got, I just thought you guys would have got higher since you're seeing the max OC.
gold333 - Monday, May 4, 2009 - link
http://www.hexus.net/content/item.php?item=18232&a...
SiliconDoc - Saturday, June 6, 2009 - link
Gee that's funny, the GTX275 wins against the 4890 in nearly every single benchmark - or every one completely. http://www.hexus.net/content/item.php?item=18232&a...
--
Gee imagine that - I guess Derek wasn't red roostering the testing with a special manufacturer edition sent exclusively to him from ati - and a pathetic 703 core nvidia.
--
Wow.
It's amazing what passes HERE for "a performance comparison".
gold333 - Monday, May 4, 2009 - link
Overclocking Review: HD 4890
http://www.guru3d.com/article/overclocking-the-rad...
Overclocking Review GTX 275
http://www.guru3d.com/article/geforce-gtx-275-over...
Both are on the identical Core i7 system.
random2 - Monday, May 4, 2009 - link
Great article Derek. Thanks a ton for the not-small effort made to put this together.
What I found very interesting, (besides the overclockability of the 4890) was just how close the 4890 is to the 285 in resolutions less than 30" monitor size. Close enough to be within the realm of "margin of error".
This is all good to know as I have a 24" monitor I cannot see giving up for a few years yet:-)
Thanks again. Oh, by the way....Those who can do...Those who cannot....criticize.
rgallant - Thursday, April 30, 2009 - link
- in every game? When did this happen?
frozentundra123456 - Thursday, April 30, 2009 - link
Just 2 topics:
1. How would the 4890 compare in price and performance to the 4870x2 or 4850x2? Would these cards give similar performance at a lower price?
2. When is DX11 coming, how will it be implemented, and will it be any more efficient hardware-wise than DX10? Even now, most games take a serious performance hit with DX10. Will DX11 require even better hardware? If so, I will either have to do a serious upgrade or run two-generation-old DX9. I have played Company of Heroes and World in Conflict, and I ran both in DX9 mode. The games looked fine and performance was so much better. DX10 to me has been a big disappointment in that it is so resource intensive without much visual improvement.
Captain828 - Thursday, April 30, 2009 - link
I have to say, this is probably the most worked out OC article I've ever seen... and I've seen a lot of them. Now I understand OC-ing the 4890 to show what a terrific overclocker it is, but it's just not fair to do this if you don't OC the competition's GPU in the same price range (the GTX 275).
Also, I failed to see you mention dB ratings in the article. No one wants a goddamn leaf blower in their PC for usual gaming purposes.
Again, I know it took a lot of time and effort to get this done, but I would have gladly waited to see a GTX 275 OC comparison.
Regards,
Captain828