Tuesday, 09 November 2010 15:05

GeForce GTX 580 review is here - 7. Overclocking, Consumption, Thermals

Written by Sanjin Rados



Review: Fastest single GPU around, quiet too

Overclocking, Consumption and Thermals

As our results readily confirm, the GTX 580 packs plenty of OC potential. We managed to hit 855MHz for the GPU and 1190MHz for the memory (4760MHz effective), all without meddling with voltages or changing the fan speed from AUTO mode. Reference clocks for this card are 772MHz for the GPU and 4008MHz (effective) for the memory. We have heard that most GTX 580 cards can be clocked up to 890MHz with added voltage, but we will check this later on.
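For readers wondering how the "effective" memory figures relate to the base clocks: GDDR5 transfers four bits per clock per pin, so the effective rate is simply the memory clock times four. A quick sanity check (nothing Nvidia-specific, just the GDDR5 convention):

```python
# GDDR5 effective data rate = memory clock x 4 (quad data rate).
def effective_gddr5_mhz(mem_clock_mhz):
    return mem_clock_mhz * 4

print(effective_gddr5_mhz(1002))  # 4008 -> the reference spec
print(effective_gddr5_mhz(1190))  # 4760 -> our overclock
```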

The fan isn’t too loud during intensive gaming, and we can finally say that we’re pleased with the noise levels. Speeding up the fan didn’t contribute much extra overclocking headroom: MSI’s Afterburner v2.0 let us set the fan to a maximum of 85%, at which point we were able to push the GPU to 860MHz. The Afterburner beta does support changing voltages, and we plan to play with it later.
  
[Screenshots: GPU-Z at reference clocks and after overclocking]

Nvidia uses a new technology dubbed Advanced Power Management on the GTX 580. It monitors power consumption and performs power capping in order to protect the card from excessive power draw.

Dedicated hardware circuitry on the GTX 580 graphics card performs real-time monitoring of current and voltage. The graphics driver watches the power levels and will dynamically adjust performance in certain stress applications, such as FurMark and OCCT, if power levels exceed the card’s spec.

Power monitoring adjusts performance only if power specs are exceeded and the application is one of those Nvidia has defined in its driver, such as FurMark and OCCT. This should not significantly affect gaming performance; indeed, Nvidia claims that no game so far has managed to wake this mechanism from its slumber. For now, it is not possible to turn off power capping.

The GTX 580’s power caps are set close to the PCI Express spec for each 12V rail (6-pin, 8-pin, and the PCI Express slot). Once power capping goes active, chip clocks drop by 50%. This seems to be the reason the GTX 580 scores pretty badly compared to the GTX 480 in FurMark.
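The capping behaviour described above amounts to a simple rule: if the running application is on the driver’s watch list and measured power exceeds the spec, halve the clocks. A minimal sketch of that logic (all names and the 244W figure are illustrative; Nvidia’s actual driver code is not public):

```python
# Hypothetical sketch of GTX 580 power capping as described in the text.
# Only whitelisted stress apps are throttled, and only over the limit.

MONITORED_APPS = {"FurMark", "OCCT"}   # apps the driver is told to watch
POWER_LIMIT_W = 244                    # illustrative board power limit

def allowed_clock_mhz(base_clock_mhz, app_name, measured_power_w):
    """Return the GPU clock the driver would permit for this sample."""
    if app_name in MONITORED_APPS and measured_power_w > POWER_LIMIT_W:
        return base_clock_mhz * 0.5    # capping cuts clocks by 50%
    return base_clock_mhz

# A game drawing the same power is left alone; FurMark gets throttled.
print(allowed_clock_mhz(772, "Crysis", 260))   # 772
print(allowed_clock_mhz(772, "FurMark", 260))  # 386.0
```

Note that the check is per-application, which matches Nvidia’s claim that no game has triggered the mechanism so far.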



FurMark temperatures didn’t go over 76°C, which isn’t very realistic – in gaming tests we measured up to 85°C, which suggests Nvidia overdid the preventive measures.

[Screenshot: heat test with the new FurMark]

GPU-Z 0.4.7 doesn’t show the downclocking during FurMark, but the next version is set to change that. Below you see the GPU temperature graph we captured during the Aliens vs. Predator tests.
 
[Graph: GPU temperature during gaming]

After overclocking, temperatures were at the same level as before. The fan was a bit louder, but still not too loud.

[Screenshot: Afterburner overclock settings at 855MHz GPU]

The older FurMark, version 1.6, shows that the GPU can hit 90°C.

[Screenshot: heat test with the old FurMark]

Consumption is on par with the GTX 480. You’ll find the older FurMark test results below, because in the new FurMark tests our rig didn’t consume more than 367W. During gaming we measured rig consumption of about 450W.
[Chart: power consumption results]

(Page 7 of 8)
Last modified on Sunday, 12 December 2010 18:27

Comments  

 
+4 #1 hellfire 2010-11-09 15:23
they've done their homework. Its actually a Fermi we were waiting for in autumn of 2009. Still even a year later its kickin
 
 
+7 #2 nele 2010-11-09 15:32
Hey, this is pretty good for a card Fudo made up...
 
 
+2 #3 Wolfdale 2010-11-09 15:36
Quote:
Nvidia introduced a new cooling solution to reference high end graphics cooling


right..
its new to nvidia yes,
sapphire has been using it for several years,
its just that nvidia never wanted to believe that their coolers were NOT perfect for the card (hence the fermi fail)
 
 
+4 #4 nele 2010-11-09 15:42
Quoting Wolfdale:
Quote:
Nvidia introduced a new cooling solution to reference high end graphics cooling


right..
its new to nvidia yes,
sapphire has been using it for several years,
its just that nvidia never wanted to believe that their coolers were NOT perfect for the card (hence the fermi fail)


If you bothered to read the whole damn thing you'd realize we were talking about reference cards...
 
 
+8 #5 Jermelescu 2010-11-09 15:57
Indeed, nice job, even compared to 5970; but, before nvidia's fanboys get hyped up... when was 5970 released?
I'll wait until AMD releases 6970 & 6990 to compare benchmarks and prices.
Nevertheless, ladies and gentlemen: THE COMPETITION IS ON!
 
 
+15 #6 nele 2010-11-09 16:01
Speaking of competition...

The HD 5970 just got a massive price cut... €389, down from €500...

Capitalism at its best
 
 
+2 #7 Steve-O 2010-11-09 16:03
Quoting Jermelescu:
Indeed, nice job, even compared to 5970; but, before nvidia's fanboys get hyped up... when was 5970 released?
I'll wait until AMD releases 6970 & 6990 to compare benchmarks and prices.
Nevertheless, ladies and gentlemen: THE COMPETITION IS ON!


As well, one can buy a 5970 on newegg for 499 (not including 30$ MiR, but those are junk), and GTX 580's are running around 550-580$ (not including 10% off rebate).

Have yet to see any results that compared 6870's in crossfire yet though.
 
 
+4 #8 thomasg 2010-11-09 16:04
Yes, in the end the ones that win are the consumers!
 
 
+5 #9 NickThePrick 2010-11-09 16:10
Nvidia really pulled this one outta nowhere. Not a bad effort, cant wait for Cayman and a possible price war.
Its just bad that they needed a kicking from ATI before they got their act back together. Lets hope Jensen learned something out of it.
 
 
+7 #10 thematrix606 2010-11-09 16:14
It's faster, but too expensive. STILL not worth it.
 


 
