Thursday, 07 October 2010 10:56

Tegra 2 doesn't need more power than Tegra 1

Written by Fuad Abazovic

Still 150 to 200mW


It looks like Tegra 2 is a good CPU even for mobile phones, or "superphones" as Nvidia's Tegra general manager Mike Rayfield likes to call them.

He expects to see it in many phones in 2011, and the most important thing is that Tegra 2 apparently won't need much more power than the much older Tegra 1 chip.

Mike has confirmed that Tegra 2 should need some 150 to 200mW, which is roughly the same as Tegra 1. The reason is that Tegra 2 only ramps to its highest clock when absolutely necessary, and the second core won't be active at all times. Tegra 2 can run both cores at the maximum clock, but this state is held for the shortest possible time, only when peak performance is needed.

Once the heavy task, e.g. launching an application while multitasking, is over, the CPU will decrease its clock and may even turn the second core off completely in order to save battery life.
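The policy described above can be sketched as a toy governor: both cores run at the maximum clock only during a load burst, and under light load the chip drops to a low clock on a single core. All frequencies and thresholds here are illustrative assumptions, not real Tegra 2 figures.

```python
# Toy model of the DVFS + core-hotplug policy described in the article:
# ramp to max clock on both cores only under heavy load, then drop the
# clock and park the second core to save power. Numbers are made up.

FREQS_MHZ = [300, 600, 1000]  # hypothetical operating points

def governor_step(load_pct):
    """Return (freq_mhz, active_cores) for a given load percentage."""
    if load_pct > 80:           # burst, e.g. app launch: both cores, max clock
        return FREQS_MHZ[-1], 2
    if load_pct > 40:           # moderate load: single core, mid clock
        return FREQS_MHZ[1], 1
    return FREQS_MHZ[0], 1      # light load: single core, low clock

# A short load trace: an app-launch spike followed by near-idle playback.
trace = [95, 90, 50, 20, 10, 10]
states = [governor_step(load) for load in trace]
print(states)
```

Both cores sit at full clock only for the two burst samples; the rest of the trace runs on one core at a reduced clock, which is exactly why average power can stay close to a single-core chip.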

Tegra 2 is no exception, as you can expect similar behaviour from other Cortex-A9 adopters, but we would not be surprised if Tegra 2 handled this power management better. Just remember that Nvidia has been playing with this technology for at least 10 years with its notebook power saving techniques such as Optimus.

Whether you like it or not, dual-core CPUs will be part of your next-generation superphone, as soon as next year.
Last modified on Thursday, 07 October 2010 11:13

Comments  

 
-1 #1 fudzillalololo 2010-10-07 11:29
Well, nVidia said Fermi 480 is like 250W max, right? When it's actually 400+

http://www.nvidia.com/object/product_geforce_gtx_480_us.html
 
 
+22 #2 Alexko 2010-10-07 12:17
"Tegra 2 is not an exception, as you can expect similar performance from other ARM 9 adopters, but we would not be surprised to see that Tegra 2 could be able to do this power management better. Just remember that Nvidia has been playing with this technology for at least 10 years with its nobook power saving techniques such as Optimus."

Riiiiight, because Optimus is 10 years old and NVIDIA's competitors in the smartphone market have no experience at all when it comes to power-saving features. That must be why Tegra is doing so well!

Oh, wait…
 
 
+30 #3 dan 2010-10-07 12:46
Fudo this story is nonsense. Tegra needs 150-200 mW for WHAT exactly? Idling? Full load at max freq? You need to specify these kinds of details, otherwise it comes across as NV-PR sponsored bullshit.

Optimus is not a 10 years in the making technology. It is a fudge, wrapped up in pretty NV PR paper. Nothing more.
 
 
+19 #4 Regenweald 2010-10-07 13:09
how much do you get paid to put your name on these PR pieces Fudo ? I just don't believe you actually write this fiction.
 
 
+2 #5 eddman 2010-10-07 14:23
Quoting fudzillalololo:
Well, nVidia said Fermi 480 is like 250W max, right? When it's actually 400+

http://www.nvidia.com/object/product_geforce_gtx_480_us.html


yeah, sure. The only thing is that the 400+ you're talking about is for the whole system, retard. There is no way the card can go higher than 300w. 75w pci-e + 75w 6-pin + 150w 8-pin.

http://www.anandtech.com/show/2977/nvidia-s-geforce-gtx-480-and-gtx-470-6-months-late-was-it-worth-the-wait-/19
 
 
+4 #6 Nubstick 2010-10-07 16:04
Too bad they couldn't have used some of that amazing power saving knowledge on GF100.

Also, what are the power specs of Tegra 1? They are roughly the same, but specifically what are they?
 
 
+6 #7 fudo 2010-10-07 16:44
Optimus is just one example of power saving and Nvidia is working on power saving techniques since it has entered the mobile market, same as Amd, same as Intel.

150 - 200 mW to play video if I remember correctly

I also agree with you guys that Qualcomm, TI and other guys are not sleeping.
 
 
-2 #8 Alexko 2010-10-07 17:10
Quoting eddman:
yeah, sure. The only thing is that the 400+ you're talking about is for the whole system, retard. There is no way the card can go higher than 300w. 75w pci-e + 75w 6-pin + 150w 8-pin.


Actually, 75 + 75 + 150 is just the specification, the card can draw more than this, because it's possible to draw more than 75W from the PCI-E slot, and more than 75W/150W from 6/8-pin connectors. They're just not certified for it. And some samples have been measured around 320W.

Just take a GTX 480, overclock it to hell, and measure its power consumption under Furmark, you'll see that it's well over 300W. Naturally, you need a motherboard and a power supply that can handle that… And even then, there's no guarantee you won't fry them.
 
 
+4 #9 thevoid 2010-10-07 17:35
Quoting Alexko:
...And some samples have been measured around 320W.


Where do you get such info?

Quoting Alexko:
Just take a GTX 480, overclock it to hell, and measure its power consumption under Furmark, you'll see that it's well over 300W. Naturally, you need a motherboard and a power supply that can handle that… And even then, there's no guarantee you won't fry them.


The same could be done with a 5870, provided you dont hit VRM downclocking first. The 5970 is probably over it at stock, but thats a different matter.
 
 
+4 #10 Alexko 2010-10-07 19:16
Quoting thevoid:
Where do you get such info?


Here: http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_480_Fermi/30.html

Obviously, TPU got an especially power-hungry sample, most GTX 480s draw about 295~300W.

Quoting thevoid:
The same could be done with a 5870, provided you dont hit VRM downclocking first. The 5970 is probably over it at stock, but thats a different matter.


Over 300W with an HD 5870? You'd need some pretty insane overclocking with a serious voltage bump and LN2… but yeah, it should be possible. You'd probably need an overcloking-oriented design, though, e.g. MSI's Lightning. As you can see in TPU's review, they measured the HD 5970 at 304W, so yes, some samples may be slightly over 300W in Furmark.
 
