Monday, 27 June 2011 09:30

Haswell is bad news for Nvidia

Written by Nick Farrell

Intel is going for it
Intel's Haswell appears to be targeting Nvidia when it hits the shops in 2013.

A technical document posted on an Intel software blog drops hints about what the "Haswell" chip will do. We already know that the mobile version of Haswell will be Intel's first system-on-a-chip designed for the laptop market.

Under Intel's cunning plan, by 2013, if the world has not been eaten by a mutant star goat, the laptop market will likely consist of "Ultrabooks": ultraslim, ultralight laptops and hybrid designs such as the Asus UX21 and Apple's MacBook Air. Haswell-based Ultrabooks will sell for about $599 and will not require graphics silicon from companies like Nvidia.

The post on the Intel Software Network blog, under the catchy headline "Haswell New Instruction Descriptions Now Available!", gives an overview of "a full specification for Haswell". It says the chip will ship with Intel's Advanced Vector Extensions, or AVX, which address the continued need for "vector floating-point performance in mainstream scientific and engineering numerical applications, visual processing, recognition, data-mining/synthesis, gaming, physics, cryptography and other areas of applications".

Intel AVX uses "advanced thread parallelism, and data vector lengths" to do all that. In plain English, Intel will get a lot better at handling the kinds of tasks that Nvidia and Advanced Micro Devices target today with their graphics silicon. While there is an element of "well, it would say that, wouldn't it?", we have been waiting a while for Intel to move on the graphics market, and it looks like this will happen with Haswell.



Comments  

 
+24 #1 loadwick 2011-06-27 10:14
nVidia will lose a lot of money from entry-level graphics, as this market is simply dead now, but CPU graphics will NEVER rival discrete graphics cards in the mid to high-end range, for one simple reason: you can't cool the chips well enough.

The average CPU is around 100 watts, maybe 150 watts overclocked, and you have seen the size of the air coolers we need just to keep them under control. Try to add in another 300-watt high-end GPU and it just isn't going to happen.
 
 
-5 #2 fingerbob69 2011-06-27 11:00
AMD may want to start worrying that Intel might be able to do an APU with graphics to match AMD's ...but not for quite some time yet.

nVidia on the other hand ....are toast.
 
 
0 #3 Kryojenix 2011-06-27 11:55
Whoa! What happened to 'Fick Narrel's' comment?! Is it that there are separate comment systems for logged in members and non-members?! That is pretty cool!
 
 
+23 #4 Memristor 2011-06-27 12:10
Well, looks like AMD is still about one year ahead of Intel when it comes to integrated GPU designs. The next release of Fusion will use the GPU as a fully pre-emptable co-processor, and more importantly AMD is also supplying its software to developers to really take advantage of the GPU integration, with its FSAIL compiler support for C++, Java, and .Net.
 
 
-15 #5 STRESS 2011-06-27 12:35
Two more generations and nVidia is irrelevant. The only ones who haven't realized it yet are Fuad and JJH
 
 
+9 #6 JEskandari 2011-06-27 13:02
Quoting STRESS:
Two more generations and nVidia is irrelevant. The only ones who haven't realized it yet are Fuad and JJH


Two more generations and Nvidia will have a chip with 100 times the graphics power of Tegra 2, and I doubt Intel can match that any time soon, even with access to Nvidia IP for the next 5 years. Also, in two generations NVIDIA will have Project Denver to enter Intel's traditional backyard.

See, it's easy to talk about the future and about probabilities and ifs. Do you think NVIDIA right now is in a worse situation than when they released those failed FX series cards?
 
 
-9 #7 pogsnet 2011-06-27 14:30
@JEskandari

You think ARM can outmatch x86/x64 chips?

Haha, look at their tiny chip compared to Intel's and AMD's. The size also reflects the complexity and capability of the chip. ARM was designed for simple tasks, while x86 chips are designed to do as much as possible.

You might be dreaming boy.
 
 
+4 #8 Jurassic1024 2011-06-27 16:06
ffs enough with the ARM and nVIDIA vs Intel and AMD crap.

You're comparing a hot dog to a hamburger!
 
 
+4 #9 magius 2011-06-27 18:03
Quoting pogsnet:
@JEskandari

You think ARM can outmatch x86/x64 chips?

You might be dreaming boy.


Little kid, please don't interrupt while the grownups talk.

Where chips are concerned intended use and efficiency are king.

Going by your logic Intel would have taken over the mobile market by now but they have not, have they?

As for Intel's predictions, they might make a mean CPU (only thanks to their Israeli team's knowledge), but on the graphics side of things they are only marginally good.

Of course they excel at trash talking, but we have seen that in the past.

Just please wake me up when that actual product gets here and then we can test the part.

*YAWN*
 
 
+6 #10 rickster 2011-06-27 23:12
Quoting pogsnet:
@JEskandari

You think ARM can outmatch x86/x64 chips?

Haha look at their tiny chip compare to Intel and AMD. The size also symbolizes complexity and ability of the chip. ARM was designed for simple tasks while x86 chips are designed to do everything as possible.

You might be dreaming boy.


And people thought that tablets would never work.

Never say never. In a world where efficiency is becoming increasingly one of the more important factors, ARM devices are poised to change the market.

It's hard not to have noticed all the AMD fanboys screaming about how much more efficient their products are than their rivals' lol...
 

