Wednesday, 24 October 2007 12:12

nForce 750i also gets N200 treatment

Written by test


But gets less bandwidth


It looks as if Nvidia is relying heavily on its N200 chip to patch up its current chipsets and give them PCI Express 2.0 support. It might be a strategic move, as it's cheaper to add a PCI Express controller than to design a new chipset, but it's not a good solution.

The 750i still uses the C55 SLI X8 and MCP51 combination, but with the addition of the N200 the board gets PCI Express 2.0 added to its feature set. However, it will be limited to the same two x8 slots as the 650i chipset.
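To put those slot widths in perspective, here is a rough back-of-the-envelope sketch (not from the article) of per-direction PCI Express bandwidth for an x8 link, assuming 8b/10b line encoding and ignoring protocol overhead. The numbers show what the generational jump is worth on paper: an x8 PCIe 2.0 link carries roughly as much as an x16 PCIe 1.x link.

```python
# Back-of-the-envelope PCI Express bandwidth sketch (assumption: 8b/10b
# line encoding on both generations, protocol overhead ignored).

GT_PER_LANE = {"PCIe 1.x": 2.5, "PCIe 2.0": 5.0}  # gigatransfers/s per lane

def link_bandwidth_gbs(gen: str, lanes: int) -> float:
    """Approximate usable bandwidth in GB/s for one direction of the link."""
    raw_gbit = GT_PER_LANE[gen] * lanes   # raw line rate, Gbit/s
    payload_gbit = raw_gbit * 8 / 10      # 8b/10b encoding costs 20 percent
    return payload_gbit / 8               # bits -> bytes

for gen in GT_PER_LANE:
    print(f"{gen} x8 : {link_bandwidth_gbs(gen, 8):.1f} GB/s per direction")
# PCIe 1.x x8 : 2.0 GB/s per direction
# PCIe 2.0 x8 : 4.0 GB/s per direction
```

These are theoretical link rates only; what boards actually deliver also depends on how the N200 is hooked up to the rest of the chipset.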

Boards based on the 750i chipset will also support up to six x1 slots or devices and 800MHz DDR2 memory, and this time the chipset will apparently support SLI memory as well.

We're curious why chipset manufacturers keep imposing these artificial limitations just so they can offer a wider range of chipsets. After all, why would anyone willingly settle for less graphics bandwidth than the hardware could provide?
Last modified on Wednesday, 24 October 2007 20:36