Getting Technical with Next Generation ION

Obviously, there are some changes to the core ION design, since the previous version was a chipset with integrated graphics while the new version only has to worry about GPU duties. The manufacturing process for ION has also shifted from 55nm for the first generation to 40nm, which shrinks the package from 35x35mm down to 23x23mm. Outside of clock speeds, what we're looking at is essentially a netbook/nettop version of the GeForce G 210M (with half the SPs on certain models; see below), with full DirectX 10.1 support. System memory remains DDR2, accessed through the Intel NM10 chipset.

Since NG-ION is now a discrete GPU, it also comes with up to 512MB of dedicated DDR3 memory. This alone should provide better performance than the old version, as there's now more bandwidth to go around. NVIDIA isn't disclosing clock speeds yet, but given the process shrink we would expect moderately higher clocks. That probably won't matter much for complex games, as the Atom CPU will continue to be a bottleneck, but it will help with some games and CUDA applications. NVIDIA says that in general NG-ION will be 50 to 100% faster than the original ION.

As with the previous iteration, there will be two versions of ION. Nettop IONs will come with 16 SPs (aka CUDA Cores), while netbooks will come with either 16 or 8 SPs. The 8 SP version is designed specifically to fit within the thermal constraints of a 10.1" chassis, so we'll actually see 10" netbooks with ION this time around. Fewer CUDA Cores may impact gaming performance (depending on the title) and CUDA applications, but NVIDIA says Blu-ray playback will continue to work, so by extension HD video decoding won't be a problem.

The one aspect of the technology that NVIDIA wouldn't discuss extensively is how Optimus works within the bandwidth constraints imposed by the NM10 chipset. To clarify, NM10 provides just four PCI Express 1.1 lanes, and while a manufacturer could choose to use all of them for the ION GPU, NVIDIA says most will use a single lane, leaving the other lanes open for additional devices. PCIe 1.1 provides 250MB/s of bandwidth in each direction per lane, so Optimus will have to work within that constraint. Optimus copies content from the GPU memory directly to system memory, to avoid any issues with flickering displays, but that means NG-ION will need to transmit a lot of data over a relatively narrow link.

Driving a low resolution 1024x600 LCD at 60FPS isn't a problem, but even at 1366x768 we are at the limits of x1 bandwidth (such a display requires 251.78MB/s, to be exact). NVIDIA says that NG-ION won't have any trouble driving up to 1080p displays, so naturally we were curious how they manage to get 1920x1080x32-bit @ 60FPS (497.66MB/s) over a 250MB/s link. For now, unfortunately, they aren't saying, other than that there's some "intelligent work" going on in the drivers. Real-time compression is one option, and if you only transmit the meaningful 24 bits of color you can save 25% of the total bandwidth. All we know is that NVIDIA says it works, that the content isn't compressed (presumably that means no lossy compression, as clearly there has to be some form of bandwidth reduction happening), and that they're not overclocking the PCIe bus. In short, Optimus has some "special sauce" that NVIDIA doesn't want to disclose (yet?).
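The arithmetic behind these figures is simple enough to sketch. A quick check (assuming uncompressed frames and decimal megabytes, with the 250MB/s x1 figure from above as the limit) shows which display modes fit the link:

```python
def display_bandwidth_mb_s(width, height, bits_per_pixel, fps):
    """Uncompressed framebuffer bandwidth in decimal megabytes per second."""
    return width * height * (bits_per_pixel / 8) * fps / 1e6

PCIE_X1_LIMIT = 250.0  # MB/s in each direction for a PCIe 1.1 x1 link

for w, h, bpp, fps in [(1024, 600, 32, 60),    # typical netbook panel
                       (1366, 768, 32, 60),    # just over the limit
                       (1366, 768, 24, 60),    # 24-bit color saves 25%
                       (1920, 1080, 32, 60)]:  # 1080p, roughly double the limit
    bw = display_bandwidth_mb_s(w, h, bpp, fps)
    verdict = "fits" if bw <= PCIE_X1_LIMIT else "exceeds"
    print(f"{w}x{h} {bpp}-bit @ {fps}FPS: {bw:7.2f} MB/s ({verdict} x1)")
```

The 1366x768 32-bit case works out to the 251.78MB/s quoted above, and transmitting only 24 bits of color pulls it back under the limit.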

Update: The answer was staring me in the face, but I missed it. NVIDIA was kind enough to provide the remaining information. If you read below, HDMI requires the GPU, since the Atom IGP doesn't support HDMI output. That solves any question of 1080p support, since the only thing going over the x1 link at that point is the compressed video stream. For content running on the laptop LCD, the maximum resolution supported by the Atom LVDS controller is 1366x768, so NVIDIA can do two things. First, any content at a higher resolution (e.g. 1080p Blu-ray) can be scaled down to 1366x768 before going over the x1 PCIe bus. Second, most videos are at 30p or 24p, so 1366x768x32-bit @ 30FPS is well within the bandwidth constraints of an x1 link (125.9MB/s). For 1366x768 @ 60FPS, rendering internally at 32-bit (with the Alpha/Z/Bump channel) and then transmitting the final 24 bits of color data still seems like a good solution, but NVIDIA didn't clarify whether they're doing anything special for that case or if they're just saturating the x1 link.
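The numbers in the update check out the same way. A short sketch of the downscaled video cases (same uncompressed-frame arithmetic, decimal megabytes, 250MB/s assumed as the x1 limit):

```python
def bw_mb_s(width, height, bits_per_pixel, fps):
    """Uncompressed framebuffer bandwidth in decimal MB/s."""
    return width * height * (bits_per_pixel / 8) * fps / 1e6

PCIE_X1 = 250.0  # MB/s in each direction, PCIe 1.1 x1

# 1080p content scaled to the Atom LVDS controller's 1366x768 limit
# before crossing the link, at typical film/video frame rates.
scaled_24p = bw_mb_s(1366, 768, 32, 24)  # 24p film content
scaled_30p = bw_mb_s(1366, 768, 32, 30)  # the 30p case quoted above

print(f"1366x768 32-bit @ 24FPS: {scaled_24p:.2f} MB/s")
print(f"1366x768 32-bit @ 30FPS: {scaled_30p:.2f} MB/s")
print(f"Headroom at 30p: {PCIE_X1 - scaled_30p:.2f} MB/s")
```

The 30p case lands at the 125.9MB/s quoted above, with plenty of headroom on the link; 24p film content needs even less.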

Speaking of Optimus, that also means that NG-ION netbooks are limited to Win7 systems. This won't be a concern for most users, but know in advance that installing Linux on such a netbook would mean the loss of your GPU functionality (short of some enterprising individuals figuring out how to do Optimus on their own, which I wouldn't put past the Linux community given time). NVIDIA informs us that there are currently no plans for Optimus on desktops or on other OSes.

One item that created a bit of confusion for us was the Windows 7 Starter requirement. NVIDIA is able to handle the Aero UI with ION, obviously, but what does that mean for Optimus? If GMA 3150 couldn't run Aero on its own, Optimus would have to constantly power up the GPU to handle Aero effects. That would be counterproductive, and as far as we can tell the real issue with Windows 7 Starter comes down to manufacturers wanting to save money by going with Starter instead of installing Home Premium. GMA 3150 can handle Aero (albeit barely), so Optimus doesn't need to worry about using the GPU for standard Windows applications with Pineview CPUs like the N450. Whether all ION netbooks will require Home Premium isn't clear, but we see no reason to pair such a netbook with Win7 Starter.

Finally, because of the IGP restrictions on the GMA 3150, NG-ION will require the GPU to be active any time an HDMI connection is used. GMA 3150 does not support HDMI at all, so the HDMI connection will run solely off the ION GPU. Considering HDMI means you're tethered to an external display, this shouldn't be a problem, as you can easily plug in the netbook. In addition, GMA 3150/NM10 has a resolution limit of 1366x768 for LVDS and 1400x1050 for VGA, so higher resolutions will need ION; but again, higher resolutions will only be available with an external display. The VGA connection is supposed to come from the IGP, so the resolution limit for VGA will remain in effect.


34 Comments


  • yyrkoon - Wednesday, March 3, 2010 - link

    One of the things that gets me is that they will not / cannot port this technology to the desktop. Would it not be great to have switchable graphics on a low-powered IGP platform, and then get a boost when you need / want it? But nvidia still drives up the power required to use parts on the desktop.

    But, let me back up a minute. Would it not be nice to have a mobile part in a desktop for max efficiency? Let's say, something like the equivalent of the 250M, with very low power usage, but very good performance for the power usage statistics? I am thinking ~35-40W max under load.

    Even the 7600GT, for its time, could not beat these power usage numbers, and for a single monitor at around 1440x900 it did not do terribly performance wise. That, and the 7600GT was one of the most power thrifty discrete cards offered for the desktop that gave decent performance at or around this resolution. Am I wrong in thinking the 250M GPU could trump the 7600GT in both of these areas? If I am, then I am sure there is something that *can*.

    Also, look, I am pro Microsoft. I really like Windows 7, especially the 64-bit variant of Ultimate. It runs really nicely on a "cheap" laptop with only a T3400 CPU, but with 4GB of memory. Anyways, what is up with nvidia and their "nothing but Windows" stance on this? Again, is there something wrong with the other hardware available to make better use of this current technology? ARM comes to mind. As well as even a different CPU produced by Intel, or even AMD.

    Maybe the above is moot, because there is already something to fill those gaps, or they do not want to compete with themselves because of the new emerging hardware (based on ARM, was it?) they seem to have announced recently. I really do not know the whole story, but it does seem rather short sighted to me that they would limit this hardware to a single software platform, no matter which it is. Give your customer the freedom while using your hardware, and perhaps they will respond in kind by buying your hardware to begin with (and all that).
    Reply
  • Penti - Tuesday, March 2, 2010 - link

    Twice as fast? What are you on?

    http://www.notebookcheck.net/NVIDIA-GeForce-9600M-...

    http://www.notebookcheck.net/NVIDIA-GeForce-9400M-...

    It's game-able with the 9600M; it's not really game-able with the integrated 9400M.
    Reply
  • JarredWalton - Tuesday, March 2, 2010 - link

    Okay, so it's "over twice as fast". It's still not a performance part. 3DMark isn't usually the best source of data for true performance. Looking to actual games, 9600M typically scores around 2 to 3 times as high as 9400M. The 9400M achieves playable frame rates at minimum details and 800x600 in nearly all games, but only about half are playable at 1366x768. Something like a 9600M is playable in all titles at 1366x768. It's still pretty anemic compared to a $100 desktop card, or a 9800M part.
    Reply
  • Penti - Wednesday, March 3, 2010 - link

    I was looking at the games (which are included in most reviews/benchmarks at that site).

    9400M does fairly well with a high-speed CPU, I'll give you that. But it's still a pain to run most games.

    Dedicated memory helps; I wonder if the NG-ION will be helped by it. Looks like it will be pretty low bandwidth. 9600M is old of course, but not much else has been available. Of course I'd rather see, say, a Mobility HD5650. But that's still only comparable in performance to a 9800M GS. They fit the power envelope though. But that won't happen till they move to Core i lappys, for Apple's part. But of course even the difference between 9400M and 9600M can be felt as enormous. You don't really need to be able to play at higher resolutions than the lappy screen either way. I do agree that it's pretty anemic anyway though, especially for the 17" MacBook Pro, but then again it's not a gaming computer. It's not the same as desktop, where you need to game at around 1920x1200 and have screens up to 2560x1440. Being able to play at all is pretty good on a laptop.
    Reply
