Back in December, NVIDIA contacted me to let me know that something "really big" was coming out in the near future. It's January 24 as I write this, and tomorrow is the "Optimus Deep Dive" event, an exclusive gathering with only 14 or so of the top technology websites and magazines in attendance. When Sean Pelletier from NVIDIA contacted me, he was extra excited this time and said something to the effect of, "This new technology is pretty much targeted at you, Jarred… when I saw it, I said, 'We should just call this the Jarred Edition of our mobile platform.' We can't go into details yet, but basically it's going to do all of the stuff that you've been talking about for the past couple of years." With a statement like that, you can understand why it got the gears in my head churning. What exactly have I been pining for in terms of mobile GPUs of late? So, in advance of the unveiling of NVIDIA's latest technologies and products, I thought I'd put down what I really want to see; then we'll find out how well NVIDIA has matched my expectations.

I've put together my thoughts before getting any actual details from NVIDIA; I'll start with those, but of course NDAs mean that you won't get to read any of this until after the parts are officially announced. Page two will begin the coverage of NVIDIA's Optimus announcement, but my hopes and expectations will serve as a nice springboard into the meat of this article. They set my expectations pretty high back in December, which might come back to haunt them….

First off, if we're talking about a mobile product, we need to consider battery life. Sure, there are some users who want the fastest notebook money can buy—battery life be damned! I'm not that type of user. The way I figure it, the technology has existed for at least 18 months to build a laptop that provides good performance when you need it but can also power down unnecessary devices and deliver upwards of six hours of battery life (eight would be better). Take one of the big, beefy gaming laptops with an 85Wh (or larger) battery: shut down the discrete GPU and limit the CPU to moderate performance levels, and you ought to have a good mobile solution as well as something that can power through tasks when necessary. Why should a 9 pound notebook be limited to just 2 hours (often less) of battery life?
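
The math here is simple: battery life is just capacity divided by average system draw. Here's a quick sketch of what that means for an 85Wh pack (the wattage figures are illustrative assumptions on my part, not measurements):

```python
# Battery life is capacity divided by average system draw.
capacity_wh = 85  # the big gaming notebook battery from the example above

# Assumed average draws (W): GPU always on, typical use, optimized idle
for draw_w in (42, 25, 14, 11):
    print(f"{draw_w:>2}W average draw -> {capacity_wh / draw_w:.1f} hours")
```

At roughly 42W you get the infamous two hours; bring average draw down into the 11-14W range and the very same battery delivers the six to eight hours I'm asking for.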

What's more, not all IGPs are created equal, and it would be nice if only certain features of a discrete GPU could power up when needed. Take video decoding as an example. The Intel Atom N270/280/450 processors are all extremely low power CPUs, but they can't provide enough performance to decode 1080p H.264 video. Pine Trail disappointed us in that respect, but we have Broadcom Crystal HD chips that are supposed to provide the missing functionality. Well, why can't we get something similar from NVIDIA (and ATI for that matter)? We really expect any Core i3/i5 laptop shipped with a discrete GPU to properly support hybrid graphics, and the faster a system can switch between the two ("instantly" being the holy grail), the better. What we'd really like to see is a discrete GPU that can power up just the video processing engine while leaving the rest of the GPU off (i.e. via power gate transistors or something similar). If the video engine on a GPU can do a better job than the IGP while using only a couple of watts, that would be much better than software decoding on the CPU. Then again, Intel's latest HD Graphics may make this a moot point, provided it can handle 1080p H.264 content properly (including Flash video).
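
To put some rough numbers on why that matters, here's a back-of-the-envelope energy comparison for a two-hour movie; both wattage figures are assumptions for the sake of illustration, not measured values:

```python
# Energy cost of decoding a two-hour 1080p movie, two different ways.
# Both wattage figures are illustrative assumptions, not measurements.
movie_hours = 2
cpu_sw_decode_w = 15   # assumed extra package power for CPU software decode
video_engine_w = 2     # assumed draw of a power-gated GPU video engine alone

print(f"CPU software decode: {cpu_sw_decode_w * movie_hours} Wh")
print(f"GPU video engine:    {video_engine_w * movie_hours} Wh")
```

If those assumptions are even in the right ballpark, that's the difference between one movie eating 30Wh of your battery and barely denting it.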

Obviously, the GPU is only part of the equation, and quad-core CPUs aren't an ideal solution for such a product unless you can fully shut down several of the cores and prevent the OS from waking them up all the time. Core i3/i5/i7 CPUs have power gate transistors that can at least partially accomplish this, but the OS side of things certainly appears to be lagging behind right now. If I unplug and I know all I'm going to be doing for the next couple of hours is typing in Word, why not let me configure the OS to temporarily disable all but one CPU core? What we'd really like to see is a Core i7 type processor that can reach idle power figures similar to Core 2 Duo ULV parts. Incidentally, I'm on a plane writing this in Word on a CULV laptop right now; my estimated battery life remaining is a whopping 9 hours on a 55Wh battery, and I have yet to feel the laptop is "too slow" for this task. We haven't reached this state of technology yet, and NVIDIA isn't going to announce anything that would affect this aspect of laptops, but since they said this announcement was tailored to meet my wish list I thought I'd mention it.
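
You can approximate this at the application level today, though it's not the same thing as truly shutting cores off. Here's a minimal sketch, assuming a Windows system and using the standard Win32 affinity call; pinning one process is about as close as an application can get, since real core parking is an OS power-policy decision:

```python
# Minimal sketch: pin the current process to a single core on Windows.
# This only constrains one application; truly parking or powering down
# cores system-wide is an OS power-policy feature, not an app setting.
import ctypes

kernel32 = ctypes.windll.kernel32
process = kernel32.GetCurrentProcess()         # pseudo-handle to this process
kernel32.SetProcessAffinityMask(process, 0x1)  # bit 0 set = CPU core 0 only
```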

Another area totally unrelated to power use but equally important for mobile GPUs is the ability to get regular driver updates. NVIDIA first discussed plans for their Verde Notebook Driver Program back at the 8800M launch in late 2007. We covered those plans in early 2008, but it wasn't until December 2008 that we received the first official Verde driver. At that time, the reference driver was only for certain products and operating systems, and it was several releases behind the desktop drivers. By the time Windows 7 launched last fall, NVIDIA managed to release updated mobile drivers for all Windows OSes with support for 8000M series and newer hardware, at the same time and with the same version number as the desktop driver release. That pattern hasn't held in the months following the Win7 launch, but our wish list for mobile GPUs would definitely include drivers released at the same time as the desktop drivers. With NVIDIA's push on PhysX, CUDA, and other GPGPU technologies, linking the driver releases for both mobile and desktop solutions would be ideal. We can't discuss AMD's plans for their updated ATI Catalyst Mobility just yet, but suffice it to say ATI is well aware of the need for regular mobile driver updates and they're looking to dramatically improve product support in this area. We'll have more to say about this next week.

Finally, the last thing we'd like to see from NVIDIA is less of a gap between mobile and desktop performance. We understand that the power constraints on laptops inherently limit what you can do, and we're certainly not suggesting anyone try to put a 300W (or even 150W) GPU into a laptop. However, right now the gap between desktop and mobile products has grown incredibly wide—not so much for ATI, but certainly for NVIDIA. The current top-performing mobile solution is the GTX 280M, but despite the name this part has nothing to do with the desktop GTX 280. Where the desktop GTX 285 is now up to 240 shader cores (SPs) clocked at 1476MHz, the mobile part is essentially a tweaked version of the old 8800 GTS 512. Mobile parts currently top out at 128 SPs running at 1500MHz (1463MHz for the GTX 280M), which works out to a bit more than half the theoretical performance of the desktop part with the same name. The bandwidth side of things isn't any better: around 159GB/s for the desktop versus only 61GB/s for notebooks.
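
As a sanity check on those figures, here's the quick math. I'm assuming the customary 3 FLOPS per SP per clock (MAD + MUL) that NVIDIA quotes for G92/GT200-class parts, along with the usual memory specs (512-bit GDDR3 at 2484MT/s effective for the GTX 285, 256-bit at 1900MT/s for the GTX 280M):

```python
# Theoretical shader throughput and memory bandwidth for the parts above.
# Assumes 3 FLOPS per SP per clock (MAD + MUL), per NVIDIA's usual rating.
def gflops(sps, shader_mhz):
    return sps * shader_mhz * 3 / 1000.0

def bandwidth_gbs(bus_bits, data_rate_mts):
    return bus_bits / 8 * data_rate_mts / 1000.0

desktop = gflops(240, 1476)  # GTX 285
mobile = gflops(128, 1463)   # GTX 280M
print(f"GTX 285:  {desktop:.0f} GFLOPS, {bandwidth_gbs(512, 2484):.0f} GB/s")
print(f"GTX 280M: {mobile:.0f} GFLOPS, {bandwidth_gbs(256, 1900):.0f} GB/s")
print(f"Mobile compute: {mobile / desktop:.0%} of desktop")
```

That works out to roughly 1063 versus 562 GFLOPS, or 53% of the desktop part that shares the name.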

As we discussed recently, NVIDIA is all set to release Fermi/GF100 for desktop platforms in the next month or two. Obviously it's time for a new mobile architecture, but what we really want is a mobile version of GF100 rather than a mobile version of GT200. One of the key differences is the support for DirectX 11 on GF100, and with ATI's Mobility Radeon 5000 series already starting to show up in retail products, NVIDIA is behind the 8-ball in this area. We don't have a ton of released or upcoming DX11 games just yet, but all things being equal we'd rather have DX11 support than not. Considering Fermi looks to be a beast in terms of power consumption, we're obviously going to need some performance sacrifices in order to keep power in check. GF100 looks to have several parts with varying numbers of SPs, so it may be as simple as cutting the number of SPs in half and toning down the clock rates. Another option is that perhaps NVIDIA can take a hybrid approach and tack DX11 features onto the G92 or GT200 architecture rather than reworking GF100 into a mobile product. Whatever route they take, NVIDIA really needs to maintain feature parity with ATI's mobile products, and right now that means DX11 support.
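
Halving SPs and dialing back clocks buys more power headroom than it might seem, because dynamic power scales roughly with active units times frequency times voltage squared. A rough sketch of that scaling (all fractions below are purely illustrative, not NVIDIA figures):

```python
# Rough dynamic-power scaling: P ~ active_units * frequency * voltage^2.
# All fractions below are purely illustrative, not NVIDIA specifications.
def relative_power(unit_frac, freq_frac, volt_frac):
    return unit_frac * freq_frac * volt_frac ** 2

# Half the SPs at 80% clock and 90% voltage versus the full desktop chip:
print(f"~{relative_power(0.5, 0.8, 0.9):.0%} of desktop dynamic power")
```

Under those assumptions, a half-width mobile GF100 could land at roughly a third of the desktop chip's dynamic power while retaining around 40% of its shader throughput.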

So, that's my wish list right now. I don't ask for much, really: give me mobile performance that has feature parity with desktop parts, with a moderate performance hit in order to keep maximum power requirements in check, and do all that with a chip that's able to switch between 0W power draw and normal power requirements in a fraction of a second as needed. Simple! Now it's time to begin coverage of the actual presentation and find out exactly what NVIDIA is announcing. So turn the page and let's delve into the latest and greatest mobile news from NVIDIA.

Comments

  • jfmeister - Tuesday, February 9, 2010 - link

    I was anxious to get an M11x, but 2 things were bothering me:
    1- No DirectX 11 compatibility
    2- No Core i5/i7 platform.

    Now there is another reason to wait for the refresh. But with Arrandale prices dropping, DX11 cards available, and Optimus, I would expect Alienware to get on the bandwagon fast with a new M11x platform and not wait 6 to 8 months for a refresh. This ultraportable is intended for gamers, and we all know that gamers are on top of these things. Optimus in the M11x's case should be a must.

    BTW, what I find funny is that Optimus looks like a revolution, but what about 3dfx 10 years ago with their 3D card addon (Monster 3D 8MB ftw)? Switching was used back then... This looks like the same thing except with HD video support! It took that long to come up with that?
  • JarredWalton - Tuesday, February 9, 2010 - link

    Remember that the switching back in the days of 3dfx was just in software and that the 3D GPU was always powered. There was the dongle cable situation as well. So the big deal here isn't just switching to a different GPU, but doing it on-the-fly and powering the GPU on/off virtually instantly. We think this will eventually make its way into desktops, but obviously it's a lot more important for laptops.
  • StriderGT - Tuesday, February 9, 2010 - link

    My take on Optimus:

    Optimus' roots lie with Hybrid SLI.
    Back then it was advertised as an NVIDIA-only chipset feature (NVIDIA IGP + NVIDIA GPU) for both desktops and notebooks.

    Currently NVIDIA is being rapidly phased out of PC x86 chipsets, so Optimus is the only way to at least put an NVIDIA GPU in an Intel IGP based system, but:

    1. The only real benefit is gaming performance without sacrificing battery life in notebooks.
    2. Higher cost (in the form of the discrete GPU); Intel has 60%+ of the GPU market (i.e. IGPs) because the vast majority don't care, or are uninformed, about gaming performance.
    3. CUDA/PhysX are currently, and for the foreseeable future, irrelevant for mobile applications (gaming is much more relevant by comparison).
    4. Video decoding capabilities are already present in most current IGPs (except Pine Trail netbooks, which can add them with a cheaper dedicated chip).
    5. Netbooks will not benefit from Optimus because they lack the CPU horsepower to feed the discrete GPU and are very cost sensitive... (the same reason ION1/2 is not the primary choice for netbook builders).
    6. In the desktop space, only some niche small form factor PC applications could benefit from such a technology, e.g. an SFF PC would need less cooling/noise during normal (IGP) operation and could become louder and more powerful while gaming (GPU).
    7. Idle/2D power consumption of most modern desktop GPUs is already so low that the added complexity of a simultaneously working onboard IGP and the associated software brings no real benefit.
    8. Driver/application software problems might arise from the complexity of profiles and the vastly different workload/application scenarios.

    So in the end it boils down to how NVIDIA can convince the world that a discrete GPU and its added cost are necessary in every portable device (netbook sized and up) out there. As for the desktop side, it will be even more difficult to push such a thing, with only noise reduction in small form factor PCs being of interest.

    BTW, at least now the manufacturers won't have any more excuses for the lack of a decent GPU inside some of the cheaper notebook models ($500-1000) on battery life grounds.
    Oh well, I'll keep my hopes low after so much time being a niche market, since they might find some other excuse along the lines of the weight and space required for cooling the GPU during A/C operation... :-(

    PS: Initially posted on the Yahoo Finance forum
  • Zoomer - Tuesday, February 9, 2010 - link

    Not like it was really necessary; the Voodoo 2 used maybe 25W (probably less) and was meant for desktop use.
  • jfmeister - Tuesday, February 9, 2010 - link

    Good point! I guess I didn't take the time to think about it. I was more into the concept than the whole technical side that you brought up.

    Thanks!

    JF
  • cknobman - Tuesday, February 9, 2010 - link

    Man, the M11x was the biggest disappointment out there. Weak sauce last-gen processor on a so-called premium high-end gaming brand? I'll consider it once they get an Arrandale CULV and Optimus, because right now, judging by the notebookreview.com forums, it uses manually switched graphics, not Optimus.
  • crimson117 - Tuesday, February 9, 2010 - link

    Which processor should they have used, in your opinion?
  • cknobman - Tuesday, February 9, 2010 - link

    They should have waited another month to get to market and used the Core i7 ULV processors. There are already a few vendors using this proc (Panasonic is one).
  • Wolfpup - Tuesday, April 20, 2010 - link

    Optimus is impressive software, but personally I don't want it, ever. I don't want Intel graphics on my CPU. I don't want Intel graphics in my memory controller. I don't want Intel graphics. I want my real GPU to be my real GPU, not a helper device that renders something that gets copied over to Intel's graphics.

    I just do not want this. I don't like having to rely on profiles either; thankfully you can manually add programs, but still.
