Optimus: Recognizing Applications

Beyond addressing the problems with switching between the IGP and dGPU, the Optimus driver has also been re-architected to provide an extensible framework that allows NVIDIA to support new applications with minimal effort. We've seen application profiling in graphics drivers for a while now, but Optimus adds a new type of profiling. Whereas gaming profiles are generally designed to extract optimal performance from the graphics hardware, Optimus profiles are intended to provide an ideal platform for a variety of tasks. If an application can benefit from running on a discrete GPU, the Optimus driver routes the necessary calls to the dGPU. Likewise, if an application doesn't need any extra performance or features, the calls get routed to the IGP for rendering. The idea is that Optimus will use the dGPU only when there's a performance, quality, and/or power-saving benefit.
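
To make the routing idea concrete, here's a minimal sketch in Python of the sort of per-application decision the driver makes; all of the names below are our own illustration, not NVIDIA's actual driver code:

```python
# Minimal sketch (our own illustration) of the Optimus routing policy: route
# an application's calls to the dGPU only when its profile says it benefits.

def route_render_calls(app_profile, igp, dgpu):
    if app_profile.benefits_from_dgpu:   # performance, quality, or power win
        dgpu.power_on()                  # the dGPU was fully powered off until now
        return dgpu                      # DirectX/OpenGL/CUDA calls go here
    return igp                           # default path: IGP renders, dGPU stays off
```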


At present, Optimus recognizes applications based on the executable file name. In some cases, the recognition goes a little deeper. For example, surfing the Internet generally won't benefit from the dGPU; however, if you happen to be viewing a Flash video (and you have the Flash 10.1 beta installed for your browser), Optimus will power up the GPU and begin routing calls through the video processing engine. Close the Flash video website and the GPU can turn off again. Similarly, if you load up a media player application, the GPU won't be necessary if you're dealing with SD content, but it would be enabled for HD content (and this can be changed depending on the hardware if necessary). Optimus should activate the dGPU any time a user requires DXVA, DirectX (OpenGL), or CUDA features.
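
As a rough sketch of how such a recognizer might behave (the profile fields, context keys, and the 720-line HD threshold are our assumptions, not NVIDIA's actual logic):

```python
# Rough sketch of profile-based recognition as described above; the profile
# schema, context keys, and HD threshold are all our own assumptions.

def needs_dgpu(exe_name, context, profiles):
    profile = profiles.get(exe_name.lower())       # match on executable name
    if profile is None:
        return False                               # unknown app: stay on the IGP
    if profile.get("always_on"):
        return True                                # e.g. games and CUDA apps
    if context.get("flash_video"):                 # Flash 10.1 video playback
        return True
    if context.get("video_height", 0) >= 720:      # HD content benefits; SD doesn't
        return True
    return any(context.get(k) for k in ("dxva", "directx", "opengl", "cuda"))
```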

The big change in application profiling is that the profiles are now separate from the main graphics driver. NVIDIA has created a robust infrastructure to deal with automatically downloading and updating the profiles, with user-customizable options controlling how frequently this occurs. This means that unlike SLI support, where a fully functional profile might take one or two driver releases before it's integrated into the standard NVIDIA drivers, NVIDIA can add applications that benefit from a GPU to the Optimus profile list within days, perhaps even hours.
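
A bare-bones sketch of what such a standalone update check could look like; the URL, on-disk format, and interval handling are placeholders of our own, not NVIDIA's infrastructure:

```python
# Bare-bones sketch of a driver-independent profile updater; the URL and
# file format are invented placeholders.
import os
import time
import urllib.request

PROFILE_URL = "https://example.com/optimus/profiles.json"  # placeholder

def refresh_profiles(local_path, interval_days=7):
    """Re-download the profile list if it's older than the user-chosen interval."""
    if os.path.exists(local_path):
        age_days = (time.time() - os.path.getmtime(local_path)) / 86400
        if age_days < interval_days:
            return                          # checked recently; nothing to do
    with urllib.request.urlopen(PROFILE_URL, timeout=10) as resp:
        data = resp.read()
    with open(local_path, "wb") as f:
        f.write(data)                       # driver picks up the new profiles
```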

What's more, you can add an application yourself if necessary. As an example, our Steam version of Batman: Arkham Asylum wasn't enabling the dGPU initially; we added a profile pointing at the Steam Batman EXE and the problem was solved. Ideally, we shouldn't have had to do that, and if "only 1%" of users ever manually switched between IGP and dGPU before, we suspect far fewer than 1% would be willing to manually add an application to the Optimus profile list. Hopefully NVIDIA will be able to push out profile updates for such omissions quickly.
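
Conceptually, a manual addition boils down to something like the sketch below; the schema and executable name are hypothetical, and in practice this is done through NVIDIA's control panel UI rather than code:

```python
# Hypothetical sketch of a user-added profile entry; the real mechanism is
# NVIDIA's control panel, and this schema is our own invention.

user_profiles = {}

def add_user_profile(exe_name, use_dgpu=True):
    """Force an unrecognized executable onto the dGPU (or pin it to the IGP)."""
    user_profiles[exe_name.lower()] = {"benefits_from_dgpu": use_dgpu}

# e.g. pointing a profile at the Steam Batman executable (name is illustrative)
add_user_profile("BatmanAA.exe")
```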

The automatic updating of Optimus profiles also raises the possibility of using automatic updates for other areas. The big one is going to be SLI profile support, and while it isn't part of the current program it sounds as though NVIDIA intends to add that feature down the road. Once the infrastructure is in place and the drivers support a separate profile download, it should be relatively easy to get SLI profiles in a similar manner. It would also be interesting to see NVIDIA allow users to "suggest" applications for Optimus support through the drivers—i.e., anything that a user has manually added could be uploaded to the server, and if an application name gets enough hits NVIDIA would be more likely to enable support. Naturally, there would be some privacy concerns with such a scheme and some users wouldn't want to participate in such a program, but it might be useful.
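
If NVIDIA ever built the opt-in "suggest an application" mechanism described above, the client side could be as simple as the following sketch; the endpoint, payload, and opt-in flag are entirely our invention:

```python
# Entirely hypothetical sketch of an opt-in suggestion upload; NVIDIA offers
# no such endpoint, and the URL and payload here are invented.
import json
import urllib.request

def suggest_app(exe_name, user_opted_in):
    if not user_opted_in:
        return                                      # respect the privacy opt-out
    payload = json.dumps({"exe": exe_name}).encode()
    req = urllib.request.Request(
        "https://example.com/optimus/suggest",      # placeholder endpoint
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req, timeout=10)         # server tallies hits per name
```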

As an aside, we've wanted AMD/ATI to enable manual user profiling of games for CrossFire for some time. They still haven't done so, and now NVIDIA has taken things a step further by separating the profiles from the main drivers. This is definitely an improvement over previous profiling schemes, and it's something we hope to see more of in the future, from both AMD and NVIDIA.

Comments

  • MasterTactician - Wednesday, February 10, 2010 - link

    But doesn't this solve the problem of an external graphics solution not being able to use the laptop's display? If the external GPU can pass the rendered frames back to the IGP's buffer via PCI-E, then it's problem solved, isn't it? So the real question is: will nVidia capitalize on this?
  • Pessimism - Tuesday, February 9, 2010 - link

    Did Nvidia dip into the clearance bin again for chip packaging materials? Will the laptop survive its warranty period unscathed? What of the day after that?
  • HighTech4US - Tuesday, February 9, 2010 - link

    You char-lie droid ditto-heads are really something.

    You live in the past and swallow everything char-lie spews.

    Now go back to your church's web site semi-inaccurate and wait for your next gospel from char-lie.
  • Pessimism - Tuesday, February 9, 2010 - link

    So I suppose the $300+ million in charges Nvidia took were merely a good faith gesture for the tech community, with no basis in fact regarding an actual defect.
  • HighTech4US - Tuesday, February 9, 2010 - link

    The $300+ million was in fact a good faith measure to the vendors who bought the products that had the defect. nVidia backed extended warranties and picked up the total cost of repairs.

    The defect has been fixed long ago.

    So your char-lie comment as if it still exists deserves to be called what it is. A char-lie.
  • Pessimism - Tuesday, February 9, 2010 - link

    So you admit that a defect existed. That's more than can be said for several large OEM manufacturers.
  • Visual - Tuesday, February 9, 2010 - link

    Don't get me wrong - I really like the ability to have a long battery life when not doing anything and also have great performance when desired. And if switchable graphics is the way to achieve this, I'm all for it.

    But it seems counter-productive in some ways. If the external GPU was properly designed in the first place, able to shut down power to the unused parts of the processor, supporting low-power profiles, then we'd never have needed switching between two distinct GPUs. Why did that never happen?

    Now that Intel, and eventually AMD too, are integrating a low-power GPU inside the CPU itself, I guess there is no escaping from switchable graphics any more. But I just fail to see why NVidia or ATI couldn't have done it the proper way before.
  • AmdInside - Tuesday, February 9, 2010 - link

    Because it's hard to compete with a company that is giving away integrated graphics for free (Intel) in order to move higher priced CPUs. In a way, AMD is doing the same with ATI. Giving us great motherboards with ATI graphics and cheap cheap prices (which in many ways are much better than Intel's much higher priced offerings) in order to get you to buy AMD CPUs.
  • JarredWalton - Tuesday, February 9, 2010 - link

    Don't forget that no matter how many pieces of a GPU go into a deep sleep state (i.e. via power gate transistors), you would still have some extra stuff receiving power. VRAM for example, plus any transistors/resistors. At idle the CULV laptops are down around 8 to 10W; even the WiFi card can suck up 0.5 to 1W and make a pretty substantial difference in battery life. I'd say the best you're likely to see from a discrete GPU is idle power draw that's around 3W over and above what an IGP might need, so a savings of 3W could be a 30% power use reduction.
  • maler23 - Tuesday, February 9, 2010 - link

    I've been waiting for this article ever since it was hinted at in the last CULV roundup. The ASUS laptop is a little disappointing, especially the graphics card situation (the Alienware M11X kind of sucked up a lot of excitement there). Frankly, I'd just take a discounted UL-30VT and deal with manual graphics switching.

    Couple of questions:

    -Any chance for a review of the aforementioned Alienware M11X soon?

    -I've seen a couple of reviews with display quality comparisons, including this one. How do the MacBook and MacBook Pros fit into the rankings?

    cheers!

    -J
