Battery Life and Charging

Most of us would not think of an 8.5 lb notebook as something that is overly portable, but there still might be an occasion where the G751 needs to be off the mains for a bit. NVIDIA has done some work on battery powered gaming, but the biggest issue to overcome is that most of these gaming notebooks do not get great battery life even under light workloads; gaming on battery generally means an hour or less of runtime, and with reduced performance as well.

The G751 comes equipped with a 90 Wh battery, which is certainly on the large side but not unexpected with a large chassis notebook.

Our light battery life test consists of web browsing with the display set at 200 nits, and time is logged until the device powers down. The heavy test ramps up the number of pages loaded, adds a constant 1 MB/s file download over Wi-Fi, and includes movie playback.

Battery Life 2013 - Light

If the G751 has an Achilles heel, it is battery life. Although we can't isolate the power consumption of the laptop's GTX 980M discrete GPU, based on the performance of other large gaming laptops I believe the bulk of the battery life hit comes from the lack of Optimus support. Without Optimus, or a hardware switch to disable the GPU like the MSI GT80 has, the ASUS G751 is dragged down by having to power far more graphics hardware than web browsing requires. These devices are not really designed to be used on the go like a smaller Ultrabook, but here we can see pretty clearly that there is a lot of work to be done before the Maxwell GPUs get close to the low power usage of integrated graphics.

Battery Life 2013 - Heavy

It is pretty much the same story on the heavy test. Only the Clevo manages worse battery life, and it has a desktop processor inside of it. A notebook this heavy is already not very portable, so the battery life scores are certainly not as relevant here as they would be on many machines.

Battery Life 2013 - Light Normalized

Battery Life 2013 - Heavy Normalized
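The normalized charts divide runtime by battery capacity, which removes the G751's 90 Wh advantage and shows how efficiently each system actually sips power. As a quick illustration of that calculation, here is a minimal sketch; the runtime figure below is hypothetical, not one of our measured results:

```python
# Hypothetical illustration of the normalized battery metric:
# minutes of runtime per Wh of capacity, plus the implied average draw.
# The runtime value below is a placeholder, not a measured result.

def normalize(runtime_min: float, capacity_wh: float) -> tuple[float, float]:
    """Return (minutes per Wh, average system power draw in watts)."""
    minutes_per_wh = runtime_min / capacity_wh
    avg_draw_w = capacity_wh / (runtime_min / 60.0)  # Wh divided by hours
    return minutes_per_wh, avg_draw_w

# Example: a 90 Wh battery lasting a hypothetical 4 hours (240 minutes)
mpw, draw = normalize(240, 90)
print(f"{mpw:.2f} min/Wh, {draw:.1f} W average draw")  # 2.67 min/Wh, 22.5 W
```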

With a 47 watt quad-core Haswell processor, a powerful GPU which can't be completely disabled, and no real requirement to be efficient, the G751 is closer to a desktop replacement than other similar devices. Like a desktop, this laptop should be plugged in for pretty much any usage. If you wanted to watch a movie on it, you could get by as long as it isn't Lord of the Rings (the extended version, of course). But if you are going to have a knock on a device like this, battery life is at least the one area that is least critical to the experience.

Charging

ASUS ships the G751 with a 230 watt AC adapter, which should be plenty to cover the peak power usage of the laptop. With a 47 watt processor and a GPU that will draw somewhere around 100 watts, there is still a comfortable margin even when the device is fully loaded. This also leaves quite a bit of power available to charge the large 90 Wh battery.
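To put rough numbers on that margin, here is a back-of-the-envelope sketch; the CPU and GPU draws come from the paragraph above, while the "other" system draw and the charge efficiency are assumptions, not measured values:

```python
# Back-of-the-envelope power budget for the G751 on its 230 W adapter.
# CPU/GPU figures come from the text above; everything else is assumed.

ADAPTER_W = 230.0      # rated AC adapter output
CPU_W = 47.0           # 47 W TDP quad-core Haswell
GPU_W = 100.0          # rough GTX 980M draw under load
OTHER_W = 25.0         # assumed: display, drives, fans, conversion losses
BATTERY_WH = 90.0      # battery capacity

headroom_w = ADAPTER_W - (CPU_W + GPU_W + OTHER_W)
print(f"Headroom while fully loaded: {headroom_w:.0f} W")  # ~58 W

# Naive charge-time estimate at idle, assuming the charger can dedicate
# roughly 100 W to the battery at ~90% charge efficiency. Real chargers
# taper near full, so the actual figure will be somewhat longer.
charge_rate_w = 100.0
efficiency = 0.9
hours = BATTERY_WH / (charge_rate_w * efficiency)
print(f"Idle charge estimate: {hours:.1f} h")  # ~1.0 h
```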

Battery Charge Time

Despite the huge battery, the G751 charges very quickly, placing it right at the top of any device we've tested. In fact, the only device that charges quicker is the Lenovo ThinkPad X1 Carbon, which has just a 50 Wh battery inside, so ASUS makes up slightly for the less than stellar battery life by at least getting back up to full charge quickly.

All in all, the battery life is poor, but with G-SYNC requiring that the GPU be directly connected to the display, rather than routed through the integrated GPU, there is no possibility of using Optimus. Some devices have a hardware multiplexer that allows the integrated GPU to drive the display, but it adds cost and complexity to the laptop, and you would lose access to G-SYNC on the desktop in that case as well. With these handicaps, ASUS has tried to compensate with a large battery, but in the end the device is just not that power efficient. Its targeted audience is likely not too worried about that.

52 Comments

  • nathanddrews - Wednesday, July 29, 2015 - link

    Does Intel plan on taking advantage of the Adaptive Sync tech anytime soon? I know they use it for other things for power saving (self-refresh, etc.), but it sure seems like a golden opportunity.

    In the same vein - given that we know (and have known for some time) that the G-Sync module is not required if you have eDP/DP1.2a, when can we expect a shift to tech-agnostic displays? Do we have to wait for DP1.3?
  • DanNeely - Wednesday, July 29, 2015 - link

    I know eDP/DP1.2a provides a similar feature set to G-Sync; but has nVidia actually said anything about adopting it instead of (in addition to?) their current custom hardware implementation?
  • DanNeely - Wednesday, July 29, 2015 - link

    I suppose I deserve what I get for commenting before reading the 2nd page of the article; but as long as they're requiring additional qualification work to approve a panel for GSync, there's still plenty of potential to cause trouble for cross GPU support if they wanted to.
  • nathanddrews - Wednesday, July 29, 2015 - link

    VESA's Adaptive Sync is a part of eDP and DP 1.2a+. It's there, waiting to be used. G-Sync, FreeSync, and whatever Intel does are just the names for how they utilize it.

    The only reason NVIDIA pursued a custom module was because no desktop monitors - at the time - included A-Sync. Screens connected via eDP can. As we move forward with newer and newer displays, we're likely to see more and more of them be A-Sync-capable, which means being able to support any variable refresh technology, whatever the marketing label is. Whether or not those displays end up as G-Sync, FreeSync, or some sort of hybrid is up to the manufacturers, I suppose. Just like with G-Sync and FreeSync displays, optimizing the display will still be important.

    Ideal situation: connect monitor to computer, GPU drivers load a default profile based upon EDID for frequency range and resolution support, advanced users can tweak (under/overclock/etc.), maybe share firmware, etc. to crowd-source calibration settings. Shipping a monitor back to a manufacturer for an update is ridiculous...
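For the curious, the frequency range described in the comment above already lives in the EDID's display range limits descriptor; here is a minimal sketch of reading it, with a hypothetical Linux sysfs path (on Linux, EDIDs appear under /sys/class/drm/*/edid):

```python
# Minimal sketch: pull the vertical/horizontal refresh range from a
# 128-byte EDID block, the data a driver could use to seed the default
# variable-refresh profile described above. Offsets follow EDID 1.4.

def refresh_range(edid: bytes):
    # Four 18-byte display descriptors live at offsets 54, 72, 90, 108.
    for off in (54, 72, 90, 108):
        d = edid[off:off + 18]
        # A display range limits descriptor starts 00 00 00 FD.
        if d[0:3] == b"\x00\x00\x00" and d[3] == 0xFD:
            return {
                "v_min_hz": d[5],    # minimum vertical rate (Hz)
                "v_max_hz": d[6],    # maximum vertical rate (Hz)
                "h_min_khz": d[7],   # minimum horizontal rate (kHz)
                "h_max_khz": d[8],   # maximum horizontal rate (kHz)
            }
    return None  # descriptor not present

# Hypothetical path; adjust for the connector actually in use.
with open("/sys/class/drm/card0-eDP-1/edid", "rb") as f:
    print(refresh_range(f.read(128)))
```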
  • lefty2 - Thursday, July 30, 2015 - link

    > The only reason NVIDIA pursued a custom module was because no desktop monitors - at the time - included A-Sync.
    This is not true. When AMD created FreeSync, the DisplayPort 1.2a monitors did not exist either, but instead of creating a proprietary solution, they got FreeSync incorporated into the open standard. Nvidia could have done the same thing, but they prefer using proprietary standards to lock customers in.
  • nathanddrews - Thursday, July 30, 2015 - link

    Interesting version of history you've created.

    DP1.2a display availability lagged behind G-Sync Module display availability by more than a year. If NVIDIA pursued the VESA solution, G-Sync would have been delayed alongside FreeSync. The G-Sync module was absolutely required to get a jump on the market since DP1.2 did not include the Adaptive Sync option. You may not LIKE it, but that's how it happened.
  • lefty2 - Thursday, July 30, 2015 - link

    DP1.2a display availability also lagged behind FreeSync... AMD just had to wait for them to become available (in fact, they pushed the standard forward), and Nvidia could have done the same.
  • LoganPowell - Friday, November 27, 2015 - link

    ASUS does overall make fantastic computers, but I'd recommend going for the Dell Inspiron i7559-763BLK. Highly approved by many gaming enthusiasts. /Logan from http://www.consumerrunner.com/top-10-best-laptops/
  • Creig - Thursday, July 30, 2015 - link

    "additional qualification" = give us money to use the G-sync name.
  • Shadowmaster625 - Wednesday, July 29, 2015 - link

    I have a great idea. Let's build Optimus notebooks for five years, and then all of a sudden build one without it! Isn't that a great frickin idea? I need to be CEO of Asus.
