Battery Life and Charging

Most of us would not think of an 8.5 lb notebook as especially portable, but there will still be occasions where the G751 needs to be off the mains for a bit. NVIDIA has done some work on battery powered gaming, but the biggest hurdle is that most of these gaming notebooks do not get great battery life even under light workloads; gaming generally means an hour or less on battery, and that is with reduced performance as well.

The G751 comes equipped with a 90 Wh battery, which is certainly on the large side but not unexpected with a large chassis notebook.

Our light battery life test consists of web browsing with the display set at 200 nits, and time is logged until the device powers down. The heavy test ramps up the number of pages loaded, adds a 1 MB/s file download over Wi-Fi, and also includes movie playback.

Battery Life 2013 - Light

If the G751 has an Achilles' heel, it is battery life. Although we can't isolate the power consumption of the laptop's GTX 980M discrete GPU, based on the performance of other large gaming laptops I believe the bulk of the battery life hit comes from the lack of Optimus support. Without Optimus, or a hardware switch to disable the discrete GPU like the MSI GT80 has, the ASUS G751 is dragged down by having to power far more GPU than is necessary to browse the web. These devices are not really designed to be used on the go the way a smaller Ultrabook would be, but here we can see pretty clearly that Maxwell GPUs still have a lot of work to do to get close to the low power usage of integrated GPUs.

Battery Life 2013 - Heavy

It is pretty much the same story on the heavy test. Only the Clevo manages worse battery life, and it has a desktop processor inside it. A notebook this heavy is already not very portable, so the battery life scores are certainly not as relevant as they would be on many machines.

Battery Life 2013 - Light Normalized

Battery Life 2013 - Heavy Normalized
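The normalized charts divide measured runtime by battery capacity, giving minutes of runtime per watt-hour so that devices with very different battery sizes can be compared on efficiency rather than raw endurance. A minimal sketch of that calculation (the figures below are illustrative placeholders, not the review's measured results):

```python
def normalized_runtime(minutes: float, capacity_wh: float) -> float:
    """Return battery efficiency in minutes of runtime per watt-hour."""
    return minutes / capacity_wh

# Hypothetical example: a 90 Wh battery lasting 270 minutes scores
# 3.0 min/Wh, while a 50 Wh battery lasting 300 minutes scores
# 6.0 min/Wh -- the smaller battery is the more efficient design.
print(normalized_runtime(270, 90))  # 3.0
print(normalized_runtime(300, 50))  # 6.0
```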

With a 47 watt quad-core Haswell processor, a powerful GPU that can't be completely disabled, and no real requirement to be efficient, the G751 is closer to a desktop replacement than other similar devices. Like a desktop, this laptop should be plugged in for pretty much any use. If you wanted to watch a movie on it, you could get by, as long as it isn't Lord of the Rings (the extended version, of course). But if you are going to knock a device like this, battery life is likely the one area that is least critical to the experience.

Charging

ASUS ships the G751 with a 230 watt AC adapter, which should be plenty to cover the peak power usage of the laptop. With a 47 watt processor and a GPU that will draw somewhere around 100 watts, there is still a nice margin even when the device is fully loaded. This also leaves quite a bit of power available to charge the large 90 Wh battery.
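As a back-of-the-envelope check, the power budget works out roughly as follows. The CPU and GPU numbers come from the text above; the system overhead, charging power, and charge efficiency figures are assumptions for illustration only:

```python
# Rough power-budget sketch for the G751's 230 W adapter.
ADAPTER_W = 230
CPU_W = 47        # 47 W quad-core Haswell TDP (from the text)
GPU_W = 100       # approximate GTX 980M draw (from the text)
OVERHEAD_W = 30   # assumed: display, drives, fans, conversion losses

headroom_w = ADAPTER_W - (CPU_W + GPU_W + OVERHEAD_W)
print(f"Headroom under full load: {headroom_w} W")  # 53 W

# Near idle, most of the adapter's output can go to the 90 Wh
# battery. Assuming ~150 W of charging power at ~85% efficiency,
# a full charge from empty would take on the order of:
BATTERY_WH = 90
CHARGE_W = 150      # assumed charging power
EFFICIENCY = 0.85   # assumed charge efficiency
hours = BATTERY_WH / (CHARGE_W * EFFICIENCY)
print(f"Estimated full-charge time: {hours:.2f} h")  # 0.71 h
```

Even with generous assumptions for overhead, the adapter keeps real headroom at full load, which is consistent with the fast charge times measured below.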

Battery Charge Time

Despite the huge battery, the G751 charges very quickly, sitting right near the top of any device we've tested. In fact, the only device that charges quicker is the Lenovo ThinkPad X1 Carbon, which has just a 50 Wh battery inside, so ASUS somewhat makes up for the less than stellar battery life by at least getting back up to a full charge quickly.

All in all, the battery life is poor, but with G-SYNC requiring that the GPU be connected directly to the display rather than routed through the integrated GPU, there is no possibility of using Optimus. Some devices have a hardware multiplexer that allows the integrated GPU to be used, but it adds cost and complexity to the laptop, and you would lose access to G-SYNC on the desktop in that case. With these handicaps, ASUS has tried to compensate with a large battery, but in the end the device is just not that power efficient, and its target audience is likely not too worried about that.


  • meacupla - Wednesday, July 29, 2015 - link

    Correct me if I'm wrong, but didn't Optimus cause so many problems that people just wanted to disable it permanently?
  • Dribble - Wednesday, July 29, 2015 - link

    You're wrong; I've never had any problems with it and have hardly read any complaints about it.
  • Gigaplex - Wednesday, July 29, 2015 - link

    I've had plenty of problems with it. Just because you haven't seen them doesn't mean they don't exist.
  • Refuge - Thursday, July 30, 2015 - link

    Never heard about it? Were you hiding under a rock during that fiasco? It was so bad that some review sites would mark a product down just for having Optimus enabled by default in the BIOS from the factory.
  • nerd1 - Friday, July 31, 2015 - link

    Optimus is terrible for everything except AAA gaming (big trouble with most online games, a nightmare on Linux, and so on), and it does not make any sense for large-caliber gaming rigs anyway. Basically you have to plug in, otherwise the battery won't last more than an hour.
  • DanNeely - Wednesday, July 29, 2015 - link

    There's a 1-2% framerate hit, and while that's a meaningless real-world difference, hysteria-driven configuration has meant it's often not been installed in top-of-the-line gaming laptops and has caused people to disable it in mid-range ones.
  • Samus - Wednesday, July 29, 2015 - link

    Optimus crashes pretty much every 3D modeling program I've ever tried on it, especially Solidworks.
  • Jorsher - Friday, July 31, 2015 - link

    I've never had a problem with it on my 2012 (perhaps older) Dell XPS with Intel and NVidia graphics. I'm glad to have it.
  • WorldWithoutMadness - Wednesday, July 29, 2015 - link

    I assume it has something to do with the G-Sync.
    Maybe it is not compatible with Optimus switching between the Intel HD graphics and the GTX, and vice versa.
  • Brett Howse - Wednesday, July 29, 2015 - link

    I think it has something to do with G-Sync, which is why I laid that out exactly on page 3 :)

    "In order to implement G-SYNC, the NVIDIA GPU must be directly connected to the display panel over eDP - since variable refresh doesn't currently translate through iGPUs - which means that it instantly precludes implementation of NVIDIA’s Optimus technology"
