31 Comments


  • The_Assimilator - Thursday, March 12, 2015 - link

    "Launces"?

    THANKS PURCH MEDIA
  • hammer256 - Thursday, March 12, 2015 - link

    Ha! In the short run this can be nice for the authors: all the blame goes to Purch ;)
  • JarredWalton - Thursday, March 12, 2015 - link

    Sadly, Word's spell check has let me down again. Yes, "launce" (plural "launces") is actually a word, though it's not one I would likely ever use. "Hey, can you get me some launce sushi? I love the taste of sand eel!" :-p
  • Amoro - Thursday, March 12, 2015 - link

    This is really rebadging at its worst. The performance difference between the GTX 860M Maxwell and Kepler variants is massive: 1389 GFLOPS on Maxwell versus 2108 GFLOPS on Kepler.
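    For reference, those peak figures fall straight out of the usual cores × clock × 2 FLOPs-per-clock (FMA) arithmetic. A quick sketch, assuming the commonly quoted 1152 cores at 915 MHz boost for the Kepler 860M and 640 cores at 1085 MHz for the Maxwell part:

```python
# Theoretical single-precision throughput: each CUDA core retires
# one fused multiply-add (2 FLOPs) per clock.
def gflops(cuda_cores: int, clock_mhz: float) -> float:
    return 2 * cuda_cores * clock_mhz / 1000

kepler_860m = gflops(1152, 915)   # ~2108 GFLOPS
maxwell_860m = gflops(640, 1085)  # ~1389 GFLOPS
```

    Note that this is a theoretical ceiling, not a measured number, which is exactly why the two chips can land so close in real games despite the gap on paper.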
  • Meaker10 - Thursday, March 12, 2015 - link

    Yet the Maxwell part performs on par at stock in games and overclocks much better. Theoretical peak FLOPS is not a good measure of performance, and this is proof.
  • yannigr2 - Thursday, March 12, 2015 - link

    If the overclock isn't locked in the BIOS, that is, based on the latest rumors.
  • Amoro - Thursday, March 12, 2015 - link

    I agree that GFLOPS are not the best indicator of performance, especially between different architectures. But it's just so hard to believe that it's 10% faster on average with nearly half the CUDA cores, a 232+ MHz higher clock, and architectural improvements. The only review I could find was from Notebookcheck: http://www.notebookcheck.net/Review-Nvidia-GeForce...

    Something still doesn't seem right.
  • JarredWalton - Thursday, March 12, 2015 - link

    Each Maxwell core has been optimized to perform certain calculations better than a Kepler core, so even against Kepler parts with many more CUDA cores, Maxwell can often keep up with and surpass them -- depending on the game, of course. The real problem here is that we now have a second generation of mobile "Maxwell 1.0" parts while the desktop is moving to "Maxwell 2.0" for most of its current-generation GPUs.

    Bottom line: get a GTX 965M or above if you care about gaming performance.
  • Penti - Thursday, March 12, 2015 - link

    So no HDMI 2.0 600MHz yet?
  • Tegeril - Saturday, March 14, 2015 - link

    Which then raises the question: what is Apple going to put in the next 15" rMBP? Might we see another flip back to AMD? The 965M is probably too power hungry.
  • JarredWalton - Monday, March 16, 2015 - link

    I think the AMD GPUs are probably less efficient overall, but who knows? My guess would be a 940M, or maybe Apple will do a custom order from NVIDIA where they get a 128-bit memory bus with something like 512 cores. Really, I still don't get why NVIDIA isn't doing any GM206 mobile parts yet -- do one of those with lower clocks and maybe disable a couple SMX as well, and it could be a very interesting part. Maybe Apple will do a custom order to get what they want for the next rMBP...or maybe Apple will just use Broadwell GT4e and have no dGPU?
  • SPBHM - Thursday, March 12, 2015 - link

    I hate how they have two very differently performing models under the same name: one 950M with 80 GB/s of memory bandwidth and another with 32 GB/s, and most consumers can't tell which version they're buying. The DDR3 model should be called 950M LE, 940M, or something else.
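    The gap between those two bandwidth figures is pure memory-spec arithmetic. A minimal sketch, assuming a 128-bit bus on both variants with 5 GT/s effective GDDR5 versus 2 GT/s DDR3 (rates consistent with the 80 vs. 32 GB/s numbers above):

```python
# Peak memory bandwidth = bus width in bytes x effective data rate.
def bandwidth_gbs(bus_width_bits: int, data_rate_gtps: float) -> float:
    return bus_width_bits / 8 * data_rate_gtps

gddr5_950m = bandwidth_gbs(128, 5.0)  # 80.0 GB/s
ddr3_950m = bandwidth_gbs(128, 2.0)   # 32.0 GB/s
```

    So the DDR3 model gives up 60% of the card's memory bandwidth while keeping the exact same name on the box.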
  • Hrel - Thursday, March 12, 2015 - link

    Yeah, they've been doing this for too long, and it's as infuriating as ever. It makes me happy when I can justify not buying Nvidia, which I can't do at all in laptops; on the desktop, though, my hatred for Nvidia's business practices keeps me from buying them. It's just a bonus when AMD offers better bang for the buck, which they have quite consistently.

    I wish AMD could come up with legitimate competition in mobile, however; they need their own switchable-graphics solution that matches Optimus in performance and stability.
  • Dark_Archonis - Thursday, March 12, 2015 - link

    Yes, I can't stand that either. I wish it were illegal for technology to be sold under the exact same model name but with vastly different internal configurations.

    I still buy Nvidia, because I absolutely can't stand AMD, and there are no other viable alternatives. That's why I usually stick to the mid-to-high-end Nvidia offerings that don't pull any of this shady nonsense with one model name having two vastly different RAM configurations and speeds.
  • MrSpadge - Thursday, March 12, 2015 - link

    I agree in some ways: seeing those mobile specs pretty much always makes me want to puke. It's so clearly aimed at "but it's got the bigger numbers" customers, and it makes me sad that they get away with it. Even more so that the OEMs are probably demanding it be this way.

    Anyway, my conclusion is the opposite of yours: nVidia for the desktop (to run GPU-Grid and other GPGPU software 24/7 energy-efficiently), but not for mobile. I wouldn't want an AMD there either, though.
  • JarredWalton - Thursday, March 12, 2015 - link

    Truth is, NVIDIA only makes the DDR3 GTX 950M for one reason: there's at least one major notebook OEM that wants to save a few pennies. I'll refrain from pointing out the culprit, but I'm sure others can do so.
  • jeffry - Friday, March 13, 2015 - link

    I agree on that one. With older desktop GPUs, the designated names are even worse in some cases (e.g., GT 630 / GT 640; check them on the NVIDIA homepage and take a look at the specifications).
  • Aikouka - Thursday, March 12, 2015 - link

    Is the 960M the exact same as the 860M, or does it have some of the hardware changes that the desktop 960 has (i.e. PureVideo that supports h.265)?
  • MrSpadge - Thursday, March 12, 2015 - link

    It's the same chip, so no changes apart from a handful of MHz more.
  • jeffkibuule - Thursday, March 12, 2015 - link

    How do these compare to the 750M in the MacBook Pro (which feels like only a small bump over the 650M from *gasp* 2012)? Am I going to have to wait another year for a *real* upgrade from three-year-old tech? Where are the 20nm or 14/16nm FinFET GPUs? Why does mobile always get the shaft? =/
  • JarredWalton - Thursday, March 12, 2015 - link

    Generally speaking, the GTX 860M is nearly twice as fast as the GT 750M:
    http://www.anandtech.com/bench/product/1142?vs=128...
  • eanazag - Friday, March 13, 2015 - link

    Then it is the exact same hardware as the 860M Maxwell part. What Nvidia and AMD tend to do is differentiate supported features between the newer and older model numbers. Otherwise it's a clock bump and likely a more mature process.

    In mobile, both companies (A and N) rebadge frequently. AMD does this a lot on the desktop too: the AMD 7970 and R9 280 are the same thing, but differ in long-term driver support.

    I don't mind a rebadge if there is a difference in how it is treated from a feature standpoint. I have gripes with rebadging when it results in support mismatches - like DirectX versions or simple features missing.

    An example is AMD's R9 285 having better software feature support than the R9 290 in the Omega driver. There are hardware differences, and I understand that. I just don't get why the best features go to the #3 card in the lineup. Release at the tippy top and refresh on down. Whatever.
  • Sushisamurai - Thursday, March 12, 2015 - link

    Is it just me, or are the x60/x50 series degrading in performance? The process hasn't changed, but we're getting fewer and fewer shader cores and smaller memory interfaces. Sure, we get more battery life, but with so many rebadges and so little increase in performance, I feel a little shafted.
  • Gigaplex - Thursday, March 12, 2015 - link

    It's not just you.
  • Atakelane - Friday, March 13, 2015 - link

    "920M is a Kepler design", while the official source says it is of Maxwell gen
    http://www.geforce.com/hardware/notebook-gpus/gefo...
    ??????
  • JarredWalton - Friday, March 13, 2015 - link

    I believe the "official page" is wrong, as NVIDIA technical marketing specifically told me it was based on Kepler. Plus:
    http://www.geforce.com/hardware/notebook-gpus/gefo...
    http://www.geforce.com/hardware/notebook-gpus/gefo...
    http://www.geforce.com/hardware/notebook-gpus/gefo...

    I've notified NVIDIA of the website error.
  • garadante - Friday, March 13, 2015 - link

    Somewhat off-topic, but this is Nvidia and Maxwell: are we ever going to see that 960 review?
  • JarredWalton - Friday, March 13, 2015 - link

    I'm not sure if Ryan will ever do a full 960 review at this stage, but the performance numbers will likely be included in upcoming GPU reviews. Fundamentally, the 960 isn't much more than the same Maxwell 2.0 architecture in a different configuration (GM206 has half the shaders and memory bus of GM204), so performance, price, and power use are the only real factors to investigate.
  • Crunchy005 - Friday, March 13, 2015 - link

    Thanks, marketing, for ruining tech. These can barely be called performance boosts at all: they rebrand to a new badge with the same tech, and all they did was clock it a bit higher. It's like all tech these days: everyone makes a huge deal over a 5% bump year to year, if even that, but they rebrand and advertise big performance increases to sell. It's already hard enough to find a discrete GPU in a laptop with Intel HD running wild, and companies hardly give you all the info you need when there is one. Good luck finding the specifics on that 950M; hope it isn't the DDR3 one. Nvidia makes great cards, but this is ridiculous.
  • mapesdhs - Tuesday, March 17, 2015 - link


    It annoys me intensely that so much of this stuff in the laptop world is dominated by
    what is (let's face) an unbelievable amount of jargon puking. It really should not be
    necessary for any end user to have to wade through this level of spec detail about
    laptop-based gfx options.

    For all the supposed specs given, they convey no real idea how one option compares
    to another, what level of gaming they're aimed at, how they compare to older products, etc.
    And like others, I loathe the rebranding and spec overlap for old vs. new. Much more so
    than desktop GPUs, laptop GPU tech feels again and again like marketing spin gone mad.
    It wouldn't be quite so infuriating if laptops were actually built to last more than a year.

    Ian.
  • mapesdhs - Tuesday, March 17, 2015 - link

    Sorry for typos, was supposed to be, "(let's face it)" - why can't we edit posts?? Get
    with the 21st century already! :\

    Ian.
