A Bit More On Graphics Core Next 1.1

With the launch of Hawaii, AMD is finally opening up a bit more on what Graphics Core Next 1.1 entails. No, they still aren’t giving us an official name – most references to GCN 1.1 are noting that 290X (Hawaii) and 260X (Bonaire) are part of the same IP pool – but now that their new flagship is out, they’re at least willing to discuss the official feature set.

So what does it mean to be Graphics Core Next 1.1? As it turns out, the leaked “AMD Sea Islands Instruction Set Architecture” from February appears to be spot on. Naming issues with Sea Islands aside, everything AMD has discussed as being new architecture features in Hawaii (and therefore also in Bonaire) previously showed up in that document.

As such, the bulk of the changes that come with GCN 1.1 are compute oriented, and are clearly intended to play into AMD’s plans for HSA by adding features that are especially useful for the style of heterogeneous computing AMD is shooting for.

The biggest change here is support for flat (generic) addressing, which will be critical to enabling effective use of pointers within a heterogeneous compute context. Coupled with that is a subtle change to how the ACEs (compute queues) work, allowing GPUs to have more ACEs and more queues in each ACE, versus the hard limit of 2 we’ve seen in Southern Islands. The number of ACEs is not fixed – Hawaii has 8 while Bonaire only has 2 – which means the count can be scaled up for higher-end GPUs, console APUs, etc. Finally, GCN 1.1 also introduces some new instructions, including a Masked Quad Sum of Absolute Differences (MQSAD) and some FP64 floor/ceiling/truncation vector functions.
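
Since flat addressing is the headline change, a quick sketch may help illustrate what it buys programmers. The OpenCL C 2.0-style kernel below is purely our own illustration (hypothetical names, not AMD sample code): with generic pointers, a single helper function can operate on global and local memory alike, where the segmented addressing of GCN 1.0-era programming models would require a separate variant per address space.

```c
/* A minimal sketch of generic (flat) addressing, in the style of the
 * provisional OpenCL C 2.0 spec. Names and structure are our own
 * illustration. In OpenCL 1.x this helper would need a separate
 * overload for each address space (__global, __local, __private);
 * with a generic pointer one version suffices, and flat addressing in
 * hardware resolves at run time which memory the address targets. */

/* 'p' is an unqualified, i.e. generic (flat), pointer */
float sum4(const float *p)
{
    return p[0] + p[1] + p[2] + p[3];
}

__kernel void flat_addressing_demo(__global const float *in,
                                   __global float *out)
{
    __local float tile[4];
    size_t gid = get_global_id(0);
    size_t lid = get_local_id(0);

    /* Stage a few values into local (LDS) memory... */
    if (lid < 4)
        tile[lid] = in[gid];
    barrier(CLK_LOCAL_MEM_FENCE);

    /* ...then call the same helper on a global pointer and a local
     * pointer. Both implicitly convert to the generic address space. */
    out[gid] = sum4(in) + sum4(tile);
}
```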

Along with these architectural changes, there are a couple of other hardware features that at this time we feel are best lumped under the GCN 1.1 banner when talking about PC GPUs, as GCN 1.1 parts were the first to introduce these features and every GCN 1.1 part (at least thus far) has them. AMD’s TrueAudio would be a prime example of this, as both Hawaii and Bonaire have integrated TrueAudio hardware, with AMD setting clear expectations that we should also see TrueAudio on future GPUs and future APUs.

AMD’s Crossfire XDMA engine is another feature that is best lumped under the GCN 1.1 banner. We’ll get to the full details of its operation in a bit, but the important part is that it’s a hardware level change (specifically an addition to their display controller functionality) that’s once again present in Hawaii and Bonaire, although only Hawaii is making full use of it at this time.

Finally, we’d also roll AMD’s power management changes into the general GCN 1.1 family, again for the basic reasons listed above. AMD’s new Serial VID interface (SVI2), necessary for the large number of power states Hawaii and Bonaire support and the fast switching between them, is something that only shows up starting with GCN 1.1. AMD has implemented power management a bit differently in each product from an end user perspective – Bonaire parts have the states but lack the fine-grained throttling controls that Hawaii introduces – but the underlying hardware is identical.

With that in mind, that’s a short but essential summary of what’s new with GCN 1.1. As we noted way back when Bonaire launched as the 7790, the underlying architecture isn’t going through any massive changes, and as such the differences are primarily of interest to programmers rather than end users. But they are distinct differences that will play an important role as AMD gears up to launch HSA next year. Consequently, what limited fracturing there is between GCN 1.0 and GCN 1.1 is primarily due to the ancillary features, which unlike the core architectural changes are going to be of importance to end users. The addition of XDMA, TrueAudio, and improved power management (SVI2) are all small features on their own, but they are features that make GCN 1.1 a more capable, more reliable, and more feature-filled design than GCN 1.0.

Comments

  • Spunjji - Friday, October 25, 2013 - link

    Word.
  • extide - Thursday, October 24, 2013 - link

    That doesn't mean that AMD can't come up with a solution that might even be compatible with G-Sync... Time will tell...
  • piroroadkill - Friday, October 25, 2013 - link

    That would not be in NVIDIA's best interests. If a lot of machines (AMD, Intel) won't support it, why would you buy a screen for a specific graphics card? Later down the line, maybe something like the R9 290X comes out, and you can save a TON of money on a high-performing graphics card from another team.

    It doesn't make sense.

    For NVIDIA, their best bet for getting this out there and making the most money from it is licensing it.
  • Mstngs351 - Sunday, November 3, 2013 - link

    Well it depends on the buyer. I've bounced between AMD and Nvidia (to be upfront I've had more Nvidia cards) and I've been wanting to step up to a larger 1440 monitor. I will be sure that it supports Gsync as it looks to be one of the more exciting recent developments.

    So although you are correct that not a lot of folks will buy an extra monitor just for Gsync, there are a lot of us who have been waiting for an excuse. :P
  • nutingut - Saturday, October 26, 2013 - link

    Haha, that would be something for the cartel office then, I figure.
  • elajt_1 - Sunday, October 27, 2013 - link

    This doesn't prevent AMD from making something similar, if Nvidia decides not to make it open.
  • hoboville - Thursday, October 24, 2013 - link

    Gsync will require you to buy a new monitor. Dropping more money on graphics and smoothness will apply at the high end and for those with big wallets, but for the rest of us there's little point to jumping into Gsync.

    In 3-4 years when IPS 2560x1440 has matured to the point where it's both mainstream (cheap) and capable of delivering low-latency ghosting-free images, then Gsync will be a big deal, but right now only a small percentage of the population have invested in 1440p.

    The fact is, most people have been sitting on their 1080p screens for 3+ years and probably will for another 3 unless those same screens fail – $500+ for a desktop monitor is a lot to justify. Once the monitor upgrades start en masse, then Gsync will be a market changer because AMD will not have anything to compete with.
  • misfit410 - Thursday, October 24, 2013 - link

    G-String didn't kill anything, I'm not about to give up my Dell Ultrasharp for another proprietary Nvidia tool.
  • anubis44 - Tuesday, October 29, 2013 - link

    Agreed. G-sync is a stupid solution to a non-existent problem. If you have a fast enough frame rate, there's nothing to fix.
  • MADDER1 - Thursday, October 24, 2013 - link

    Mantle could be for either higher frame rate or more detail. Gsync sounds like just frame rate.
