Comparing Percentile Numbers Between the GTX 980 Ti and Fury X

As the two top-end cards from both graphics silicon manufacturers were released this year, there was a big buzz about which is best for what. Ryan's extensive review of the Fury X put the two cards head to head in a variety of contests. For DirectX 12 the situation is less clear cut, for a number of reasons: games have yet to mature, drivers are still in development, and both competitors are having to rethink their strategies when it comes to game engine integration and the benefits it might provide. Up until this point, DX12 contests have either been synthetic or dogged by controversy. So for Fable Legends, we did some extra percentile-based analysis for NVIDIA vs. AMD at the top end.

For this set of benchmarks we ran our 1080p Ultra test with any adaptive frame rate technology enabled and recorded the result:

For these tests, the usual rules apply – GTX 980 Ti and Fury X, in our Core i7/i5/i3 configurations, at all three resolution/setting combinations (3840x2160 Ultra, 1920x1080 Ultra, and 1280x720 Low). Data is given in the form of frame rate profile graphs, similar to those on the last page.

As always, Fable Legends is still in early access preview mode and these results may not be indicative of the final version, but at this point they still provide an interesting comparison.
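To make the frame rate profile graphs below easier to interpret, here is a rough sketch of how such a profile can be derived from raw frame times. This is an illustrative reconstruction with made-up example data, not the actual tooling used for these benchmarks:

```python
# Sketch of a percentile-based frame rate profile computed from raw
# frame times. Example data is hypothetical, not from the actual runs.

def frame_rate_profile(frame_times_ms, percentiles=range(0, 101, 10)):
    """Return (percentile, fps) pairs: at the Nth percentile,
    N% of frames rendered at least this fast."""
    # Sort fastest-first so low percentiles correspond to the easy frames
    sorted_times = sorted(frame_times_ms)
    n = len(sorted_times)
    profile = []
    for p in percentiles:
        # Index of the frame at this percentile (clamped to the last frame)
        idx = min(n - 1, int(p / 100 * n))
        fps = 1000.0 / sorted_times[idx]
        profile.append((p, fps))
    return profile

# Example: mostly ~16 ms frames with a few 33 ms spikes near the tail
times = [16.0] * 90 + [33.0] * 10
for p, fps in frame_rate_profile(times, [0, 50, 95]):
    print(f"{p}th percentile: {fps:.1f} fps")
```

Plotting these pairs with frame rate on the y-axis gives the profile curves shown on this page: the left of the curve shows the easy frames, the right tail shows the slowest ones, which is where minimum frame rate behavior lives.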


At 3840x2160, the frame rate profile of each card looks the same no matter which processor is used (one could argue that the Fury X is mildly ahead on the i3 at low frame rates), but the GTX 980 Ti holds a consistent gap across most of the profile range.


At 1920x1080, the Core i7 configuration gives a healthy boost to the GTX 980 Ti in high frame rate scenarios, though this seems to be accompanied by an extended drop-off region at the high end of the profile. It is also interesting that with the Core i3, the Fury X results jump up and match the GTX 980 Ti across almost the entire range. This echoes the data we saw on the previous page – at 1080p, having fewer cores somehow gave the results a boost in the lighting scenarios.


At 1280x720, as we saw on the initial GPU comparison page for average frame rates, the Fury X has the upper hand in all system configurations. Two other points are noticeable here: moving from the Core i5 to the Core i7, especially on the GTX 980 Ti, makes the easy frames render quicker while the harder frames take longer; and when we move to the Core i3, performance drops across the board, indicating a CPU-limited environment. That said, 1280x720 at low settings is unlikely to be used with these cards anyway.

141 Comments


  • Traciatim - Thursday, September 24, 2015 - link

    RAM generally has very little to no impact on gaming except for a few strange cases (like F1).

    Though, the machine still has its cache available, so while the i3 test isn't quite the same thing as a real i3, it should be close enough that you wouldn't notice the difference.
  • Mr Perfect - Thursday, September 24, 2015 - link

    In the future, could you please include/simulate a 4 core/8 thread CPU? That's probably what most of us have.
  • Oxford Guy - Thursday, September 24, 2015 - link

    How about Ashes running on a Fury and a 4.5 GHz FX CPU.
  • Oxford Guy - Thursday, September 24, 2015 - link

    and a 290X, of course, paired against a 980
  • vision33r - Thursday, September 24, 2015 - link

    Just because a game supports DX12 doesn't mean it uses all DX12 features. It looks like they have DX12 as a checkbox but aren't really utilizing the complete DX12 feature set. We have to see more DX12 implementations to know for sure how each card stacks up.
  • Wolfpup - Thursday, September 24, 2015 - link

    I'd be curious about a DirectX 12 vs. 11 test at some point.

    Regarding Fable Legends, WOW am I disappointed by what it is. I shouldn't be in a sense, I mean I'm not complaining that Mario Baseball isn't a Mario game, but still, a "free" to play deathmatch type game isn't what I want and isn't what I think of with Fable (Even if, again, really this could be good for people who want it, and not a bad use of the license).

    Just please don't make a sequel to New Vegas or Mass Effect or Bioshock that's deathmatch LOL
  • toyotabedzrock - Thursday, September 24, 2015 - link

    You should have used the new driver, given you were told it was related to this specific game preview.
  • Shellshocked - Thursday, September 24, 2015 - link

    Does this benchmark use Async compute?
  • Spencer Andersen - Thursday, September 24, 2015 - link

    Negative, Unreal Engine does NOT use Async compute except on Xbox One. Considering that is one of the main features of the newer APIs, what does that tell you? Nvidia+Unreal Engine=BFF But I don't see it as a big deal considering that Frostbite and likely other engines already have most if not all DX12 features built in including Async compute.

    Great article guys, looking forward to more DX12 benchmarks. It's an interesting time in gaming to say the least!
  • oyabun - Thursday, September 24, 2015 - link

    There is something wrong with the webpages of the article, an ad by Samsung seems to cover the entire page and messes up all the rendering. Furthermore wherever I click a new tab opens at www.space.com! I had to reload several times just to be able to post this!
