As something of a counter-event to NVIDIA’s gaming showcase taking place in Montreal, Canada this week, AMD has organized an early, brief reveal of their forthcoming Radeon R9 290X video card. The card won’t be launching for a while yet, but for today we’re able to confirm that we have the card in hand, and we’ve been allowed to publish a single benchmark: Bioshock Infinite at 3840x2160 (4K).

AMD has purposely kept the public details on the R9 290X sparse so far, so we know little other than that it’s a larger GPU rated for 5 TFLOPS of compute performance and paired with 4GB of memory, delivering a total memory bandwidth of over 300GB/sec. Like most segments of the consumer electronics industry, AMD has been gearing up for 3840x2160 (4K) displays, and the R9 290X is AMD’s flagship card aimed at gamers using 3840x2160 or 2560x1440 monitors.

Consequently AMD is seeking to draw attention to their 4K performance with today’s benchmark reveal. AMD named the game, the cards, and the resolution – Bioshock Infinite at 4K against the GTX 780 – so this is a very limited subset of our full results. And as with all controlled benchmark releases, we’d advise not reading too much into any single benchmark here, as the relative performance of NVIDIA and AMD cards changes with the game being tested, at times rather wildly.

[Benchmark chart: Bioshock Infinite - 3840x2160]

The biggest problem with 4K displays for at least the near future, other than price of course, is that you’re either going to need a lot of GPU power to drive them or will have to take a quality hit to achieve acceptable performance. Neither the R9 290X nor the GTX 780 is powerful enough to stay above 30fps in Bioshock with everything turned up. For that you will need to drop down to Medium quality, which pushes performance past 30fps and up into the 60fps range. The fact that we’re even talking about playing a game at 60fps at this high a resolution – with 2.25 times as many pixels as 2560x1440 – is a big accomplishment right there; it’s just not one that will come without tradeoffs. For little-to-no-compromise 4K gaming we’ll undoubtedly need to turn to multiple GPUs and Crossfire/SLI.
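For a sense of just how much more work 4K asks of a GPU, the pixel math is worth spelling out. Here’s a minimal Python sketch; the resolutions are the ones discussed in this article, and the script itself is purely illustrative:

```python
# Pixel counts for the resolutions in question, to show how much more
# work a GPU takes on at 4K.
resolutions = {
    "1920x1080": (1920, 1080),
    "2560x1440": (2560, 1440),
    "3840x2160 (4K)": (3840, 2160),
}

base = 2560 * 1440  # the 2560x1440 reference point used in the article
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.2f} MPix ({pixels / base:.2f}x 2560x1440)")

# 3840x2160 works out to 8.29 MPix -- exactly 2.25x the 3.69 MPix of
# 2560x1440, and 4x the 2.07 MPix of 1080p.
```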

Moving on, it’s interesting to note that both cards are essentially tied at Ultra quality, but when we dial down to Medium the 290X takes a very decisive 14% lead. At Ultra we should be shader/texture bound, due to Bioshock’s significant use of shader effects at its highest quality settings, whereas at lower quality settings at least some of the bottleneck will shift to elements such as ROP throughput, memory bandwidth, and the geometry pipeline.
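As a crude illustration of the fixed-function side of that shift, here’s a minimal Python sketch of the bare-minimum pixel write rate at 4K. This is a simplification under stated assumptions – real frames involve overdraw, blending, and post-processing, and none of these figures are measurements from either card:

```python
# Bare-minimum ROP pixel writes per second just to present each frame at
# a given resolution and framerate. Real workloads write far more pixels
# (overdraw, blending, post-processing); this is only a lower bound.
def min_fill_rate_gpix(width, height, fps):
    return width * height * fps / 1e9

for fps in (30, 60):
    rate = min_fill_rate_gpix(3840, 2160, fps)
    print(f"3840x2160 @ {fps}fps: {rate:.2f} GPix/s minimum")

# At Ultra, most of each frame goes to shading every pixel, so ALU and
# texture throughput dominate; at Medium the shader cost per pixel drops,
# and fixed-function limits like ROP and memory throughput carry more weight.
```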

Wrapping this preview up, we’ll have more details on the 290X in the near future. AMD has made it clear that they are aiming high with their new flagship video card, so it will be interesting to see what they can pull off as we approach Tahiti/7970’s second birthday.

Comments

  • nathanddrews - Friday, October 18, 2013

    There's a huge difference between pushing 2MP (1080p) and 8MP (4K). That's why I linked to a 6MP (triple screen) example.
  • pattycake0147 - Thursday, October 17, 2013

    Maybe that's planned for the 270X and 260X article that was promised in the 280X launch piece. I hope it comes out soon as that's the article I'm most interested in.
  • Hrel - Thursday, October 17, 2013

    Bioshock favors AMD.
  • tviceman - Thursday, October 17, 2013

    It's a GE title, if anything it should "favor" AMD.
  • Ryan Smith - Thursday, October 17, 2013

    I only have 1 270X, so there won't be any CF results at this time.
  • MoreDinosaurs - Thursday, October 17, 2013

    Outside of the realm of benchmarks, how useful/enjoyable/different is running Bioshock at 4k/60fps? Curious about this strange 4k future...
  • inighthawki - Thursday, October 17, 2013

    I’d like to know too, but it’ll probably be a lot more beneficial on large displays (30" or greater), where pixel density is usually incredibly poor. On a smaller display such as 22-24" I’d be surprised if it looks much different than 1080p with good AA and AF. We’ll have to wait and see :)
  • A5 - Thursday, October 17, 2013

    Yeah. Not that I have the budget for a 4K monitor anyway, but I'd like to see a subjective comparison of how it feels compared to 1080p or 1440p, especially with the quality tradeoff.
  • Sancus - Thursday, October 17, 2013

    It’s pretty cool. You can certainly see higher detail than 1080p in scenes with little or no movement. When there’s a lot of movement (turning around, panning, etc.), 120Hz/LightBoost lets you see more detail. For FPSes specifically I would say that higher refresh rates/LightBoost add more to the experience than higher resolutions do.

    However, if you like exploring in games like Skyrim, or you enjoy tile-based RTSes/RPGs/strategy games, 4K is pretty cool because you definitely see a lot more detail when you’re not jumping/turning around like mad, scenes are sharper, and in the latter case with tile-based games, you get to see a lot more of the map at once without sacrificing the ability to read text. Civ5 is pretty awesome, for example.

    The one big problem with 4K and other high resolutions is that at much smaller monitor sizes (say, 3200x1800 13.3 inch panels or a hypothetical 3840x2160 24 inch panel) scaling becomes a big issue, and even in Windows 8.1 the majority of applications you are going to use have messed-up UIs and problems with scaling. Web browsers and built-in Windows apps tend to be okay, but as soon as you get outside that box, the higher you set the scaling the weirder things are going to get, and some apps like Skype will just be completely unusable because they don’t support scaling at all. Fortunately, my monitor is 31.5 inches so it’s only 140dpi (the math is sketched just after the comments), which means that I can get away with 125% scaling, and apps that don’t support it at all (Steam, Skype) are still completely usable if a little uncomfortable at times.

    reference: I own an Asus PQ321.
  • Spunjji - Friday, October 18, 2013

    Cheers for the evaluation!
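
For reference on the pixel density figures Sancus mentions, here’s a minimal Python sketch that reproduces them; the panel sizes are the ones named in the comment above:

```python
import math

# Pixels per inch given a panel's resolution and diagonal size.
def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

print(f'31.5" 3840x2160 (Asus PQ321): {ppi(3840, 2160, 31.5):.0f} ppi')   # ~140, as stated above
print(f'24"   3840x2160 (hypothetical): {ppi(3840, 2160, 24):.0f} ppi')   # ~184, where scaling bites
print(f'13.3" 3200x1800 (laptop panel): {ppi(3200, 1800, 13.3):.0f} ppi') # ~276
```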
