The NVIDIA GeForce RTX 2070 Super & RTX 2060 Super Review: Smaller Numbers, Bigger Performance
by Ryan Smith on July 2, 2019 9:00 AM EST - Posted in: GPUs, GeForce, NVIDIA, Turing, GeForce RTX
The 2019 GPU Benchmark Suite & The Test
As we’re kicking off a new(ish) generation of video cards, we’re also kicking off a new generation of the AnandTech GPU benchmark suite.
For 2019 most of the suite has been refreshed to include games released in the last year. The latest iteration of the Tomb Raider franchise, Shadow of the Tomb Raider, is 2019’s anchor title and is the game used for power/temperature/noise testing as well as game performance testing. Also making its introduction to the GPU benchmark suite for the first time is an Assassin’s Creed game, thanks to Assassin’s Creed Odyssey’s extra-handy built-in benchmark.
For 2019 Ashes of the Singularity has been rotated out, so we’re empty on RTSes at the moment. But as an alternative we have Microsoft’s popular Forza Horizon 4, which marks the first time a Forza game has been included in the suite.
AnandTech GPU Bench 2019 Game List

| Game | Genre | Release Date | API |
|---|---|---|---|
| Shadow of the Tomb Raider | Action/TPS | Sept. 2018 | DX12 |
| F1 2019 | Racing | Jun. 2019 | DX12 |
| Assassin's Creed Odyssey | Action/Open World | Oct. 2018 | DX11 |
| Metro Exodus | FPS | Feb. 2019 | DX12 |
| Strange Brigade | TPS | Aug. 2018 | Vulkan |
| Total War: Three Kingdoms | TBS | May 2019 | DX11 |
| The Division 2 | FPS | Mar. 2019 | DX12 |
| Grand Theft Auto V | Action/Open World | Apr. 2015 | DX11 |
| Forza Horizon 4 | Racing | Oct. 2018 | DX12 |
All told, I’m pleasantly surprised by the number of DirectX 12-enabled AAA games available this year. More than half of the benchmark suite is using DX12, with both AMD and NVIDIA cards showing performance gains across all of the games using this API. So this is a far cry from the early days of DX12, where using the low-level API would often send performance backwards. And speaking of low-level APIs, I’ve also thrown in Strange Brigade for this iteration, as it’s one of the only major Vulkan games to be released in the past year.
Finally, I’ve also kept Grand Theft Auto V as our legacy game for 2019. Despite being released for the PC over 4 years ago – and for game consoles 2 years before that – the game continues to be one of the top selling games on Steam. And even with its age, the scalability of the game means that it’s a heavy enough load to challenge even the latest video cards.
As for our hardware testbed, it too has been updated for the 2019 video card release cycle.
Internally we’ve made a pretty big change, going from an Intel HEDT platform (Core i7-7820X) to a standard desktop platform based around an overclocked Core i9-9900K and Z390 chipset. While we’ve used HEDT platforms for the GPU testbed for the last decade, HEDT is becoming increasingly irrelevant/compromised for gaming; while the extra PCIe lanes are nice, these platforms haven’t delivered the best CPU performance for games as of late.
By contrast, desktop processors with 8 cores now provide more than enough cores, and they also provide far better clockspeeds, delivering more of the single/lightly-threaded performance that games need. Furthermore, as SLI and Crossfire are on the rocks, the extra PCIe lanes aren’t as necessary as they once were.
On a side note, I had originally hoped to cycle in a Ryzen 3000 platform at this point, particularly for PCIe 4.0. However the timing of all of these hardware launches meant that we needed to go with an established platform, as it takes a week or so to build and validate a new GPU testbed. Plus with Ryzen 3000 not launching for another week, we wouldn’t have been able to use it for this review anyhow.
Otherwise the rest of our 2019 GPU testbed is relatively straightforward. With 32GB of RAM and a high-end Phison E12-based NVMe SSD, the system and any video cards being tested are well-fed. Enclosing all of this for our real-world style testing is our trusty NZXT Phantom 630 Windowed Edition case.
| Component | Hardware |
|---|---|
| CPU | Intel Core i9-9900K @ 5.0GHz |
| Motherboard | ASRock Z390 Taichi |
| Power Supply | Corsair AX1200i |
| Storage | Phison E12 PCIe NVMe SSD (960GB) |
| Memory | G.Skill Trident Z RGB DDR4-3600 2 x 16GB (17-18-18-38) |
| Case | NZXT Phantom 630 Windowed Edition |
| Monitor | Asus PQ321 |
| Video Cards | NVIDIA GeForce RTX 2070 Super Founders Edition<br>NVIDIA GeForce RTX 2060 Super Founders Edition<br>NVIDIA GeForce RTX 2080 Founders Edition<br>NVIDIA GeForce RTX 2070 Founders Edition<br>NVIDIA GeForce RTX 2060 Founders Edition<br>AMD Radeon RX Vega 64 |
| Video Drivers | NVIDIA Release 431.15<br>AMD Radeon Software Adrenalin 2019 Edition 19.6.3 |
| OS | Windows 10 Pro (1903) |
281 Comments
Koenig168 - Tuesday, July 2, 2019 - link
You can look at the RTX 2080 review. The RTX 2080 is slightly faster than the GTX 1080 Ti, and the RTX 2070 Super is slightly slower than the RTX 2080. Therefore, the RTX 2070 Super should be around GTX 1080 Ti performance.

wolfwalker78 - Tuesday, July 2, 2019 - link
Would have been nice to see some GTX numbers in there for comparison, I can't be the only person still running a 1080 or something that is still at least semi competitive. Hell I've got a house full of 1060's on 1080p screens and haven't seen any reason to touch them yet. Also, F these prices. The new norm for GPU cost blows.

imaheadcase - Tuesday, July 2, 2019 - link
Yes, yes you are the only one.

Korguz - Tuesday, July 2, 2019 - link
um.. no hes not...

catavalon21 - Tuesday, July 2, 2019 - link
um...I believe the sarcasm filter was wide open...

Meteor2 - Saturday, July 6, 2019 - link
I don't think we're going to see much progress with 1080p, not for a long time. We have 60fps and little sign of increasing graphics fidelity, which is going to push that fps down on any hardware which currently achieves it.

Dug - Tuesday, July 2, 2019 - link
Looks like my 1080ti will hold out another year. 2+ years seems like forever.

eastcoast_pete - Tuesday, July 2, 2019 - link
@Ryan: Firstly, thanks for the quick review of these "S" cards by Nvidia. I have two questions about your description of the 2070S, where you write "All told, NVIDIA has disabled 8 of TU104's 48 SMs here, leaving a card with 40 SMs, or 2560 Turing CUDA cores." My questions are: Are those chips lower binned (partially defective) big Turings that are then "cut" down to exactly 40 SMs? And, regardless of the binning question, how does Nvidia disable SMs? Laser them out? Thanks for answering!

Ryan Smith - Tuesday, July 2, 2019 - link
"Are those chips lower binned (partially defective) big Turings that are then "cut" down to exactly 40 SMs?"

They don't have to be, but generally yes.
"And, regardless of the binning question, how does Nvidia disable SMs? Laser them out?"
Lasers and eFuses, as I understand it. Either way it's very much baked into the GPU itself.
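For readers curious how the SM figures quoted above map to CUDA core counts: Turing GPUs carry 64 FP32 CUDA cores per SM, so fusing off SMs removes cores in blocks of 64. A minimal sketch of that arithmetic (the function and constant names here are illustrative, not anything from NVIDIA's tooling):

```python
CORES_PER_SM = 64  # FP32 CUDA cores per SM on Turing

def cuda_cores(enabled_sms: int) -> int:
    """Total CUDA cores for a Turing chip with the given number of active SMs."""
    return enabled_sms * CORES_PER_SM

# TU104 as described in the review: 48 SMs on the full die,
# with 8 fused off for the RTX 2070 Super.
full_tu104 = cuda_cores(48)      # full die
rtx_2070_super = cuda_cores(40)  # 2560 cores, matching the figure quoted above
```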
eastcoast_pete - Tuesday, July 2, 2019 - link
Thanks Ryan! So, maybe they do have a bunch of lower binned Turings that needed a home (and a paying customer). Dating myself here, but, many, many years ago, NVIDIA had a GeForce card that could be made into a Quadro costing 3x as much by changing a connection with a soldering iron, and a very (!) steady hand. I never dared to try, as one wrong move with that soldering tip could trash the entire card, no repair possible.