Ashes of the Singularity Revisited: A Beta Look at DirectX 12 & Asynchronous Shading
by Daniel Williams & Ryan Smith on February 24, 2016 1:00 PM EST

We've been following DirectX 12 for about two years now, watching Microsoft's next-generation low-level graphics API go from an internal development project to a public release. Though harder to use than earlier high-level APIs like DirectX 11, DirectX 12 gives developers more control than ever before, and those who can tame it can unlock performance and rendering techniques simply not possible with earlier APIs. With the CPU bottlenecks of DirectX 11 coming into full view as single-threaded performance gains have slowed and CPUs have grown their core counts instead, DirectX 12 could not have come at a better time.
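A big part of DX12's answer to that CPU bottleneck is that command lists can be recorded on many threads in parallel and then submitted to the GPU with a single inexpensive call. The C++ sketch below illustrates the general pattern (our own illustration of the D3D12 interfaces, not Oxide's code; command allocators, per-thread draw recording, and error handling are omitted):

```cpp
#include <d3d12.h>
#include <thread>
#include <vector>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

// Record one command list per worker thread, then submit them all at once.
// D3D12 command list recording is free-threaded so long as each
// list/allocator pair is only touched by one thread at a time.
void RecordFrameInParallel(
    ID3D12CommandQueue* queue,
    std::vector<ComPtr<ID3D12GraphicsCommandList>>& lists)
{
    std::vector<std::thread> workers;
    for (auto& list : lists)
        workers.emplace_back([&list] {
            // ... record this thread's slice of the frame's draw calls ...
            list->Close(); // finish recording on this thread
        });
    for (auto& t : workers)
        t.join();

    // Submission itself is one cheap call from a single thread.
    std::vector<ID3D12CommandList*> raw;
    for (auto& l : lists)
        raw.push_back(l.Get());
    queue->ExecuteCommandLists(static_cast<UINT>(raw.size()), raw.data());
}
```

Under DX11, by contrast, the driver funnels most of this work through a single thread, which is exactly the bottleneck described above.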
Although DirectX 12 was finalized and launched alongside Windows 10 last summer, we've continued to keep an eye on the API as the first games are developed against it. Since developers need the tools before they can release games, there's an expected lag between the launch of Windows 10 and the arrival of games using the API, and we are finally nearing the end of that lag. Consequently, we're now getting a much clearer picture of what to expect from games utilizing DirectX 12 as those games approach their launch.
There are a few games vying for the title of the first major DirectX 12 game, but at this point I think it's safe to say that the first high-profile game to be released will be Ashes of the Singularity. This is because the developer, Oxide, has specifically crafted an engine and a game meant to exploit the abilities of the API – large numbers of draw calls, asynchronous compute/shading, and explicit multi-GPU – putting it a step beyond games originally designed for DX11 that later had DX12 rendering paths added. As a result, both the GPU vendors and Microsoft itself have used Ashes and earlier builds of its Nitrous engine to demonstrate the capabilities of the API, something we've previously examined with both Ashes and the Star Swarm technical demo.
Much like a number of other games these days, Ashes of the Singularity has been in public beta via Steam Early Access, and its full release on March 22nd is fast approaching. To that end, Oxide and publisher Stardock are gearing up to release the second major beta of the game, the last beta before the game goes gold. They have also invited the press to take a look at the beta and its updated benchmark ahead of tomorrow's early access release, so today we'll be taking a second, more comprehensive look at the game.
The first time we poked at Ashes was to investigate an early alpha of the game's explicit multi-GPU functionality. Though only in a limited form at the time, Oxide demonstrated that they had a basic implementation of DX12 multi-GPU up and running, allowing us to pair up not only similar video cards, but also dissimilar cards from opposing vendors, making a combined GeForce + Radeon setup a reality. This early version of Ashes showed a lot of promise for DX12 multi-GPU, and after some additional development it is now finally being released to the public as part of this week's beta.
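For a sense of what "explicit" means here, the sketch below (our own minimal illustration, not Oxide's implementation) shows the DX12 starting point: the application itself enumerates every adapter in the system via DXGI and creates an independent device on each one, which is what lets a GeForce and a Radeon be driven side by side in unlinked multi-GPU mode:

```cpp
#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <vector>
using Microsoft::WRL::ComPtr;

// Create a D3D12 device on every hardware adapter in the system.
// In unlinked explicit multi-GPU, the application schedules work on
// each device and copies intermediate results between them itself.
std::vector<ComPtr<ID3D12Device>> CreateDevicesOnAllAdapters()
{
    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIFactory4> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0;
         factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND;
         ++i)
    {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue; // skip WARP and other software adapters

        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(),
                                        D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device))))
            devices.push_back(device);
    }
    return devices;
}
```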
Since that release Oxide has been at work both cleaning up the code to prepare it for release and implementing even more DX12 functionality. The latest beta adds greatly improved support for another of DX12's powerhouse features: asynchronous shading/computing. By taking advantage of DX12's lower-level access, games and applications can directly interface with the various execution queues on a GPU, scheduling work on each queue and having it executed independently. As an optimization, async shading allows certain tasks to be completed in less time (lower latency) and/or to better utilize a GPU's massive arrays of shader ALUs.
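At the API level the queue model is simple to see. The sketch below (a minimal illustration of the D3D12 interfaces involved, not Ashes' actual code) creates a dedicated compute queue alongside the usual graphics queue; work submitted to the compute queue is then free to execute concurrently with graphics on hardware that supports it:

```cpp
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

// Create a graphics (direct) queue and a separate compute queue on an
// already-initialized device. Synchronization between the two queues
// would be handled with ID3D12Fence objects (not shown here).
void CreateQueues(ID3D12Device* device,
                  ComPtr<ID3D12CommandQueue>& gfxQueue,
                  ComPtr<ID3D12CommandQueue>& computeQueue)
{
    // The "direct" queue accepts graphics, compute, and copy work.
    D3D12_COMMAND_QUEUE_DESC gfxDesc = {};
    gfxDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    device->CreateCommandQueue(&gfxDesc, IID_PPV_ARGS(&gfxQueue));

    // The compute queue accepts compute and copy work only; submissions
    // here may overlap with the direct queue's work on capable GPUs.
    D3D12_COMMAND_QUEUE_DESC computeDesc = {};
    computeDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;
    device->CreateCommandQueue(&computeDesc, IID_PPV_ARGS(&computeQueue));
}
```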
Between its new functionality, updated graphical effects, and a significant amount of optimization work since the last beta, the latest beta for Ashes gives us quite a bit to take a look at today, so let’s get started.
153 Comments
tuxRoller - Friday, February 26, 2016 - link
It's the simpler drivers which provide less room to hide architectural deficiencies. My point was that, across the board, GCN improves its performance a good deal relative to D3D11. That includes cards that are four years old. I don't think Maxwell is older than that.
I don't think we are really disagreeing, though.
RMSe17 - Wednesday, February 24, 2016 - link
Nowhere near as bad as the DX9 fiasco back in the FX 5xxx days, where a low-end ATi card would demolish the highest-end GeForce.
pt2501 - Thursday, February 25, 2016 - link
Few, if any, here are going to remember the fiasco when the Radeon 9700 Pro demolished the competition in performance and stability. Even fewer remember Nvidia "optimizing" games with lower quality textures to compete.
dray67 - Thursday, February 25, 2016 - link
I remember it, and it was the reason I went for the 9700 and later the 9800. Atm I'm back to Nvidia; I've had 2 AMD cards die on me due to heat. As much as I like them, I've had my fingers burnt and moved away from them. If DX12 and dual-GPU support become better supported, I'll buy a high-end AMD card in an instant.
knightspawn1138 - Thursday, February 25, 2016 - link
I remember it clearly. My Radeon 9800 was the last ATI card I bought. I loved it for years, and only ended up replacing it with an NVidia card when the Catalyst Control Center started sucking all the cycles out of my CPU. It's funny that half of the comments on this article complain that NVidia's drivers are over-optimized for every specific game, yet ATI and AMD were content to let the CCC be a resource hog that ruined even non-gaming performance for years. I'm happy with my NVidia cards. I've been able to easily play all modern games with great performance using a pair of GTX 460's, and recently replaced those with a GTX 970.
xenol - Thursday, February 25, 2016 - link
Considering there aren't any other async shader games in development or even announced, and with Pascal coming within the next year (by which time a game might actually use DX12) to probably alleviate the situation, your evaluation of NVIDIA's situation is pretty poor. It takes more than a generation or a game to make a hardware company go down. NVIDIA suffered plenty during its GeForce FX days, and it got right back on its feet.
MattKa - Thursday, February 25, 2016 - link
No, no, no. An RTS game that probably isn't going to sell very well and seems incredibly lacking is going to destroy Nvidia.
gamerk2 - Thursday, February 25, 2016 - link
AMD has had an async compute engine in their GPUs going back to the 7000 series. NVIDIA has not. Stands to reason AMD would do better in async compute-based benchmarking. Let's see how Pascal compares, since it's being designed with DX12, and async compute, in mind.
agentbb007 - Saturday, February 27, 2016 - link
"NVIDIA telling us that async shading is not currently enabled in their drivers", yeah this pretty much sums it up. This beta stuff is interesting but just that beta...JlHADJOE - Saturday, February 27, 2016 - link
The GTX 680 seems to have done well though. I feel like Maxwell is being let down by the compromises Nvidia made optimizing for FP16 only and sacrificing real compute performance.