While I was expecting Zen 3 to blow past expectations, I am underwhelmed by the RX 6000 series. 61 fps in Borderlands 3 maxed out means it is slower: roughly 6% slower than the RTX 3080 (65 fps) and 20% slower than the RTX 3090 (76 fps). So close and yet so far. The point is beating the competition, not coming close. I guess that I will continue using my GTX 1080 Ti for a couple more years. I am, however, strongly tempted by the Ryzen 9 5950X because its 4.9 GHz boost is effectively 5.8 GHz on my 3950X and that would be YUGE!
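For what it's worth, those percentages can be checked in a couple of lines (a minimal sketch; the fps figures are just the ones quoted above):

```python
def percent_slower(card_fps: float, rival_fps: float) -> float:
    """How much slower card_fps is relative to rival_fps, in percent."""
    return (rival_fps - card_fps) / rival_fps * 100

# Borderlands 3 figures quoted above
rx6000 = 61
rtx3080 = 65
rtx3090 = 76

print(round(percent_slower(rx6000, rtx3080), 1))  # 6.2
print(round(percent_slower(rx6000, rtx3090), 1))  # 19.7
```

So the gap to the 3080 is closer to 6% than 5%, and the ~20% figure for the 3090 holds up.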
If these are best-case numbers from AMD (and why wouldn't they be?), then they will have to price it against the 3070. And if it gets beaten by last-gen cards with DLSS and/or ray tracing enabled, it may be DOA.
Various publications have now had a look at DLSS 2.0, and it's sometimes even better than native res. DLSS 1.0 was pretty janky, but with 2.0 it's definitely not just a marketing name for upscaling anymore.
It *IS* just a marketing name for upscaling, though. That's literally what it does! It's fairly complex upscaling, but it's still upscaling. There's no way in hell it's "better than native res" unless you compare with the worst possible forms of AA enabled. Getting really bored of people posting this junk.
It's not junk if it's true. DLSS 2.0 has at worst maintained image quality and at best slightly improved it in multiple titles, including Death Stranding, Control, and Wolfenstein: Youngblood. The recent Boundary demo also looks exceptional with DLSS enabled. There are multiple side by side comparisons at this point showing the difference between DLSS 2.0 enabled and disabled. In most cases it's a wash even when staring at a 400% zoomed in shot. If you need to look in that close and still can't spot any difference then the feature is working as intended. No one is saying it's not upscaling but outright dismissing it simply because it's upscaling without even considering all the recent examples is foolish.
The biggest takeaway with DLSS, especially 2.0, is that you can get better performance than native resolution and use that to enable settings that you could not at native resolution. There is no downside to DLSS 2.0 that I've seen anywhere, except that it still requires devs to add support for it on a title-by-title basis.
Calling DLSS "just upscaling" is as uselessly reductive as calling supersampling antialiasing "just downscaling."
And if DLSS were just upscaling, devs wouldn't need to add support for it. The output would just get upscaled, just like when you're watching a TV show on a TV set that upscales. In fact, DLSS and AMD's equivalent are both sorely lacking a feature where they could just upscale without any support by telling the game the monitor is at a lower resolution.
"Usually the highest quality AA available in most games is TAA" Says who? Oh yes, Nvidia.
TAA significantly blurs the entire image and comes with a sizeable performance hit. Not by coincidence, it got a huge boost in mindshare when Nvidia first started doing DLSS demos, because it was the only way they could make a semi-plausible "better than native" claim at the time. The fact that so many sites just ran with that narrative is a damning indictment of how much of the tech press will outright just repeat whatever they're fed, and apparently comment sections follow suit now.
DLSS 2.0 is significantly better than DLSS 1.0, but it's still a way of creatively adding data to a lower-resolution image. It's not just a naive upscale, but it's not magic either, and it comes with its own artifacts. Some people might prefer those to the alternatives, personally I really don't.
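To make that distinction concrete, here is what a genuinely naive upscale looks like (a toy Python sketch, illustrative only, not how DLSS is implemented). Each output pixel is simply copied from the nearest low-res pixel, so no detail can ever be recovered; DLSS differs precisely because it also feeds motion vectors and previous frames through a trained network to reconstruct detail this baseline cannot have:

```python
def nearest_neighbor_upscale(img, factor):
    """Naive spatial upscale: each low-res pixel is duplicated factor x factor
    times. No new information is added -- this is the baseline that temporal
    reconstruction techniques try to improve on by reusing samples over time."""
    return [
        [img[y // factor][x // factor]
         for x in range(len(img[0]) * factor)]
        for y in range(len(img) * factor)
    ]

low_res = [[1, 2],
           [3, 4]]
print(nearest_neighbor_upscale(low_res, 2))
# [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```

Every output value here already existed in the input, which is what "just upscaling" would mean; anything beyond this is adding inferred data, with the artifacts that entails.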
I don't know how many different ways I can say this before people stop coming at me with the same copy/paste strawman responses, but hey, let's give it another go!
DLSS is a real technology, but DLSS 1.0 results were terrible, and the worst lie of all was calling it "4k DLSS" when it's really 1440p and it has absolutely nothing to do with 4k. Call it 4k (1440p DLSS) which is what it actually is.
If the resolution and technology are good enough that it actually improves performance with the SAME image quality? And it's not just resolution scaling with a sharpness filter while missing a ton of detail? Cool. Maybe we won't see the difference on 8K screens, I don't know. I don't really care. Graphics are a scam, as we all know; the best games were produced long ago.
Just like RT isn't "just a gimmick", right? "Look guys, there's a tree in the window now, totally worth tanking our framerates by 60%"
It is a total gimmick to give Nvidia some new benchmark to measure against, as rasterization is getting so powerful that the percentage changes don't look as impressive, even if they are.
@Chaser what do you mean? I know what Cognitive Dissonance is, but not sure how you're applying it here. Are you saying this person is experiencing stress because they hold two contradictory beliefs in some way? I don't get it.
I'm genuinely surprised Wreckage is still an Nvidia shill after all these years. He has been paid by Nvidia (in water bottles and $5 sweaters) to lie about Nvidia performance on this website since back before the Radeon 4870 was demolishing the GTX 260 for way less money, and Nvidia paid to block anti-aliasing from working on AMD in Batman: Arkham Asylum. Get a job dude.
Remember back in 2008 when you were saying PhysX was a game changer and in a few years every single game would be using it in every single post you ever made dozens of times per day?
Seriously though - I love this "logic". If they don't perform exactly the same as the high-end card then they "have" to price against the one below... because why? The pricing should fall in between. Why would it get "beat" by last-gen cards? Why should we compare DLSS to full-res rendering when they're not comparable? Yeesh.
If they price it well and the power/thermals are sensible, it could make a very good alternative to the RTX 3070 or 3080, seemingly slotting in close to the latter in certain loads.
A hypothetical $599 AMD card that's a hair slower than a $699 3080 would be very promising, indeed.
I mean, from a price perspective, that would be tantalizing, but they again fell short of snatching the performance leadership on the GPU front. Halo positioning would really give them the edge in perceived quality. Even if it is the best price to performance, they still need to outdo NVIDIA.
I agree that it would put them in a great market and technology position, but I don't think anyone in the tech sector has realistically expected them to pull off an upset at the GPU top tier segment. And playing devil's advocate, I think that's ok.
While they've been plenty competitive before (R9 290X comes to mind), the prospect of having multiple $1,000-1,500 GPUs targeted at gamers duking it out is honestly all a bit fatiguing.
I'd be more interested to see how they approach future MCM style GPU designs that may allow them to help leverage some of their Zen expertise.
I took a second look at the link where I grabbed my numbers. It appears the 65 fps figure I quoted for the RTX 3080 is wrong. Eurogamer's chart shows 65 at the top, but there is an error in the system: the actual mean frame rate shown at the bottom is 60.9 fps for the RTX 3080. So we are actually looking at RTX 3080-level performance.
No they don't. 🤷♂️ Feel free to keep moving those goalposts on "perceived quality" while the rest of us buy the products that provide the best balance of price, performance, and power consumption.
I think AMD has to deliver more performance to offset DLSS 2.0 (which works well and is coming to more and more games). Without a DLSS alternative, without higher overall performance, I don't see it competing.
They'd have to seriously undercut pricing. $499 would be ideal.
I disagree. Even at a $50 price deficit, combined with 16 GB of VRAM, they would absolutely trounce the 3080 in the market. That has value written all over it.
Even when they are competitive, AMD have trouble trouncing anything. Even the 580 and 570, their most popular recent cards, peaked at about a fifth of the share of the 1060 in the Steam hardware survey.
Yup. It's almost like a sustained viral marketing campaign by Nvidia to label AMD cards as slow and buggy has had some sort of cumulative effect on purchasing habits.
Look at the AMD subreddit, hundreds upon hundreds of reports of 'black screen' and other problems with 5700XT and other AMD cards. And of course, the manufacturer won't honor the warranty. Spending an extra $100 to get a 2070S to get proper reliability & support is no big deal for people who have a bit of money.
You mean the longstanding complaints from AMD users that AMD's GPU drivers have been poorly coded scrap for a decade? The NAVI users complaining about performance issues were all paid Nvidia shills now?
the funny thing is, for every 1 person that says amds gpu drivers suck, there is another that says nvidia's drivers are just as bad. i have had issues with both over the years. each side has had their own issues. NONE of them are perfect.
"the funny thing is, for every 1 person that says amds gpu drivers suck, there is the same that say nvidia's drivers are just as as bad" -- trouble is, Nvidiia outsells AMD >5:1.
@TheinsanegamerN - no, I mean posts like yours and flyingpants256 that act as if problems temporarily affecting one generation of their cards apply to everything they've ever released.
If you need to pretend that I think literally every criticism of an AMD card is written by a "shill" in an attempt to refute what I'm saying, then you're not really refuting what I'm saying, are you? The post I responded to was about the 570 and 580 - feel free to come back to me with the same reply when those cards magically turn into Navi chips.
Those nebulous "longstanding complaints" are exactly what I was referring to - there has been a sustained narrative of ATi/AMD drivers being crap for two solid decades, periodically fuelled by actual instances of truth. The thing is, Nvidia managed to weather storms like being the number one source of Vista crashes and literally burning out a bunch of their cards with driver releases, and yet they never picked up the same rep... Funny how that works.
For context, I just switched across from two years on a GTX 980M to an RX 580 a month and a half ago. The worst issues I had on Nvidia were the usual ones with selecting the right GPU for an application (I forced dGPU all the time; not ideal, but it works) and the system flailing when I plugged in an external display (something neither Intel nor Nvidia seem to get right). In other words, nothing remarkable at all.
I've not had a single issue on my RX 580 yet. No game compatibility issues, no driver crashes, nothing. Doesn't mean the drivers are perfect, but if they were as bad as folks claim then you'd think I'd have noticed by now, or back when I was running a 7970M for 2 years. Somehow, I didn't. 🤔
think of it this way spunjji, if intel or nvidia do something, good or bad, its perfectly fine. if amd does it, its a federal offence, and people lose their minds over it. look at the complaints about the 50 buck price increase for ryzen 5000. intel and nvidia have been raising their prices over the years, i dont remember people complaining about that like they are now about ryzen 5k.
And it's like Nvidia were right, huh? Dysfunctional drivers, overheating cards, high power draw with shitty performance (the 580 had the same TDP as the GTX 1080 while offering half of its performance).
I doubt it unless Nvidia continues their stock shortage, and most would just wait. For professional use, lots of workloads are CUDA accelerated, and AMD is lacking for a lot of those. For private use, if they are similar enough in price, most will go Nvidia as they know the branding (GTX or RTX). AMD's driver support hasn't been that great either, with the black screen issues. I still like their control center way more though, and I like Chill, but if performance is the same for both at a similar price, I'd still go Nvidia due to DLSS for 1440p.
Heh! The 5700 XT was $400... so the 6700 XT will be near $400. This 6900 is double the size of the 6700 XT, so the price would most likely be near $800. Very fair price! But there is that $700 3080 (that you cannot get) that makes that price a hard bargain... AMD does not need to sell at $500. Anything between $650 and $750 is competitive pricing considering this has more VRAM! DLSS is used in so few games that you cannot count on it... it is very nice if the game has it, but none of the games I play have it, and I have hundreds of games. Even new games don't have it in most cases. So it is like HairWorks... interesting, but not really useful because it mostly does not exist.
In the same way, people shout about the 3080 only having 10 GB and are waiting for a 20 GB version even though it will be more expensive than the normal 10 GB version... In most cases even 8 GB is enough. I am just saying that extra memory also contributes to a more expensive card. I am not saying that extra memory is a better deal for customers at this moment. But don't expect the 6900 to be cheap. It is/will be a big, rare 4K GPU with an eye-watering price tag.
"Even at $499 it will be a tough sell if the 3070 is beating it with ray tracing and DLSS" Big IF there, buddy. Any price is a tough sell to a known Nvidia shill. Please don't pretend there's any reason you'd buy or recommend an AMD card over the Nvidia alternative; most of us know who you are and it's just sad seeing you piss into the wind like this.
In my opinion, DLSS like AMD’s image sharpening tech are not gimmicks. They work, but only if considerable effort goes in from the developer’s end, particularly DLSS. Which is why with so many games released, you don’t see many games supporting DLSS. Only those titles that worked closely with Nvidia gets DLSS support. Also given that most gamers are still on 1080p and 1440p, there is little incentive for developers to spend time and effort baking DLSS support in games.
100% this. There seems to be this tendency to describe DLSS as either utterly essential or "just a gimmick" and neither of those things are true.
It's a great feature to have for the people that want it. I hope AMD's alternative is a solid one, and ideally not one that requires any special effort from developers. I'd argue that if they can manage that, slightly lower resulting image quality compared with DLSS would still be an acceptable compromise. If not, it's going to have to at least match it.
Not really, as you're comparing apples to oranges. Apparently AMD have their own equivalent to DLSS coming - it makes more sense to compare that to DLSS (in terms of both image quality and frame-rate) than it does to compare DLSS to native rendering.
Unless AMD's feature set is better than I expect, Nvidia still has the bonus features of DLSS, ray tracing, NVenc and AI that I think mean it can sustain premium pricing over AMD. I think Nvidia could even beat AMD on price this time around if it has to, given that Samsung will be cheaper than TSMC.
1. NV dies are much bigger, so even if SS 8nm is much cheaper, the die won't be that much cheaper. (N7 has really nice yields now).
2. NV is a company that is used to really high margins, much higher than AMD. The premium desktop chips have probably a margin >70%. They are definitely not willing to give up those margins. Amd is a company that has a gross margin of ~45% and the GPUs are on average below that. Any improvement here would be good enough for AMD.
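To put rough numbers on that margin asymmetry (a toy sketch; the unit cost is invented, not a real figure):

```python
def min_price(unit_cost: float, gross_margin: float) -> float:
    """Lowest selling price that still hits the target gross margin."""
    return unit_cost / (1 - gross_margin)

# Hypothetical $300 unit cost, purely to show the asymmetry: at the
# same cost, a 70% margin demands a far higher price than a 45% one.
cost = 300.0
print(round(min_price(cost, 0.70)))  # 1000
print(round(min_price(cost, 0.45)))  # 545
```

In other words, a company defending ~70% margins has far less room to cut prices than one that is happy anywhere above ~45%, which is the point being made above.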
DLSS is great. Actually, its amazing. But its locked down to certain games, which kinda sucks unless you only play AAAs :/.
We'll see how raytracing and AI performance shake out. CUDA on AMD is still quite funky, and the research side of things is still on CUDA, but there are some Vulkan ports of popular programs coming out.
Ray tracing is a feature on RDNA2. This is the same RDNA2 that is powering the Xbox and PlayStation, which come with hardware ray tracing. In fact, with AMD providing hardware for all the new consoles, I suspect ray tracing optimisation may tip towards AMD's favour. DLSS, I feel, is not going to gain much traction considering it's been around since Turing, but how many games actually support it? I don't think it's easy to implement, which is why game developers are not in favour of it. Moreover, as mentioned, most RTX hardware is still used at 1080p and 1440p, which is where most gamers still are.
I love the posts that basically say "I will buy an Nvidia card", but try to phrase it like it's objective. PhysX and HairWorks were the last big draws for these claims.
Yep! You definitely want an AMD option, jeez, and it seems that we now have options also for 4K gaming. And that is the most important thing. Nvidia did not price the 3080 at $700 for no reason! They knew what was coming! If you can even get a 3080 at $700...
That's the take-home. Based on the rumours I've been hearing this isn't their best card, but honestly I'm not sure whether or not to trust that. If it is the best they can do, it's still pretty impressive - and it pisses all over the naysayers who've been saying that a card with a 256bit bus would struggle to compete with the 3070. Those claims will no doubt be abandoned and replaced with slightly different ones, in the great tradition of fanboy shitposting.
I would pay $100 more for an Nvidia card, because it has DLSS and better support in games. Not to mention far superior drivers and features. However, if AMD delivers more than this laugh of 8/10 GB VRAM, then I might be very likely to buy their card instead.
VRAM is good. I know nothing about GPUs, but it seems cheap/easy to add; compared to designing a whole CPU, you just add more RAM chips in parallel for a relatively small cost.
Eh. I wouldn't call coming very close to the 3080 a problem.
And comparisons to the 3090 are rather moot, as it's essentially a Titan i.e. not really something any real number of people are going to buy (or even be able to buy at this rate).
It would be exciting to see them beat Nvidia, but that's not really important at all. If they are competitive on the high end with performance, price, and power, that's what counts, and this looks pretty good. RTG has had a pretty rough going for the last few generations, so hopefully RDNA2 is their Zen 1 moment and things just get better from here.
Don't you think your expectations were way higher than reasonable? Look at how many gens it took AMD to catch Intel in gaming. AMD is further behind NVIDIA than they were behind Intel. I expected somewhere along the lines of them being competitive with the 3070's going by NVIDIA's $500 price tag. I assumed NVIDIA knew something when they didn't gouge us like they did with the OG RTX. Either way, this looks to be a huge boost for AMD. Oh, and NVIDIA already has faster tech going by all the old articles stating that they weren't bringing out their best since there was no competition/reason for them, but at least AMD has a 4k card finally.
"Oh, and NVIDIA already has faster tech going by all the old articles stating that they weren't bringing out their best since there was no competition/reason for them"
The AMD 6000 series will be even harder to find. Ethereum miners will pick them up for their efficiency. While the 3080 mines well, it consumes a lot of power. Another factor playing into the limited availability will be TSMC's crowded 7nm node. AMD is already using a lot of their 7nm capacity building chips for next-gen consoles and zen 3.
How are people still whining about fictional "failing caps" in October? You should probably stay off of Reddit and Youtube if you're that gullible to sensational reporting.
That the problem was related to the caps is fictional. That there was a problem in the first place is not "sensational reporting", it's a fact. Seems to be sorted now, though.
Wait for Zen 4 to blow past your expectations. As for graphics cards, you will need to wait for driver updates as always, so the cards themselves can perform optimally.
If the 6000 series is competitive to within a few percent of the 3080 then that is a win in my book. Especially if it’s cheaper to compensate for the few frames losses. The next question is power consumption.
If nothing else it’s an option because it will be in stock.
I don't know. If they come close, but they're half the price, their product should sell well. And, for competitive gamers, what matters is performance with useless stuff like ray tracing turned off anyways. Given the huge leap in performance that the 3080 represented, I'll be pleased if AMD manages to come close; not even coming close is when I'd start worrying.
Ian quoted Anand in the Zen 3 article but I think it bears repeating here. "There are no bad products, only bad prices."
Performance numbers by themselves don't paint the whole picture. If they are 95% of a 3080 at 85% the cost, that's a good value. Personally I never spend more than ~$300 on a GPU anyway, so the cards with the best bang for the buck in THAT price range are what interests me.
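That "95% of a 3080 at 85% the cost" scenario works out to a measurably better deal; a quick sketch (the $699 is the 3080's launch MSRP, and the rival card's numbers are hypothetical):

```python
def perf_per_dollar(perf: float, price: float) -> float:
    """Performance units delivered per dollar spent."""
    return perf / price

# Normalize the 3080 to 100 performance units at its $699 MSRP;
# the hypothetical rival offers 95% of the performance at 85% of the price.
rtx3080 = perf_per_dollar(100.0, 699.0)
amd_card = perf_per_dollar(95.0, 699.0 * 0.85)

print(round(amd_card / rtx3080, 3))  # 1.118, i.e. ~12% more performance per dollar
```

The absolute prices cancel out of the ratio, so any card at 95% of the performance and 85% of the price delivers about 12% more performance per dollar, whatever the MSRPs actually are.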
I imagine you're right that they'd leave the halo card quiet until they can have their "one more thing" moment like with the 5950X. I wish there were some convenient Nvidia numbers to compare to those benchmarks though...
Your own personal definitions don't reflect the reality that they've released a £1500 gaming card and everyone is trying to rationalise it as a "Titan". If it were a Titan, it'd be called a Titan and have the appropriate drivers. It doesn't.
Nothing wrong with 3070-3080 territory so long as the price, power, heat, noise, drivers, etc all hold up. Honestly considering how far behind AMD has been on the GPU front if they're in the same range on these metrics that would be great. Guess we'll see.
Zen 3 looks good. It'll be interesting to see how it compares to Rocket Lake, but it sounds to me like probably equivalent performance and Zen 3 will use less power and have higher core offerings. Not a great spot for Intel to be if that's the case.
Even if it can get near a 3080's performance, the question is, what is the pricetag going to be? If it replaces the RX 5700XT at the same pricetag, then AMD is going to really hurt Nvidia in the price to performance ratio.
The good part is, the AMD driver team has really turned things around for themselves. The 20.9s have been really well done.
I'll say it this way: if the reference model gets near the 3080, AIB partners like PowerColor and Sapphire will beat the 3080. If it ends up cheaper, then Nvidia will have a problem.
Either the CU/GCN architecture is not scaling well, or they used the second- or third-tier chip. I don't think they achieved much in terms of raw IPC, but rather gained through simple clock speeds.
I think the idea here is either it's not the top-end Navi with its 80 CUs (in which case increased clock speed over the 5700XT may have played a significant part), or having double the CUs doesn't yield anywhere near double the performance in these titles. It's a bit soon to tell.
If they can do 3070 performance at 200-225W for $400, they can have my money. I don't care if they get even remotely close to a 3080(let alone 3090), because I can't justify spending that much money on a graphics card anyway.
Yep! So true! The 6900 will be an expensive halo GPU. The Nvidia 3070 and AMD 6700 are the upper limit where most people go! $500 is a lot of money for a GPU! To most people even the 3060 at $400 is too much... and we will get something like a 6600 for that segment from AMD, aka a cut-down 6700. The 6900 XT will be high above those price-wise!
I can't believe there are actually people writing off AMD's GPUs based on this small preview. Do we really expect AMD to show us the best of what they have... in a preview? If they did that, then the launch presentation is going to be a pretty negative affair. My bet is this was a low-end or midrange card. Nvidia knows what's in the pipeline; that's why they have priced the 3000 series as they have. AMD, I think, will be equal to or even beat Nvidia this time.
I don't understand either. Waiting for more information before making a judgement may be prudent. The performance of either card may be misleading given limited information.
I'm skeptical about non-DLSS performance with Nvidia. With AMD, I am skeptical about the RT performance. Need more information...
AMD, the most secretive company of the last 3 years, the company hiding the 3950X, Radeon VII, RDNA, Zen 3, Threadripper 3000... and so on... is going to spill the beans on their biggest hyped announcement for PC gamers in the last 5 years in a teaser...?
Well, news for you: they showed you the 6800 XT's performance.
I feel the performance looks decent, but with only 3 games being shown here, it's still too early to conclude on performance. I don't think it will beat the RTX 3080 for sure, but if it's just slightly slower with a better price tag and power consumption, I feel it will still do well. The 16 GB of VRAM will be sufficient for longer than 10 GB. Sure, Nvidia may release a 20 GB version, but at a higher cost, I suspect. So if AMD can undercut them on cost, they may have a winner (if they don't botch up the drivers).
Considering that the current AVAILABLE price of the 3080 is really $1500 from scalpers, AMD should price the RX6000 at $700 as they will sell out at that price for months. 3080 won’t be available readily until spring 2021. I’d prefer AMD and the AIBs to make some money for once from their GPUs instead of the scalpers.
As I expected, pretty much 90% of the 3080. Pricing is where Nvidia and AMD will fight it out, though, where Nvidia has to delay the 3070 to get their competitive pricing right.
I really don't think we will see a review @nadim.kahwaji @Qasar
You see, it was "hopefully" gonna be ready in time for the release of 3090 a week later. But the 3090 release was 2 weeks ago.
We who have been reading the site since it was founded and have good memories have unfortunately seen this same thing come to pass many a time, with GPU reviews especially.
This site is the best, has the best writers and a pretty civilized atmosphere in the comment sections. Unfortunately, when they miss a "deadline" the pressure is off and it's more common that the review never materializes at all.
This would be fine if they didn't keep promising "it's coming" and then never deliver, stringing readers along. When Anand worked here, I don't think a review was ever late. Now there is always an excuse, and then that excuse is milked to the max until it just runs out into the sand.
It's really sad because in the last year or so I don't think any big deadlines were missed and things were looking up. I still believe and hope though, checking the site every day. Maybe one day... :)
As far as I can remember, smartphone reviews were late, but GPU reviews weren't, really.
This site has tanked over time, in more ways than one. The current owners/editors know it, it's not really their fault if they don't have the proper talent/skill to be number one anymore.
funkforce the fires in california are STILL burning, still getting the smoke creep in where i live from time to time. if you call mandatory evac orders an excuse, then i dont know what to say. i think ryan smith lives in california, and he reviews the gpus on here. google california wild fires and you will see. https://www.theguardian.com/us-news/2020/oct/05/ca...
"When Anand worked here, I don't think a review was ever late."
Long-term reader here to confirm that people have been whining about late reviews for as long as Anandtech has existed. They still exist even in the era of breathless pre-release YouTube promo videos because nobody else goes into the same depth.
Anand cashed out and that was really the end of the site as we knew it, still decent but it’s lost a lot over the years. Was sad when HardOCP closed, same deal - main person got scooped up by the “real” industry and left all us poorer.
"Aiming for 3080" with a question mark? Surely if AMD doesn't manage, on October 28th, to announce GPUs which at least approach performance parity to the 3080 and even the 3090, they would be in trouble? Of course, if they stick to the lower end of the market, but offer superior value there, they can certainly stay in business.
I can guarantee they don't have any links to reliable info on bad yields. The issue isn't with yields. It's that Nvidia didn't start production until mid/late August, and AIBs were scrambling to pump cards out in a matter of weeks.
Nobody but Nvidia has reliable info on their yields, but we can draw inferences.
What we have to go on is: Customers can't get hold of the cards. There's no competition for wafers on Samsung 8nm, so they're not capacity constrained. Nvidia are saying "yields are great". Nvidia are also saying "supply will be constrained through early 2021".
They couldn't know that last part unless they can predict either supply or demand precisely, and if it's demand they're predicting precisely, then why didn't they make more of these cards before launch? So: either they're incompetent and underestimated demand on a card they chose to market as "up to 2x faster (creative fiction)" and "the biggest generational leap ever (lie)", they're deliberately constraining supply to drive prices up so AIBs / resellers can cash in, or yields are bad and they're lying about it.
Given that they've lied every other time they've had terrible yields, I'd happily bet that's at least a contributing factor.
If they can get close to 3080 performance and undercut the 3080 on price and availability, that's going to be a huge win for AMD. That being said, I'm not interested in anything over $500, so my interest is in what replaces the 5700XT at the $400'ish price point.
Reading these comments is fun indeed. People really believe that after all the hype, AMD showed their best "Big Navi" as an afterthought at a CPU press conference?
They showed the 6800XT with 64 CUs. They know they are beating the 3080, but they just want Nvidia to panic and do something stupid. They are doing what they did with the Radeon VII. I cannot believe some people here really believe they have shown their biggest die in a teaser.
AMD likes to play with the competition. Push them into releasing early (NVIDIA Ampere, and note there are no cards available, cards that crash with black screens, driver problems, you name it). AMD lets just enough information out there to force the competition into showing its hand. We have seen AMD playing with Intel from 2017 through 2020, making Intel talk big but showing time and time again that Intel doesn't have anything ready to compete with what AMD is doing. AMD set NVIDIA up to burn its customers by releasing Super cards well before the one-year mark of the original RTX 2060, so those with a 2060 from launch felt burned, and those who bought a 2070 or 2080, well, Super cards came out at the same price with better performance, just to stay ahead of the Radeon 5700XT.
Radeon VII wasn't even a specific gaming card. The Radeon Instinct card based on 7nm with 16GB of memory...just slap a new name on it and call it Radeon VII. It was FULLY a Radeon Instinct card with a different name, nothing more. RDNA was really the first card that AMD intentionally planned to release as a consumer gaming card since Vega 56 and Vega 64.
AMD has shown that its management knows how to play games with the competition. Leaking numbers that may or may not be correct to push the competition into raising or lowering prices is part of those games. Show numbers far above or below what they really are, and the competition responds. Show numbers right on target when your product is a lot better than expected, and the competition will respond, but possibly outmaneuver you to minimize the appeal of the new products.
I hope you are correct that the numbers provided by AMD are from the 6800XT, but we don't really know right now. No matter what, we can expect more than double the performance of Vega64 or Radeon VII with Big Navi, and that means it will be a solid card.
THIS is why careful consumers wait for all the facts to be in before committing a large wad of cash on a component. I think that it is sometimes called 'buyer's remorse' when not adhered to.
Then there is 'buyer's frustration' when their needs can't be met. As in, 'What do ya mean I can't run a GPU and a PCIe x16 M.2 RAID card on AM4?' Or, 'When will the Zen 3 Threadrippers arrive?' Rocket Lake probably won't help, either... Looks to sit on a spec that just matches AM4. Barely.
I hope this isn't Vega 2.0. When Vega 64 came out it matched or even beat the GTX 1080 in some titles, but it was instantly sold out because of miners, way louder, and much more power hungry. Now if the 6000 series can match the 2080 at similar power levels and fan noise, and they get supply in check, we might have a solid team Red offering.
Well.. considering just how power hungry the 3080 is it's probably safe to say that it won't be worse in that area.. might be on par power wise though (knowing AMD)
Hifihedgehog - Thursday, October 8, 2020 - link
While I was expecting Zen 3 to blow past expectations, I am underwhelmed by the RX 6000 series. 61 fps in Borderlands 3 maxed out means it is slower: 5% slower than the RTX 3080 (65 fps) and 20% slower than the RTX 3090 (76 fps). So close and yet so far. The point is beating the competition, not coming close. I guess that I will continue using my GTX 1080 Ti for a couple more years. I am, however, strongly tempted by the Ryzen 9 5950X because its 4.9 GHz boost is effectively 5.8 GHz on my 3950X and that would be YUGE!
Hifihedgehog - Thursday, October 8, 2020 - link
https://www.eurogamer.net/articles/digitalfoundry-...
Wreckage - Thursday, October 8, 2020 - link
If these are the best case scenario numbers from AMD (why wouldn't they be), then they will have to price it against the 3070. Although if it gets beat by last gen cards with DLSS and/or ray tracing it may be DOA.
MisterAnon - Thursday, October 8, 2020 - link
DLSS is a gimmick. It is a marketing name for upscaling, and reduces quality compared to native resolution. It is not something you just dial up.
bigboxes - Thursday, October 8, 2020 - link
Look who you're responding to.
JlHADJOE - Friday, October 9, 2020 - link
Various publications have now had a look at DLSS 2.0 and it's sometimes even better than native res. DLSS 1.0 was pretty janky, but with 2.0 it's definitely not just a marketing name for upscaling anymore.
https://www.eurogamer.net/articles/digitalfoundry-...
https://babeltechreviews.com/the-death-stranding-i...
https://www.eurogamer.net/articles/digitalfoundry-...
Spunjji - Friday, October 9, 2020 - link
It *IS* just a marketing name for upscaling, though. That's literally what it does! It's fairly complex upscaling, but it's still upscaling. There's no way in hell it's "better than native res" unless you compare with the worst possible forms of AA enabled. Getting really bored of people posting this junk.
krazyfrog - Friday, October 9, 2020 - link
It's not junk if it's true. DLSS 2.0 has at worst maintained image quality and at best slightly improved it in multiple titles, including Death Stranding, Control, and Wolfenstein: Youngblood. The recent Boundary demo also looks exceptional with DLSS enabled. There are multiple side by side comparisons at this point showing the difference between DLSS 2.0 enabled and disabled. In most cases it's a wash even when staring at a 400% zoomed in shot. If you need to look in that close and still can't spot any difference then the feature is working as intended. No one is saying it's not upscaling but outright dismissing it simply because it's upscaling without even considering all the recent examples is foolish.
jordanclock - Friday, October 9, 2020 - link
The biggest takeaway with DLSS, especially 2.0, is that you can get better performance than native resolution and use that to enable settings that you could not at native resolution. There is no downside to DLSS 2.0 that I've seen anywhere, except that it still requires devs to add support on a title-by-title basis.

Calling DLSS "just upscaling" is as uselessly reductive as calling supersampling antialiasing "just downscaling."
quadibloc - Friday, October 9, 2020 - link
And if DLSS were just upscaling, devs wouldn't need to add support for it. The output would just get upscaled, just like when you're watching a TV show on a TV set that upscales. In fact, DLSS and AMD's equivalent are both sorely lacking a feature where they could just upscale without any support by telling the game the monitor is at a lower resolution.
Luke212 - Monday, October 12, 2020 - link
DLSS is upscaling. It's smart upscaling.
althaz - Friday, October 9, 2020 - link
Usually the highest quality AA available in most games is TAA. DLSS 2.0 *consistently* betters native 4K with TAA and is essentially never worse.

Calling it "upscaling" might be technically correct, but it in no way whatsoever reflects the actual results.
Spunjji - Monday, October 12, 2020 - link
"Usually the highest quality AA available in most games is TAA" Says who? Oh yes, Nvidia.

TAA significantly blurs the entire image and comes with a sizeable performance hit. Not by coincidence, it got a huge boost in mindshare when Nvidia first started doing DLSS demos, because it was the only way they could make a semi-plausible "better than native" claim at the time. The fact that so many sites just ran with that narrative is a damning indictment of how much of the tech press will outright just repeat whatever they're fed, and apparently comment sections follow suit now.
DLSS 2.0 is significantly better than DLSS 1.0, but it's still a way of creatively adding data to a lower-resolution image. It's not just a naive upscale, but it's not magic either, and it comes with its own artifacts. Some people might prefer those to the alternatives, personally I really don't.
I don't know how many different ways I can say this before people stop coming at me with the same copy/paste strawman responses, but hey, let's give it another go!
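For what it's worth, the distinction both sides are arguing about is easy to show in code. A naive upscale just replicates existing samples and adds no information; DLSS draws extra detail from previous frames via motion vectors, which no toy snippet can reproduce. Here is a minimal nearest-neighbour sketch for contrast (plain Python, illustrative names only, not anyone's actual pipeline):

```python
def nearest_neighbour_upscale(img, factor):
    """Naive upscale: each low-res pixel is replicated factor x factor times.
    No new information is created - this is what DLSS is *not* doing."""
    return [
        [img[y // factor][x // factor] for x in range(len(img[0]) * factor)]
        for y in range(len(img) * factor)
    ]

# A 2x2 "image" upscaled 2x becomes a blocky 4x4 copy of itself.
small = [[1, 2],
         [3, 4]]
big = nearest_neighbour_upscale(small, 2)
```

Anything smarter (bilinear, sharpening filters, temporal accumulation) is still "upscaling" in this structural sense; the argument is really about how much recovered detail earns a different name.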
Tilmitt - Saturday, October 10, 2020 - link
Did you read the linked articles?
Dug - Friday, October 9, 2020 - link
It's hardly a gimmick bud. But keep up the good thinking.
flyingpants265 - Friday, October 9, 2020 - link
DLSS is a real technology, but DLSS 1.0 results were terrible, and the worst lie of all was calling it "4K DLSS" when it's really 1440p and it has absolutely nothing to do with 4K. Call it 4K (1440p DLSS), which is what it actually is.

If the resolution and technology is good enough so it actually improves performance with the SAME image quality? And it's not just resolution scaling with a sharpness filter, while missing a ton of detail? Cool. Maybe we won't see the difference on 8K screens, I don't know. I don't really care. Graphics are a scam, as we all know; the best games were produced long ago.
liquid_c - Friday, October 23, 2020 - link
Stop trying, man. He hates Nvidia with a passion, as you can see.
TheinsanegamerN - Saturday, October 10, 2020 - link
Just like RT isn't "just a gimmick", right? "Look guys, there's a tree in the window now, totally worth tanking our framerates by 60%"

It is a total gimmick to give Nvidia some new benchmark to measure against, as rasterization is getting so powerful the percentage changes don't look as impressive, even if they are.
Chaser - Friday, October 9, 2020 - link
Cognitive dissonance.
vol.2 - Sunday, October 11, 2020 - link
@Chaser what do you mean? I know what Cognitive Dissonance is, but not sure how you're applying it here. Are you saying this person is experiencing stress because they hold two contradictory beliefs in some way? I don't get it.
Meteor2 - Tuesday, October 13, 2020 - link
Most people forget the "stress" part when they throw around the term "cognitive dissonance" :-(
dguy6789 - Friday, October 9, 2020 - link
I'm genuinely surprised Wreckage is still an Nvidia shill after all these years. He has been paid by Nvidia (in water bottles and $5 sweaters) to lie about Nvidia performance on this website since back before the Radeon 4870 was demolishing the GTX 260 for way less money and Nvidia paid to block anti-aliasing from working on AMD in Batman: Arkham Asylum. Get a job dude.
dguy6789 - Friday, October 9, 2020 - link
Remember back in 2008 when you were saying PhysX was a game changer and in a few years every single game would be using it in every single post you ever made dozens of times per day?
Spunjji - Friday, October 9, 2020 - link
Nvidiot says what?

Seriously though - I love this "logic". If they don't perform exactly the same as the high-end card then they "have" to price against the one below... because why? The pricing should fall in between. Why would it get "beat" by last-gen cards? Why should we compare DLSS to full-res rendering when they're not comparable? Yeesh.
Slash3 - Thursday, October 8, 2020 - link
If they price it well and the power/thermals are sensible, it could make a very good alternative to the RTX 3070 or 3080, seemingly slotting in close to the latter in certain loads.

A hypothetical $599 AMD card that's a hair slower than a $699 3080 would be very promising, indeed.
Hifihedgehog - Thursday, October 8, 2020 - link
I mean, from a price perspective, that would be tantalizing, but they again fell short of snatching the performance leadership on the GPU front. Halo positioning would really give them the edge in perceived quality. Even if it is the best price to performance, they still need to outdo NVIDIA.
Slash3 - Thursday, October 8, 2020 - link
I agree that it would put them in a great market and technology position, but I don't think anyone in the tech sector has realistically expected them to pull off an upset at the GPU top tier segment. And playing devil's advocate, I think that's ok.

While they've been plenty competitive before (R9 290X comes to mind), the prospect of having multiple $1,000-1,500 GPUs targeted at gamers duking it out is honestly all a bit fatiguing.
I'd be more interested to see how they approach future MCM style GPU designs that may allow them to help leverage some of their Zen expertise.
Either way, it's still a good time for tech.
Hifihedgehog - Thursday, October 8, 2020 - link
I took a second look at the link where I grabbed my numbers. It appears the 65 fps number I got for the RTX 3080 is wrong. Eurogamer's chart shows 65 at the top but there is an error in the system. The actual mean frame rate it shows is 60.9 fps at the bottom for the RTX 3080. So we are actually looking at RTX 3080 performance.
firerod1 - Thursday, October 8, 2020 - link
I looked up Techspot's numbers and it has the 3080 at 4K in Gears 5 at 72 fps, so a 1 fps win!
Spunjji - Friday, October 9, 2020 - link
Interesting, if true. Definitely waiting for independent reviews on these.
omi-kun - Thursday, October 8, 2020 - link
If they come in close in perf but cheaper and use less power, that will be a huge win.
zamroni - Thursday, October 8, 2020 - link
I guess the 3070 is $499 because the 2080 Ti-equivalent XSX is $499, and the 3080 is $699 because it needs to maintain a similar price/performance ratio to the 3070.
Spunjji - Friday, October 9, 2020 - link
No they don't. 🤷♂️ Feel free to keep moving those goalposts on "perceived quality" while the rest of us buy the products that provide the best balance of price, performance, and power consumption.
Sttm - Thursday, October 8, 2020 - link
I think AMD has to deliver more performance to offset DLSS 2.0 (which works well and is coming to more and more games). Without a DLSS alternative, without higher overall performance, I don't see it competing.

They'd have to seriously undercut pricing. $499 would be ideal.
Kutark - Thursday, October 8, 2020 - link
I disagree. Even a $50 price deficit, combined with 16GB of VRAM, and they would absolutely trounce the 3080 in the market. That has value written all over it.
ArcadeEngineer - Thursday, October 8, 2020 - link
Even when they are competitive, AMD has trouble trouncing anything. Even the 580 and 570, their most popular recent cards, peaked at about a fifth of the 1060's share in the Steam hardware survey.
Spunjji - Friday, October 9, 2020 - link
Yup. It's almost like a sustained viral marketing campaign by Nvidia to label AMD cards as slow and buggy has had some sort of cumulative effect on purchasing habits.
flyingpants265 - Friday, October 9, 2020 - link
Look at the AMD subreddit, hundreds upon hundreds of reports of 'black screen' and other problems with 5700XT and other AMD cards. And of course, the manufacturer won't honor the warranty. Spending an extra $100 to get a 2070S to get proper reliability & support is no big deal for people who have a bit of money.
TheinsanegamerN - Saturday, October 10, 2020 - link
"viral marketing campaign"

You mean the longstanding complaints from AMD users that AMD's GPU drivers have been poorly coded scrap for a decade? The NAVI users complaining about performance issues were all paid Nvidia shills now?
Qasar - Saturday, October 10, 2020 - link
The funny thing is, for every one person that says AMD's GPU drivers suck, there is another who says Nvidia's drivers are just as bad. I have had issues with both over the years. Each side has had their own issues. NONE of them are perfect.
Meteor2 - Tuesday, October 13, 2020 - link
"the funny thing is, for every 1 person that says amds gpu drivers suck, there is the same that say nvidia's drivers are just as bad" -- trouble is, Nvidia outsells AMD >5:1.
Spunjji - Monday, October 12, 2020 - link
@TheinsanegamerN - no, I mean posts like yours and flyingpants256 that act as if problems temporarily affecting one generation of their cards apply to everything they've ever released.

If you need to pretend that I think literally every criticism of an AMD card is written by a "shill" in an attempt to refute what I'm saying, then you're not really refuting what I'm saying, are you? The post I responded to was about the 570 and 580 - feel free to come back to me with the same reply when those cards magically turn into Navi chips.
Those nebulous "longstanding complaints" are exactly what I was referring to - there has been a sustained narrative of ATi/AMD drivers being crap for two solid decades, periodically fuelled by actual instances of truth. The thing is, Nvidia managed to weather storms like being the number one source of Vista crashes and literally burning out a bunch of their cards with driver releases, and yet they never picked up the same rep... Funny how that works.
Spunjji - Monday, October 12, 2020 - link
For context, I just switched across from two years on a GTX 980M to an RX 580 a month and a half ago. The worst issues I had on Nvidia were the usual ones with selecting the right GPU for an application (I forced dGPU all the time; not ideal, but it works) and the system flailing when I plugged in an external display (something neither Intel nor Nvidia seem to get right). In other words, nothing remarkable at all.

I've not had a single issue on my RX 580 yet. No game compatibility issues, no driver crashes, nothing. Doesn't mean the drivers are perfect, but if they were as bad as folks claim then you'd think I'd have noticed by now - or back when I was running a 7970M for 2 years. Somehow, I didn't. 🤔
Qasar - Monday, October 12, 2020 - link
Think of it this way, Spunjji: if Intel or Nvidia do something, good or bad, it's perfectly fine. If AMD does it, it's a federal offence, and people lose their minds over it. Look at the complaints about the 50 buck price increase for Ryzen 5000. Intel and Nvidia have been raising their prices over the years; I don't remember people complaining about that like they are now about Ryzen 5000.
liquid_c - Friday, October 23, 2020 - link
And it's like Nvidia were right, huh? Dysfunctional drivers, overheating cards, high power draw with shitty performance (the 580 had the same TDP as the GTX 1080 while offering half of its performance).
RSAUser - Wednesday, October 14, 2020 - link
I doubt it unless Nvidia continues their stock shortage, and most would just wait.

For professional use, lots of workloads are CUDA accelerated, and AMD is lacking for a lot of those.
For private use, if they are similar enough in price, most will go Nvidia as they know the branding (GTX or RTX).
AMD's driver support hasn't been that great either, with black screen issues. I still like their control center way more though, and I like Chill, but if performance is the same for both at a similar price, I'd still go Nvidia due to DLSS for 1440p.
haukionkannel - Thursday, October 8, 2020 - link
Heh! The 5700 XT was $400... so the 6700 XT will be near $400. This 6900 is double the size of the 6700 XT! So the price most likely would be near $800. Very fair price! But there is that $700 3080 (that you can not get) that makes that price a hard bargain... AMD does not need to sell at $500. Anything between $650 and $750 is competitive pricing considering this has more VRAM!

DLSS is used in so few games that you can not count on it... it is very nice if the game has it... but none of the games I play has it and I have hundreds of games. Even new games don't have it in most cases. So it is like HairFX... interesting, but not really useful because it does not exist.
Sttm - Thursday, October 8, 2020 - link
If that VRAM doesn't deliver higher performance in games, how is it worth it?
haukionkannel - Friday, October 9, 2020 - link
In the same way people shout about the 3080 only having 10GB and are waiting for the 20GB version even though it will be more expensive than the normal 10GB version... In most cases even 8GB is enough. I am just saying that extra memory also contributes to a more expensive card. I am not saying that extra memory is a better deal for customers at this moment.
But don't expect the 6900 to be cheap. It is/will be a big, rare 4K GPU with an eye-watering price tag.
Wreckage - Thursday, October 8, 2020 - link
Even at $499 it will be a tough sell if the 3070 is beating it with ray tracing and DLSS. This is the 5700XT all over again.
flyingpants265 - Friday, October 9, 2020 - link
$399 for 6700XT will be cool. They should bundle these together (5600, 6700XT and motherboard) for $800.
Spunjji - Monday, October 12, 2020 - link
"Even at $499 it will be a tough sell if the 3070 is beating it with ray tracing and DLSS"

Big IF there, buddy. Any price is a tough sell to a known Nvidia shill. Please don't pretend there's any reason you'd buy or recommend an AMD card over the Nvidia alternative; most of us know who you are and it's just sad seeing you piss into the wind like this.
MisterAnon - Thursday, October 8, 2020 - link
DLSS is a gimmick. It is a marketing name for upscaling, and reduces quality compared to native resolution. It is not something you just dial up.
watzupken - Thursday, October 8, 2020 - link
In my opinion, DLSS and AMD's image sharpening tech are not gimmicks. They work, but only if considerable effort goes in from the developer's end, particularly for DLSS. Which is why, with so many games released, you don't see many games supporting DLSS. Only those titles that worked closely with Nvidia get DLSS support. Also, given that most gamers are still on 1080p and 1440p, there is little incentive for developers to spend time and effort baking DLSS support into games.
Spunjji - Monday, October 12, 2020 - link
100% this. There seems to be this tendency to describe DLSS as either utterly essential or "just a gimmick", and neither of those things is true.

It's a great feature to have for the people that want it. I hope AMD's alternative is a solid one, and ideally not one that requires any special effort from developers. I'd argue that if they can manage that, slightly lower resulting image quality compared with DLSS would still be an acceptable compromise. If not, it's going to have to at least match it.
Spunjji - Friday, October 9, 2020 - link
Not really, as you're comparing apples to oranges. Apparently AMD have their own equivalent to DLSS coming - it makes more sense to compare that to DLSS (in terms of both image quality and frame-rate) than it does to compare DLSS to native rendering.
playtech1 - Thursday, October 8, 2020 - link
Unless AMD's feature set is better than I expect, Nvidia still has the bonus features of DLSS, ray tracing, NVenc and AI that I think mean it can sustain premium pricing over AMD. I think Nvidia could even beat AMD on price this time around if it has to, given that Samsung will be cheaper than TSMC.
brakdoo - Thursday, October 8, 2020 - link
1. NV dies are much bigger, so even if SS 8nm is much cheaper, the die won't be that much cheaper. (N7 has really nice yields now.)
2. NV is a company that is used to really high margins, much higher than AMD. The premium desktop chips probably have a margin >70%. They are definitely not willing to give up those margins. AMD is a company that has a gross margin of ~45% and the GPUs are on average below that. Any improvement here would be good enough for AMD.
haukionkannel - Thursday, October 8, 2020 - link
AMD needs and wants to have better margins like that! So expect prices that are really close to the 3080!
brucethemoose - Thursday, October 8, 2020 - link
AMD already has hardware encoders.

DLSS is great. Actually, it's amazing. But it's locked down to certain games, which kinda sucks unless you only play AAAs :/.
We'll see how raytracing and AI performance shake out. CUDA on AMD is still quite funky, and the research side of things is still on CUDA, but there are some Vulkan ports of popular programs coming out.
TheinsanegamerN - Saturday, October 10, 2020 - link
If I wanted my games to look blurry I'd just play at 720p. No DLSS BS code required!
MisterAnon - Thursday, October 8, 2020 - link
DLSS is a gimmick. It is a marketing name for upscaling, and reduces quality compared to native resolution. It is not a "feature" you just dial up.
watzupken - Thursday, October 8, 2020 - link
Ray tracing is a feature on RDNA2. This is the same RDNA2 that is powering the Xbox and PS, which come with hardware ray tracing. In fact, with AMD providing hardware for all the new consoles, I suspect ray tracing optimisation may tip in AMD's favour. DLSS, I feel, is not going to make much traction considering it's been around since Turing, but how many games actually support it? I don't think it's easy to implement, which is why game developers are not in favour of it. Moreover, as mentioned, most RTX hardware is still capable of 1080p and 1440p, which most gamers are still using.
Gigaplex - Friday, October 9, 2020 - link
I'm not convinced that the console hardware is powerful enough for raytracing to be all that popular.
Spunjji - Monday, October 12, 2020 - link
I love the posts that basically say "I will buy an Nvidia card", but try to phrase it like it's objective. PhysX and HairWorks were the last big draws for these claims.haukionkannel - Thursday, October 8, 2020 - link
Yep! You definitely want the AMD option, and it seems that we now have options also for 4K gaming. And that is the most important thing. Nvidia did not price the 3080 at $700 for no reason! They knew what was coming! If you can just get a 3080 at $700...
lobz - Thursday, October 8, 2020 - link
Especially if you don't have to buy a new PSU just because you wanna upgrade your graphics card.
Spunjji - Friday, October 9, 2020 - link
That's the take-home. Based on the rumours I've been hearing this isn't their best card, but honestly I'm not sure whether or not to trust that. If it is the best they can do, it's still pretty impressive - and it pisses all over the naysayers who've been saying that a card with a 256bit bus would struggle to compete with the 3070. Those claims will no doubt be abandoned and replaced with slightly different ones, in the great tradition of fanboy shitposting.
Beaver M. - Friday, October 9, 2020 - link
I would pay $100 more for an Nvidia card, because it has DLSS and better support in games. Not to mention far superior drivers and features.

However, if AMD delivers more than this laugh of 8/10 GB VRAM, then I might be very likely to buy their card instead.
flyingpants265 - Friday, October 9, 2020 - link
VRAM is good. I know nothing about GPUs, but it seems cheap/easy to add compared to designing a whole CPU; you just add more RAM chips in parallel for a relatively small cost.
RSAUser - Wednesday, October 14, 2020 - link
VRAM is not that cheap and adds quite a bit in terms of power cost.
Tams80 - Thursday, October 8, 2020 - link
Eh. I wouldn't call coming very close to the 3080 a problem.

And comparisons to the 3090 are rather moot, as it's essentially a Titan, i.e. not really something any real number of people are going to buy (or even be able to buy at this rate).
Hifihedgehog - Thursday, October 8, 2020 - link
I took a second look at the link where I grabbed my numbers. It appears the 65 fps number I got for the RTX 3080 is wrong. Eurogamer's chart shows 65 at the top but there is an error in the system. The actual mean frame rate it shows is 60.9 fps at the bottom for the RTX 3080. So we are actually looking at RTX 3080 performance.
raywin - Thursday, October 8, 2020 - link
or could even find one to buy
Operandi - Thursday, October 8, 2020 - link
It would be exciting to see them beat Nvidia, but that's not really important at all. If they are competitive on the high end with performance, price, and power, that's what counts, and this looks pretty good. RTG has had a pretty rough going for the last few generations, so hopefully RDNA2 is their Zen 1 moment and things just get better from here.
JlHADJOE - Thursday, October 8, 2020 - link
TPU numbers for the 3080 FE at 4K
Source: https://www.techpowerup.com/review/nvidia-geforce-...
Borderlands 3: 70.3 fps
Gears 5: 84.4 fps
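The percentage gaps people keep quoting in this thread are easy to sanity-check yourself. A throwaway sketch, using only figures already quoted in these comments (AMD's teased 61 fps in Borderlands 3, TPU's 70.3 fps for the 3080 FE above, and Eurogamer's corrected 60.9 fps mean):

```python
def rel_diff_pct(a, b):
    """Percentage by which result a is faster (+) or slower (-) than baseline b."""
    return (a - b) / b * 100.0

# AMD's teased BL3 number vs TPU's 3080 FE figure: roughly 13% slower.
gap_tpu = rel_diff_pct(61.0, 70.3)
# Same number vs Eurogamer's corrected 3080 mean: essentially a wash.
gap_eurogamer = rel_diff_pct(61.0, 60.9)
```

Which baseline you pick (DX11 vs DX12 mode, detail preset, FE vs AIB) changes the answer more than the teaser numbers themselves do, which is half the argument in this thread.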
Slash3 - Thursday, October 8, 2020 - link
While it isn't immediately obvious which detail setting is used in the test, keep in mind that TPU does run BL3 in DX11 mode.
Showtime - Thursday, October 8, 2020 - link
Don't you think your expectations were way higher than reasonable? Look at how many gens it took AMD to catch Intel in gaming. AMD is further behind NVIDIA than they were behind Intel. I expected somewhere along the lines of them being competitive with the 3070's going by NVIDIA's $500 price tag. I assumed NVIDIA knew something when they didn't gouge us like they did with the OG RTX. Either way, this looks to be a huge boost for AMD. Oh, and NVIDIA already has faster tech going by all the old articles stating that they weren't bringing out their best since there was no competition/reason for them, but at least AMD has a 4K card finally.
Spunjji - Monday, October 12, 2020 - link
"Oh, and NVIDIA already has faster tech going by all the old articles stating that they weren't bringing out their best since there was no competition/reason for them"

Wut?
godrilla - Thursday, October 8, 2020 - link
Show me where one can buy a 3080 though?DigitalFreak - Thursday, October 8, 2020 - link
Show me where one can buy a 6900 though?
raywin - Thursday, October 8, 2020 - link
Considering they haven't launched, your comment is moot.
Morawka - Thursday, October 8, 2020 - link
The AMD 6000 series will be even harder to find. Ethereum miners will pick them up for their efficiency. While the 3080 mines well, it consumes a lot of power. Another factor playing into the limited availability will be TSMC's crowded 7nm node. AMD is already using a lot of their 7nm capacity building chips for next-gen consoles and Zen 3.
raywin - Thursday, October 8, 2020 - link
I'll just wait for the reply...
yeeeeman - Thursday, October 8, 2020 - link
We don't know if this is the top card...
BeepBeep2 - Thursday, October 8, 2020 - link
According to your source, EuroGamer/DigitalFoundry, the 3080 avg is 60.9 fps in Borderlands 3, not 65.
raywin - Thursday, October 8, 2020 - link
So the hypothetical 3080 FE that you can't buy, or the failing-caps AIB cards?
Dizoja86 - Friday, October 9, 2020 - link
How are people still whining about fictional "failing caps" in October? You should probably stay off of Reddit and YouTube if you're that gullible to sensational reporting.
Spunjji - Monday, October 12, 2020 - link
That the problem was related to the caps is fictional.

That there was a problem in the first place is not "sensational reporting", it's a fact.
Seems to be sorted now, though.
silencer12 - Thursday, October 8, 2020 - link
Wait for Zen 4 to blow past your expectations. As for graphics cards, you will need to wait for driver updates as always, so the cards themselves can perform optimally.
The 65 fps was a mistake that has since been retracted. It's 61 fps in BL3 for the RTX 3080 on Badass.
Gigaplex - Friday, October 9, 2020 - link
Well, being close this generation is a vast improvement over not having anything close in the previous generation.
GruntboyX - Friday, October 9, 2020 - link
If the 6000 series is competitive to within a few percent of the 3080, then that is a win in my book. Especially if it's cheaper to compensate for the few frames lost. The next question is power consumption.

If nothing else it's an option, because it will be in stock.
eva02langley - Friday, October 9, 2020 - link
They are within striking distance of each other...
eva02langley - Friday, October 9, 2020 - link
And we are talking about a 6800XT, not the 6900XT.
Spunjji - Monday, October 12, 2020 - link
There's no solid evidence for that either way, yet.
Qasar - Monday, October 12, 2020 - link
But it is unlikely AMD would show off the 6900 XT in a teaser though; what would be left to show on Oct 28?
quadibloc - Friday, October 9, 2020 - link
I don't know. If they come close, but they're half the price, their product should sell well. And, for competitive gamers, what matters is performance with useless stuff like ray tracing turned off anyways. Given the huge leap in performance that the 3080 represented, I'll be pleased if AMD manages to come close; not even coming close is when I'd start worrying.
teamet - Saturday, October 10, 2020 - link
Efficiency, and therefore heat and noise, will be interesting to see though.

RX 6000 is on TSMC 7nm and RTX 3000 on Samsung 8nm... Do we know if either of them is made using EUV?
Alexvrb - Sunday, October 11, 2020 - link
Ian quoted Anand in the Zen 3 article, but I think it bears repeating here: "There are no bad products, only bad prices."

Performance numbers by themselves don't paint the whole picture. If they are 95% of a 3080 at 85% the cost, that's a good value. Personally I never spend more than ~$300 on a GPU anyway, so the cards with the best bang for the buck in THAT price range are what interests me.
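That "95% of the performance at 85% of the cost" framing is just performance per dollar, and it is worth making the arithmetic explicit. A quick illustrative sketch (the 0.95/0.85 figures are the hypothetical from the comment above, not real prices):

```python
def perf_per_dollar(relative_perf, relative_price):
    """Value relative to a baseline card; 1.0 means the same bang for the buck."""
    return relative_perf / relative_price

# 95% of a 3080's performance at 85% of its cost:
value = perf_per_dollar(0.95, 0.85)  # ~1.12, i.e. about 12% better perf per dollar
```

By the same logic, a card that is 5% slower at the same price is strictly worse value, which is why the pricing argument dominates this whole thread.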
TheinsanegamerN - Monday, October 12, 2020 - link
Oh no! 5%?!? The HORROR.

Who cares about the 3090? Oh, AMD is 20% slower, but the 3090 costs $1500. It isn't a viable option for 99% of people. Just like the Titans before it.
If AMD gets within a few % of a 3080 they have a winner on their hands.
Samus - Tuesday, October 13, 2020 - link
Nvidia is a goliath; I don't expect AMD to ever surpass them at the ultra high end now. Unless Nvidia makes some Intel-esque mistake :)
Unashamed_unoriginal_username_x86 - Thursday, October 8, 2020 - link
I imagine you're right that they'd leave the halo card quiet until they can have their "one more thing" moment like with the 5950X. I wish there were some convenient Nvidia numbers to compare to those benchmarks though...Hifihedgehog - Thursday, October 8, 2020 - link
I took a second look at the link where I grabbed my numbers. It appears the 65 fps number I got for the RTX 3080 is wrong. Eurogamer's chart shows 65 at the top but there is an error in the system. The actual mean frame rate it shows is 60.9 fps at the bottom for the RTX 3080. So we are actually looking at RTX 3080 performance.DigitalFreak - Thursday, October 8, 2020 - link
There is no way in hell that their 2nd tier card performs that close to a 3080.Spunjji - Friday, October 9, 2020 - link
Why not? The 3080 is Nvidia's second-tier card. I don't think it's terribly likely, but it's not impossible either.I'm more interested in numbers at 1440p, because AMD's architecture shouldn't drop off a cliff at lower resolutions the way Ampere does.
Gigaplex - Friday, October 9, 2020 - link
The 3090 isn't really in the same realm, it's more of a Titan/workstation replacement card. I'd still consider the 3080 as the top tier.
eva02langley - Friday, October 9, 2020 - link
They labeled the thing as a gaming card... productivity drivers are not available for the 3090... it is a gaming card.
Spunjji - Monday, October 12, 2020 - link
Your own personal definitions don't reflect the reality that they've released a £1500 gaming card and everyone is trying to rationalise it as a "Titan". If it were a Titan, it'd be called a Titan and have the appropriate drivers. It doesn't.
eva02langley - Friday, October 9, 2020 - link
Nvidia Ampere is one of the worst uarchs they've made. It is a datacenter card while RDNA 2.0 is a gaming uarch. It is more than plausible.
andrewaggb - Thursday, October 8, 2020 - link
Nothing wrong with 3070-3080 territory so long as the price, power, heat, noise, drivers, etc. all hold up. Honestly, considering how far behind AMD has been on the GPU front, if they're in the same range on these metrics that would be great. Guess we'll see.
Zen 3 looks good. It'll be interesting to see how it compares to Rocket Lake, but it sounds to me like probably equivalent performance, and Zen 3 will use less power and have higher core offerings. Not a great spot for Intel to be if that's the case.
raywin - Thursday, October 8, 2020 - link
the 30 series launch is a complete mess, even by nvidia standards
Jorgp2 - Thursday, October 8, 2020 - link
Will this have HBM?
DigitalFreak - Thursday, October 8, 2020 - link
No. GDDR6.
james4591 - Thursday, October 8, 2020 - link
Even if it can get near a 3080's performance, the question is, what is the price tag going to be? If it replaces the RX 5700XT at the same price tag, then AMD is going to really hurt Nvidia in the price-to-performance ratio.
The good part is, the AMD driver team has really turned things around for themselves. The 20.9s have been really well done.
I'll say it this way: if the Reference model gets near the 3080, AIB partners like PowerColor and Sapphire will beat the 3080. If it ends up cheaper, then Nvidia will have a problem.
hehatemeXX - Thursday, October 8, 2020 - link
Either the CU/GCN architecture isn't scaling well, or they used the second- or third-tier chip. I don't think they achieved much in terms of raw IPC, but rather through simple clock speeds.
Demiurge - Friday, October 9, 2020 - link
What are you talking about?
silverblue - Friday, October 9, 2020 - link
I think the idea here is either it's not the top-end Navi with its 80 CUs (in which case increased clock speed over the 5700XT may have played a significant part), or having double the CUs doesn't yield anywhere near double the performance in these titles. It's a bit soon to tell.
haukionkannel - Thursday, October 8, 2020 - link
There will be a 40 CU 6700 that replaces the 5700. The 6900 has nearly 80 CUs, so it will be double the price of the 5700xt, aka double the 6700xt...
Spunjji - Friday, October 9, 2020 - link
Doesn't make sense to assume that a card with double the CUs will cost twice as much - the cost of the card isn't just the silicon.
0siris - Thursday, October 8, 2020 - link
If they can do 3070 performance at 200-225W for $400, they can have my money. I don't care if they get even remotely close to a 3080 (let alone a 3090), because I can't justify spending that much money on a graphics card anyway.
raywin - Thursday, October 8, 2020 - link
even if you could justify it, where are you going to buy it?
haukionkannel - Thursday, October 8, 2020 - link
Yep! So true! The 6900 will be an expensive halo gpu. The Nvidia 3070 and amd 6700 are the upper limit where most people go! $500 is a lot of money for a gpu!
To most people the 3060 at $400 is too much... and we will get something like a 6600 for that segment from amd. Aka a cut-down 6700.
The 6900xt will be high above those price-wise!
hehatemeXX - Thursday, October 8, 2020 - link
Same here! I doubt the 3070 will be available :(
Spunjji - Friday, October 9, 2020 - link
This is where I'm at too, but it's still interesting to see how they line up elsewhere.
alufan - Thursday, October 8, 2020 - link
I can't believe there are actually people writing off AMD's GPUs based on this small preview. Do we really expect AMD to show us the best of what they have... in a preview? If they did that, then the launch presentation is going to be a pretty negative affair. My bet is this was a low-end or midrange card. Nvidia knows what's in the pipeline, that's why they have priced the 3000 series as they have. AMD I think will be equal to or even beat Nvidia this time
Demiurge - Thursday, October 8, 2020 - link
I don't understand either. Waiting for more information before making a judgement may be prudent. The performance of either card may be misleading given limited information.
I'm skeptical about non-DLSS performance with Nvidia. With AMD, I am skeptical about the RT performance. Need more information...
Gigaplex - Friday, October 9, 2020 - link
"do we really expect AMD to show us the best of what they have...in a preview"
Yes. Marketing always oversells the product.
eva02langley - Friday, October 9, 2020 - link
AMD, the most secretive company of the last 3 years, the company hiding the 3950x, Radeon 7, RDNA, ZEN 3, Threadripper 3000... and so on... is going to spill the beans on their biggest hyped announcement for PC gamers in the last 5 years in a teaser...?
Well, news for you, they showed you the 6800XT's performance.
eva02langley - Friday, October 9, 2020 - link
They have shown you what they wanted to show you... and they did it to spoil Nvidia's party, simple as that. This is called MARKETING...
Gigaplex - Saturday, October 10, 2020 - link
"is going to spill the beans on their biggest hyped announcement for PC gamers in the last 5 years"
Zen 3 is more significant to PC gamers than the 6800XT at this point.
liquid_c - Friday, October 23, 2020 - link
There was nothing secret about either of those products.
Spunjji - Monday, October 12, 2020 - link
"Yes. Marketing always oversells the product."
Didn't happen with RDNA, or with Ryzen 1000, 2000, or 3000. But sure, "always".
a300thanhlong - Thursday, October 8, 2020 - link
If performance is good then the scalpers will scalp AMD; we then have a chance to buy the RTX 3080
raywin - Thursday, October 8, 2020 - link
nah, the scalper benches are deep
watzupken - Thursday, October 8, 2020 - link
I feel the performance looks decent, but with only 3 games being shown here, it's still too early to conclude on performance. I don't think it will beat the RTX 3080 for sure, but if it's just slightly slower with a better price tag and power consumption, I feel it will still do well. The 16GB VRAM will be sufficient for longer than 10GB. Sure, Nvidia may release a 20GB version, but at a higher cost is what I feel. So if AMD can undercut them in terms of cost, they may have a winner (if they don't botch up the drivers).
Pneumothorax - Thursday, October 8, 2020 - link
Considering that the current AVAILABLE price of the 3080 is really $1500 from scalpers, AMD should price the RX6000 at $700, as they will sell out at that price for months. The 3080 won't be readily available until spring 2021. I'd prefer AMD and the AIBs to make some money for once from their GPUs instead of the scalpers.
zodiacfml - Friday, October 9, 2020 - link
As I expected, pretty much 90% of the 3080. Pricing is where Nvidia and AMD will fight it out though, where Nvidia has to delay the 3070 to get their competitive pricing right.
nadim.kahwaji - Friday, October 9, 2020 - link
Guys, really, it's so strange that we still have no RTX review???
Qasar - Friday, October 9, 2020 - link
as AT has already said, it seems the delay is due to the fires that are ongoing in california.
funkforce - Friday, October 9, 2020 - link
I really don't think we will see a review @nadim.kahwaji @Qasar
You see, it was "hopefully" gonna be ready in time for the release of the 3090 a week later. But the 3090 release was 2 weeks ago.
We who have been reading the site since it was founded and have good memories have unfortunately seen this same thing come to pass many a time with GPU reviews especially.
This site is the best, has the best writers and a pretty civilized atmosphere in the comment sections. Unfortunately, when they miss a "deadline" the pressure is off and it's more common that the review never materializes at all.
This would be fine if they wouldn't keep promising "it's coming" and then never deliver, stringing readers along. When Anand worked here, I don't think a review was ever late. Now there is always an excuse, and then that excuse is milked to the max and it just runs out into the sand.
It's really sad cause the last year or so I don't think any big deadlines were missed and things were looking up. I still believe and hope though, checking the site everyday. Maybe one day... :)
flyingpants265 - Friday, October 9, 2020 - link
As far as I can remember, smartphone reviews were late, but GPU reviews weren't, really.
This site has tanked over time, in more ways than one. The current owners/editors know it, it's not really their fault if they don't have the proper talent/skill to be number one anymore.
Spunjji - Monday, October 12, 2020 - link
"it's not really their fault if they don't have the proper talent/skill to be number one anymore"
If you believe that, why are you even here? Talent isn't the same as resources, but whatever, you go ahead and grind that axe.
Qasar - Saturday, October 10, 2020 - link
funkforce the fires in california are STILL burning, still getting the smoke creep in where i live from time to time. if you call mandatory evac orders an excuse, then i don't know what to say. i think ryan smith lives in california, and he reviews the gpus on here. google california wild fires and you will see.
https://www.theguardian.com/us-news/2020/oct/05/ca...
Spunjji - Monday, October 12, 2020 - link
"When Anand worked here, I don't think a review was ever late."
Long-term reader here to confirm that people have been whining about late reviews for as long as Anandtech has existed. They still exist even in the era of breathless pre-release YouTube promo videos because nobody else goes into the same depth.
Icehawk - Tuesday, October 13, 2020 - link
Anand cashed out and that was really the end of the site as we knew it; still decent, but it's lost a lot over the years. Was sad when HardOCP closed, same deal - the main person got scooped up by the "real" industry and left us all poorer.
Lord of the Bored - Friday, October 9, 2020 - link
In before nVidia announces a 3080 Super!Beaver M. - Friday, October 9, 2020 - link
Doubt that, because the design is at its limits already. Even the 3090 can only get ~10% more on average.
eva02langley - Friday, October 9, 2020 - link
We all know that 20GB variants are coming around December. It is the worst kept secret.
Qasar - Saturday, October 10, 2020 - link
i bet that will make all those that have already bought these cards happy...
Gigaplex - Saturday, October 10, 2020 - link
That happens every generation.
Qasar - Saturday, October 10, 2020 - link
but so soon between releases? doubtful, there is usually a good 6 months in between, not 2 :-)
Spunjji - Monday, October 12, 2020 - link
They're already deep into rationalising a deeply ill-advised early purchase, nothing's gonna change on that front. Never has, never will.
Beaver M. - Sunday, October 11, 2020 - link
Eh... You do know the difference between VRAM size and GPU/VRAM performance, don't you?
GruntboyX - Friday, October 9, 2020 - link
Well hopefully it's good enough to cool demand enough to allow nvidia to catch up on supply and reduce the scalping.
Botched1 - Friday, October 9, 2020 - link
Seems like it is kind of a moot point... Since I can't actually BUY a 3080 or 3090, it really doesn't matter how fast it is now, does it?
If I can actually BUY an RX 6000 when released, I guess that's what I'm going to get. Slower or not.
quadibloc - Friday, October 9, 2020 - link
"Aiming for 3080" with a question mark? Surely if AMD doesn't manage, on October 28th, to announce GPUs which at least approach performance parity with the 3080 and even the 3090, they would be in trouble? Of course, if they stick to the lower end of the market, but offer superior value there, they can certainly stay in business.
Smell This - Monday, October 19, 2020 - link
Raising that bar, Bubba?
AMD will be fine as long as they price it right and run some enterprise software on it.
shabby - Friday, October 9, 2020 - link
Watch Nvidia ramp up production of their cards, otherwise amd will gobble up their lost sales.
eva02langley - Friday, October 9, 2020 - link
They can't, the yields are horrible.
shabby - Friday, October 9, 2020 - link
Oh? How bad are they? Any links to the info?
Dizoja86 - Friday, October 9, 2020 - link
I can guarantee they don't have any links to reliable info on bad yields. The issue isn't with yields. It's that Nvidia didn't start production until mid/late August, and AIBs were flailing to pump cards out in a matter of weeks.
Gigaplex - Saturday, October 10, 2020 - link
It's not just the AIBs struggling to pump cards out. The FE is supply constrained too.
Spunjji - Monday, October 12, 2020 - link
Nobody but Nvidia has reliable info on their yields, but we can draw inferences. What we have to go on is:
Customers can't get hold of the cards.
There's no competition for wafers on Samsung 8nm, so they're not capacity constrained.
Nvidia are saying "yields are great".
Nvidia are also saying "supply will be constrained through early 2021".
They couldn't know that last part unless they can predict either supply or demand precisely, and if it's demand they're predicting precisely, then why didn't they make more of these cards before launch? So: either they're incompetent and underestimated demand on a card they chose to market as "up to 2x faster (creative fiction)" and "the biggest generational leap ever (lie)", they're deliberately constraining supply to drive prices up so AIBs / resellers can cash in, or yields are bad and they're lying about it.
Given that they've lied every other time they've had terrible yields, I'd happily bet that's at least a contributing factor.
Gigaplex - Saturday, October 10, 2020 - link
Horrible enough that they can't produce enough to match demand.
mrvco - Friday, October 9, 2020 - link
If they can get close to 3080 performance and undercut the 3080 on price and availability, that's going to be a huge win for AMD. That being said, I'm not interested in anything over $500, so my interest is in what replaces the 5700XT at the $400-ish price point.
wordlv - Friday, October 9, 2020 - link
Reading these comments is fun indeed. Do people really believe that, after all the hype, AMD showed their best "Big Navi" as an afterthought at a CPU press conference?
eva02langley - Friday, October 9, 2020 - link
Delusional fanboys rooting for Nvidia and Intel do...
Tomatotech - Friday, October 9, 2020 - link
This comment section reads like the same 3 comments repeated over and over endlessly.
eva02langley - Friday, October 9, 2020 - link
They showed the 6800XT with 64 CUs. They know they are beating the 3080, but they just want Nvidia to panic and do something stupid. They are doing what they did with the Radeon 7. I cannot believe some people here really believe they have shown their biggest die in a teaser.
eva02langley - Friday, October 9, 2020 - link
It is called a buildup... you don't blow it by scrapping your hype train.
Gigaplex - Saturday, October 10, 2020 - link
"They are doing what they did with Radeon 7."
Releasing a dud? That card wasn't successful.
Targon - Monday, October 12, 2020 - link
AMD likes to play with the competition. Push them into releasing early (NVIDIA Ampere, and note there are no cards available, cards that crash with black screens, driver problems, you name it). AMD lets just enough information out there to force the competition into showing its hand. We have seen AMD playing with Intel from 2017 through 2020, making Intel talk big, but showing time and time again that Intel doesn't have anything ready to compete with what AMD is doing. AMD set up for NVIDIA to burn its customers by releasing some Super cards well before the one-year mark of the original RTX 2060, so those with a 2060 from launch felt burned, and those who bought a 2070 or 2080, yeah, Super cards came out for the same price with better performance, just to stay ahead of the Radeon 5700XT.
Radeon VII wasn't even a specific gaming card. The Radeon Instinct card based on 7nm with 16GB of memory... just slap a new name on it and call it Radeon VII. It was FULLY a Radeon Instinct card with a different name, nothing more. RDNA was really the first card that AMD intentionally planned to release as a consumer gaming card since Vega 56 and Vega 64.
Smell This - Monday, October 19, 2020 - link
"They are doing what they did with Radeon 7."
"Releasing a dud? That card wasn't successful."
_______________________________________
I wish I had one... could have been a monster for me with AMD Radeon Pro Software and Radeon ProRender. No cash for me |;--(
Targon - Monday, October 12, 2020 - link
AMD has shown that the management knows how to play games with the competition. Leaking numbers that may or may not be correct to push the competition into raising or lowering prices is a part of those games. Show numbers far above or below what they really are, and the competition responds. Show numbers right on target when your product is a lot better than expected, and the competition will respond, but possibly outmaneuver you to minimize the appeal of the new products.
I hope you are correct that the numbers provided by AMD are from the 6800XT, but we don't really know right now. No matter what, we can expect more than double the performance of Vega64 or Radeon VII with Big Navi, and that means it will be a solid card.
croc - Saturday, October 10, 2020 - link
THIS is why careful consumers wait for all the facts to be in before committing a large wad of cash on a component. I think it is sometimes called 'buyer's remorse' when not adhered to.
Then there is 'buyer's frustration' when their needs can't be met. As in, 'what do ya mean I can't run a GPU and a PCIe x16 M.2 RAID card on an AM4?' Or, 'When will the Zen 3 Threadrippers arrive?' Rocket Lake probably won't help, either... Looks to sit on a spec that just matches AM4. Barely.
DejayC - Sunday, October 11, 2020 - link
I hope this isn't Vega 2.0. When the Vega 64 came out it matched or even beat the GTX 1080 in some titles, but it was instantly sold out because of miners, way louder, and much more power hungry. Now if the 6000 series can match the 2080 at similar power levels and fan noise, and they get supply in check, we might have a solid team Red offering.
just4U - Tuesday, October 13, 2020 - link
Well... considering just how power hungry the 3080 is, it's probably safe to say that it won't be worse in that area... might be on par power-wise though (knowing AMD)