This afternoon NVIDIA announced plans for a public “GeForce Gaming Celebration” later this month, taking place amidst the Gamescom expo in Cologne, Germany. With the company promising talk of games and “spectacular surprises,” this marks the first real GeForce-branded event NVIDIA has held this year, and it comes just over two years after the last time they held such a large event opposite a major gaming expo.

The world’s biggest gaming expo, Gamescom 2018, runs August 21-25 in Cologne, Germany. And GeForce will loom large there -- at our Gamescom booth, Hall 10.1, Booth E-072; at our partners’ booths, powering the latest PC games; and at our own off-site GeForce Gaming Celebration that starts the day before [August 20th].

The event will be loaded with new, exclusive, hands-on demos of the hottest upcoming games, stage presentations from the world’s biggest game developers, and some spectacular surprises.

The timing of the event, along with the vague description of what it’s about, is sure to drive speculation about what exactly NVIDIA will have to show off, especially as we’re approaching the end of NVIDIA’s usual 24–30 month consumer product cycle. Their 2016 event opposite DreamHack was of course the reveal of the GeForce 10 Series. And the date of this year’s event – August 20th – happens to be the same day as the now-canceled NVIDIA Hot Chips presentation about “NVIDIA’s Next Generation Mainstream GPU.”

For what it’s worth, NVIDIA’s 2016 teaser didn’t say anything about the event in advance – it was merely an invitation to an unnamed event – whereas this teaser specifically mentions games and surprises. So make of that what you will.

Meanwhile, as noted previously, this is a public event, so NVIDIA says there is a limited amount of space for Gamescom attendees and other locals to register and catch it in person. Otherwise, like most other NVIDIA events, it will be live streamed for the rest of the world, kicking off at 6pm CET.

Source: NVIDIA

41 Comments

  • PeachNCream - Tuesday, July 31, 2018

    The 1050 isn't an across-the-board power connector-less GPU. It's a bit more of a toss-up whether or not the OEM includes one. Since it's on the threshold, it isn't something I was willing to include. As I was typing that statement about the 1030, it crossed my mind to make that comment, but I felt like I'd have been anticipating and rebutting the point in advance, and that would have been rather rude, so I didn't take that approach.
  • MajGenRelativity - Tuesday, July 31, 2018

    If it consumes 75W, which the PCIe slot supports, and comes without a power connector, then I don't see how it's on the threshold. It's up to OEMs if they want to overclock and increase the power consumption to do so, same as for every card.
  • PeachNCream - Tuesday, July 31, 2018

    Not all PCIe slots support the 75W specification. For instance, Dell Optiplex boxes, good candidates for low-end graphics card upgrades when they enter the gray market as refurbs or second-hand systems, have had PCIe x16 slots that can't handle more than 37W. Furthermore, the 1050, even at stock clocks, sometimes ships with an external power connector. Since that's outside of our control, it really isn't fair to say the 1050 is free from direct-from-PSU power. Conversely, there are no 1030 cards that ship with said connector. In fact, it'd be laughable if they did, given the TDP is around 30W.

    At this point, I think we're beating a dead horse. I'm not looking forward to the 11-series launch because I have set my expectations based on a lot of the things I've already explained. No amount of back-and-forth between us is going to change that. NVIDIA might be able to get me a bit excited if the company can deliver a credible product stack at a reasonable price range, but to say I'm skeptical of that possibility is an understatement.
  • MajGenRelativity - Tuesday, July 31, 2018

    I mean, I'm fine with dropping the conversation. That being said, any PCIe slot that doesn't support the spec is out of spec, and shouldn't be counted.
  • PeachNCream - Tuesday, July 31, 2018

    I'm fine with not counting the slots that don't support the spec, but it's equally fair to discount the cards that straddle the line and may or may not feature an external power connector.
    :D
  • MajGenRelativity - Tuesday, July 31, 2018

    It's definitely an ambiguous situation. I can agree on that.
  • MajGenRelativity - Tuesday, July 31, 2018

    As a footnote, Intel's and AMD's TDPs have crept up for the high end too. The Core i9-7980XE has a TDP of 165W (greater than that of a GTX 1070), with practical power consumption higher than that (especially if overclocked). AMD's first-generation Threadripper lineup has a TDP of 180W (equal to a 1080), and while Threadripper 2's rumored TDPs are all over the place, they go up to 250W (equal to a 1080 Ti). Lastly, the AMD FX-9590 had a 220W TDP.
  • PeachNCream - Tuesday, July 31, 2018

    Yup, those TDP numbers for TR and the i9 are just as shamefully pathetic, and you're absolutely right to call out both Intel and AMD for it. The difference on the CPU side is that there's still a wide range of choices with reasonable TDP numbers, and the more costly, hotter-running parts add little to no performance advantage in a significant number of modern computing tasks, including the enthusiast focus on gaining FPS in a benchmark. There are marginal differences between a 65W-TDP CPU running 4 cores and a 180W model with 16 cores (for now - and I admit that can certainly change, and said change will result in a valid set of complaints about the stupidity of CPU products in the future too). The probability of that changing anytime soon is low, since the PC gaming market is beholden to what will be ported over from consoles, or what will also run on consoles in the case of parallel development, so CPU power as a factor and as a TDP concern is much reduced for a number of reasons.
  • MajGenRelativity - Tuesday, July 31, 2018

    Nevertheless, they're still the "high end". You cannot criticize the GPU high end, while completely ignoring the CPU high end. It's unfair to compare the power consumption of a $700 GPU to a $100 CPU. They're not even aimed at the same markets.
  • PeachNCream - Tuesday, July 31, 2018

    I didn't leave out the criticism. In fact, I agree with you completely that the upper end of the price spectrum for modern CPUs deserves every bit of the criticism you've leveled against it. I'll even go as far as to say that, since there's no benefit for high-TDP processors in a large number of home PC use cases, those chips make even less sense than whatever NVIDIA is selling in the XX80-class ranges.
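
As an aside on the slot-power back-and-forth above, here is a minimal sketch of the arithmetic being argued, using the approximate figures quoted in the thread (75W for a standard PCIe x16 slot, roughly 37W for the constrained OEM slots mentioned, ~30W for a GT 1030, and 75W for a reference GTX 1050). The card list and the factory-overclocked figure are illustrative assumptions, not official specifications.

# Rough sketch of the slot-power math discussed in the comments above.
# The numbers are the approximate figures from the thread, not official specs,
# and the card list (especially the factory-OC entry) is purely illustrative.

PCIE_SLOT_BUDGET_W = 75      # what a standard PCIe x16 slot can supply per the spec
LIMITED_SLOT_BUDGET_W = 37   # the constrained OEM slots mentioned above (assumed figure)

# Approximate board power figures in watts.
CARDS = {
    "GT 1030 (reference)": 30,
    "GTX 1050 (reference)": 75,
    "GTX 1050 (factory OC, hypothetical)": 90,
}

def needs_external_power(board_power_w: float, slot_budget_w: float = PCIE_SLOT_BUDGET_W) -> bool:
    """True if the card draws more than the slot alone can deliver."""
    return board_power_w > slot_budget_w

if __name__ == "__main__":
    for name, watts in CARDS.items():
        for slot_name, budget in (("standard slot", PCIE_SLOT_BUDGET_W),
                                  ("limited OEM slot", LIMITED_SLOT_BUDGET_W)):
            verdict = ("needs a PCIe power connector" if needs_external_power(watts, budget)
                       else "slot power alone is enough")
            print(f"{name}: {watts} W vs {budget} W {slot_name} -> {verdict}")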
