Meet The GeForce GTX 670

Thanks to GK104’s relatively low power consumption compared to past high-end NVIDIA GPUs, NVIDIA has developed a penchant for small cards. While the GTX 680 was a rather standard 10” long, NVIDIA also managed to cram the GTX 690 into the same amount of space. Meanwhile the GTX 670 takes this to a whole new level.

We’ll start at the back, as this is really where NVIDIA’s fascination with small size makes itself apparent. The complete card is 9.5” long; however, the actual PCB is far shorter at only 6.75”, a full 3.25” shorter than the GTX 680’s PCB. In fact it would be fair to say that rather than strapping a cooler onto a card, NVIDIA strapped a card onto a cooler. NVIDIA has certainly done short PCBs before – such as with one of the latest GTX 560 Ti designs – but never on a GTX x70 part. Given the similarities between GK104 and GF114, however, this isn’t wholly surprising.
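The space savings are easy to quantify. As a quick sketch using the lengths quoted above (the 10” figure is the standard GTX 680 PCB length):

```python
# Card dimensions quoted in this review, in inches.
gtx680_pcb = 10.0    # GTX 680 reference PCB length
gtx670_pcb = 6.75    # GTX 670 reference PCB length
gtx670_card = 9.5    # GTX 670 total card length, cooler included

pcb_savings = gtx680_pcb - gtx670_pcb       # how much shorter the GTX 670 PCB is
cooler_overhang = gtx670_card - gtx670_pcb  # how far the cooler extends past the PCB

print(f'PCB savings: {pcb_savings}"')        # 3.25"
print(f'Cooler overhang: {cooler_overhang}"')  # 2.75"
```

In other words, nearly a third of the card's total length is cooler with no PCB underneath it.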

In any case this odd pairing of a small PCB with a large cooler is no accident. With a TDP of only 170W NVIDIA doesn’t necessarily need a large PCB, but because they wanted a blower they did need a large cooler. The positioning of the GPU and various electronic components meant that the only place to put a blower fan was off the PCB entirely, as the GK104 GPU is already fairly close to the rear of the card. Meanwhile the choice of a blower seems largely driven by the fact that this is an x70 card – NVIDIA did an excellent job with the GTX 560 Ti’s open air cooler, which was designed for the same 170W TDP, so the choice is effectively arbitrary from a technical standpoint (there’s no reason to believe $400 customers are any less likely to have a well-ventilated case than $250 buyers). Accordingly, it will be NVIDIA’s partners stepping in with open air coolers of their own design.

Starting as always at the top, as we previously mentioned the reference GTX 670 is outfitted with a 9.5” long, fully shrouded blower. NVIDIA tells us that the GTX 670 uses the same fan as the GTX 680, and while the two are nearly identical in design, based on our noise tests they’re likely not identical. On that note, unlike the GTX 680 the fan is no longer placed high to line up with the exhaust vent, so the GTX 670 is a bit more symmetrical in design than the GTX 680 was.


Note: We disassembled the virtually identical EVGA card here instead

Lifting the cooler we can see that NVIDIA has gone with a fairly simple design here. The fan vents into a block-style aluminum heatsink with a copper baseplate, providing cooling for the GPU. Elsewhere we’ll see a moderately sized aluminum heatsink clamped down on top of the VRMs towards the front of the card. There is no cooling provided for the GDDR5 RAM.


Note: We disassembled the virtually identical EVGA card here instead

As for the PCB, as we mentioned previously, due to the lower TDP of the GTX 670 NVIDIA has been able to save some space. The VRM circuitry has been moved to the front of the card, leaving the GPU and the RAM towards the rear and allowing NVIDIA to simply omit a fair bit of PCB space. Of course with such small VRM circuitry the reference GTX 670 isn’t built for heavy overclocking – like the other GTX 600 cards, NVIDIA isn’t even allowing overvolting on reference GTX 670 PCBs – so it will be up to partners with custom PCBs to enable that kind of functionality. Curiously, only 4 of the 8 Hynix R0C GDDR5 RAM chips are on the front side of the PCB; the other 4 are on the rear. We typically only see rear-mounted RAM on cards with 16/24 chips, as 8/12 will easily fit on the same side.
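For reference, the 8-chip count follows directly from the card’s memory configuration. A quick sketch, assuming the reference 2GB frame buffer on GK104’s 256-bit memory bus:

```python
# GTX 670 reference memory configuration (per NVIDIA's specifications).
total_mem_mb = 2048  # 2GB of GDDR5
bus_width = 256      # memory bus width in bits
chips = 8            # Hynix R0C GDDR5 packages on the PCB

width_per_chip = bus_width // chips  # each chip drives a 32-bit channel
density_mb = total_mem_mb // chips   # capacity per chip

print(f"{chips} chips x {width_per_chip}-bit = {bus_width}-bit bus")  # 8 chips x 32-bit
print(f"{density_mb}MB (2Gb) per chip")                               # 256MB per chip
```

With 2Gb chips being a standard GDDR5 density, 8 chips is the natural fit, and splitting them 4 front/4 rear is purely a layout decision on NVIDIA’s shortened PCB.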

Elsewhere at the top of the card we’ll find the PCIe power sockets and SLI connectors. Since NVIDIA isn’t scrambling to save space like they were with the GTX 680, the GTX 670’s PCIe power sockets are laid out in a traditional side-by-side manner. As for the SLI connectors, since this is a high-end GeForce card NVIDIA provides 2 connectors, allowing for the card to be used in 3-way SLI.

Finally at the front of the card NVIDIA is using the same I/O port configuration and bracket that we first saw with the GTX 680. This means 1 DL-DVI-D port, 1 DL-DVI-I port, 1 full size HDMI 1.4 port, and 1 full size DisplayPort 1.2. This also means the GTX 670 follows the same rules as the GTX 680 when it comes to being able to idle with multiple monitors.


414 Comments


  • chizow - Thursday, May 10, 2012 - link

    Except in this case, the "underdog" AMD initiated this pricing debacle with the terribly overpriced 7970 and the "leader" Nvidia was content to follow, selling their mid-range ASIC GK104 as a high-end SKU.

    While Nvidia did improve the situation with their GK104 pricing, it's still by far the worst increase we've seen from a price:performance perspective in the last decade of GPUs.
  • CeriseCogburn - Sunday, May 13, 2012 - link

    You're in the GTX 670 review; it's $399, it came out fast, it's awesome, it beats the more expensive flagship 7970, and it destroys any historical price/perf you've got handy.
    Utterly decimates it.
    Best in years, best in a decade is now the line you should be using for the GTX 670.
  • Crazyeyeskillah - Thursday, May 10, 2012 - link

    Don't buy it if you can't afford it; other people will gladly take your place in line. I'm just glad we have some next-gen products from both companies to choose from. If anything we are very fortunate to have so many products available that can max out all our games at present.
  • chizow - Thursday, May 10, 2012 - link

    It's not a matter of being able to afford it, it's about standards and expectations, which I'm not willing to lower for substandard offerings – products that are neither essential for survival nor expire on their own due to wear.

    They're high-priced toys and nothing more and there's *PLENTY* of other distractions in that endless category of entertainment to compete with, especially when these new offerings don't offer compelling reasons to upgrade over my last-gen $500 GPUs.

    The other consideration is that buying these parts at high premiums sets a bad precedent, where the consumer gets *LESS* for their money and similarly gives Nvidia free rein to set a new bar for premium price and performance in the future.

    We've already gotten a taste of this with the GTX 690 for $1000!!! What do you think is next with GK110? Why don't you look historically at the reaction to the 8800 Ultra at $830? Nvidia is *STILL* trying to downplay that part and justify their pricing decisions, but with a mid-range ASIC like GK104 selling for $500 premium flagship prices, Nvidia is once again positioned to sell an "Ultra" part at ultra-premium pricing. For what? A part that performs as you would've expected from a $500 flagship to begin with, roughly +50% more than the last-gen flagship.....
  • Crazyeyeskillah - Thursday, May 10, 2012 - link

    I don't buy any of that, wahhh
  • CeriseCogburn - Thursday, May 10, 2012 - link

    Charlie D from SemiAccurate buys it 100%, why no U?
  • chizow - Thursday, May 10, 2012 - link

    Yeah I know, you're too busy blithely buying overpriced GPUs to understand what I'm talking about.
  • CeriseCogburn - Friday, May 11, 2012 - link

    Maybe if you provided a percentage with a simple text chart – heck, you don't need to do ten years – the doubter could gauge the level of your sourness properly. After all, .01% less of a jump in performance below the worst jump in the last ten years fits all of your descriptions 100%.
    So why are you moaning about .01%?
  • SlyNine - Thursday, May 10, 2012 - link

    Well when the 7970 came out that was by far the worst. It's a lot better now, but I agree this jump hasn't been one for value at all. People don't remember the great video cards, I guess. The 5870 was the last one in my eyes.
  • CeriseCogburn - Friday, May 11, 2012 - link

    5870 jumped from the 4890. Now please, let's see this enormous perf increase somewhere... as compared to the current.
    No less than that, the 5870 was replaced by the 6870, also not so great a leap.
    We keep hearing about these ephemeral perf increases, but so far NO ONE, and I mean NO ONE, has provided even a simple percent increase chart – and you know why?
    Because you people love to quaff out moaning fantasies like "double performance" and say things like "the great GTX 880!" (after of course bitching for four years that it was extremely overpriced and never worth it).
    So let's see it, my friends; where pray tell is this great alluded-to but never actually defined gigantic performance increase?
    4890-5870-6970 ????
    Come on now, let's have one of you true believers do the work and give us a good percentage comparison we don't have to rip apart for immense biased game picking.
    Should take one of you all but 10 minutes. Charts are everywhere.
    Use the Anand bench for cripes' sake; I'm sick of hearing the moanings and fantasies with no simple effort at a commonly available percentage – and you know why – because I'm calling BS!
    Now – let's see it!
