Meet The EVGA GeForce GTX 670 Superclocked

Our second card of the day is EVGA’s GeForce GTX 670 Superclocked, which in EVGA’s hierarchy is their first tier of factory overclocked cards. EVGA bins GTX 670s and promotes some of them to this tier, which means the GTX 670 Superclocked is equipped with a generally better performing chip than the average reference card.

GeForce GTX 670 Partner Card Specification Comparison

                         EVGA GTX 670 Superclocked   GeForce GTX 670 (Ref)
CUDA Cores               1344                        1344
Texture Units            112                         112
ROPs                     32                          32
Base Clock               967MHz                      915MHz
Boost Clock              1046MHz                     980MHz
Memory Clock             6210MHz                     6008MHz
Memory Bus Width         256-bit                     256-bit
Frame Buffer             2GB                         2GB
TDP                      170W                        170W
Manufacturing Process    TSMC 28nm                   TSMC 28nm
Width                    Double Slot                 Double Slot
Length                   9.5"                        9.5"
Warranty                 3 Years                     N/A
Price Point              $419                        $399

For the GTX 670 SC, EVGA has given both the core clock and memory clock a moderate boost. The base clock has been increased by 52MHz (6%) to 967MHz and the boost clock by 66MHz (7%) to 1046MHz. Meanwhile the memory clock has been increased by 202MHz (3%) to 6210MHz.
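The overclock percentages above fall out of the spec table directly; a quick sketch of the arithmetic, using the reference and Superclocked clocks from the comparison table:

```python
# Factory overclock deltas, reference GTX 670 vs. EVGA GTX 670 Superclocked.
# Clocks (in MHz) are taken from the spec table; percentages are rounded to
# whole numbers as in the text.
ref = {"base": 915, "boost": 980, "memory": 6008}
sc  = {"base": 967, "boost": 1046, "memory": 6210}

for domain in ref:
    delta = sc[domain] - ref[domain]
    pct = round(100 * delta / ref[domain])
    print(f"{domain}: +{delta}MHz ({pct}%)")
```

This reproduces the +52MHz (6%), +66MHz (7%), and +202MHz (3%) figures quoted above.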

Other than the clockspeed changes, the GTX 670 SC is an almost-reference card, utilizing the reference PCB with a slightly modified cooler. EVGA is fabricating their own shroud, but they’ve copied NVIDIA’s reference shroud down to almost the last detail. The only functional difference is that the diameter of the fan intake is about 5mm smaller; beyond that, EVGA has simply styled the shroud differently than NVIDIA, using rounded corners in place of square ones.

The only other change you’ll notice is that EVGA is using their own high flow bracket in place of NVIDIA’s bracket. The high flow bracket cuts away as much metal as possible, maximizing the area of the vents. Though based on our power and temperature readings, this doesn’t seem to have notably impacted the GTX 670 SC.

While we’re on the matter of customized cards and factory overclocks, it’s worth reiterating NVIDIA’s position on factory overclocked cards. Reference and semi-custom cards (that is, cards using the reference PCB) must adhere to NVIDIA’s power target limits. For GTX 670 this is a 141W power target, with a maximum power target of 122% (170W). Fully custom cards with better power delivery circuitry can go higher, but not semi-custom cards. As a result the flexibility in building semi-custom cards comes down to binning. EVGA can bin better chips and use them in cards such as the Superclocked – our sample can go 17 boost bins over the base clock versus 13 bins for our reference GTX 670 – but at the end of the day, stock performance is at the mercy of what can be accomplished within 141W/170W.
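A rough sketch of the limits just described. Note that the 13MHz boost-bin granularity used here is an assumption (the commonly reported bin size for Kepler), not a figure stated in this article:

```python
# GTX 670 power-target scheme as described above: a 141W default target with
# a maximum power target slider of 122%.
default_target_w = 141
max_target_pct = 122
bin_mhz = 13  # ASSUMED boost-bin size for Kepler, not from the article

max_power_w = default_target_w * max_target_pct / 100
print(f"max power target: {max_power_w:.0f}W")  # ~172W (marketed as 170W)

# Boost headroom: our SC sample reached 17 bins over its base clock,
# versus 13 bins for the reference GTX 670.
for name, base, bins in [("GTX 670 SC sample", 967, 17),
                         ("reference GTX 670", 915, 13)]:
    print(f"{name}: up to {base + bins * bin_mhz}MHz")
```

The point of the exercise is that the power ceiling is fixed for all semi-custom cards; only the binning (how many boost bins a given chip can reach within that ceiling) differs.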

In any case, as the card is otherwise a reference GTX 670, EVGA is relying on the combination of their factory overclock, their toolset, and their strong reputation for support to carry the card. EVGA has priced the card at $419, $20 over the GTX 670 MSRP and in line with other factory overclocked cards.

On the subject of pricing and warranties, since this is the first EVGA card we’ve reviewed since April 1st, this is a good time to go over the recent warranty changes EVGA has made.

Starting April 1st, EVGA has implemented what they’re calling their new Global Warranty Policy. The policy is backdated to July 1st, 2011, meaning all new EVGA cards shipped since then carry at least a 3 year warranty. And for the GTX 600 series specifically, EVGA has so far only offered models with a 3 year warranty in North America, which simplifies their product lineup.

To complement the 3 year warranty and fill the gap left by the discontinued longer-term warranties, EVGA is now directly selling 2 and 7 year warranty extensions, for totals of 5 and 10 years respectively. So instead of buying a card with a 3 year warranty or a longer warranty, you’ll simply buy the 3 year card and then buy a warranty extension to go with it. However the extended warranty requires that the card be registered and the warranty purchased within 30 days.

The second change is that the base 3 year warranty no longer requires product registration. EVGA has other ways to entice buyers into registering, but they’ll now honor all applicable cards for 3 years regardless of registration status. At the same time the base 3 year warranty is now a per-product warranty (i.e. a transferable warranty) rather than a per-user warranty, so the base warranty will transfer to 2nd hand buyers. The extended warranties however will not.

The third change is how EVGA is actually going to handle the warranty process. First and foremost, EVGA is now allowing cards to be sent to the nearest EVGA RMA office rather than the office for the region the card was purchased from. For example a buyer moving from Europe to North America can send the card to EVGA’s North American offices rather than sending it overseas.

Finally, EVGA is now doing free cross shipping, alongside their existing Advanced RMA program. EVGA will now cross-ship replacement cards for free to buyers. The buyer meanwhile is responsible for paying to ship the faulty card back and putting up collateral on the new card until EVGA receives the old card.

There’s also one quick change to the step-up program that will impact some customers. With the move to purchasing extended warranties, the step-up program is only available to customers who either purchase an extended warranty or purchase an older generation card that comes with a lifetime warranty. Step-up is not available to cards with only the base 3 year warranty.

Moving on, along with the new warranty EVGA is bundling the latest version of their GPU utilities, Precision X and OC Scanner X.

Precision X, as we touched upon quickly in our GTX 680 review, is the latest iteration of EVGA’s Precision overclocking & monitoring utility. It’s still based on RivaTuner, and along with adding support for the GTX 600 series features (power targets, framerate caps, etc), it also introduces a new UI. Functionality-wise it’s still at the top of the pack, along with the similarly RivaTuner-powered MSI Afterburner. Personally I’m not a fan of the new UI – circular UIs and sliders aren’t particularly easy to read – but it gets the job done.

Gallery: EVGA X Tools

OC Scanner X has also received a facelift and functionality upgrade of its own. Along with its basic FurMark-ish stress testing and error checking, it now also offers a basic CPU stress test and GPU benchmark.


  • SlyNine - Saturday, May 12, 2012 - link

    No, the 5870 was replaced by the 6970. The 5870 was faster than the 6870.

    The wall was coming, since the 9700pro that needed a power adapter, to videocards that need 2 power adapters and took 2 slots. That was how they got those 2x and even 4x increases. The 9700pro was as much as 6x faster than a 4600 at times.

    But like I said this wall was coming and from now on expect all performance improvements to be based on architecture and node improvements.
    Reply
  • CeriseCogburn - Sunday, May 13, 2012 - link

    My text > " 4890-5870-6970 ???? "

    It was a typo earlier, dippy do.

    9600pro was not 6X faster than a 4600 ever, period - once again we have your spew and nothing else. But below we have the near EQUAL benchmarks.

    http://www.anandtech.com/show/947/20

    http://www.anandtech.com/show/947/22

    6X, 4X, 2X your rear end... another gigantic lie.

    Congrats on lies so big - hey at least your insane amd fanboy imagination and brainwashing of endless lies is being exposed.

    Keep up the good work.
    Reply
  • Iketh - Thursday, May 10, 2012 - link

    do you listen to yourself? you're just as bad as wreckage....

    you have never and will never run a corporation
    Reply
  • CeriseCogburn - Thursday, May 10, 2012 - link

    How can I disagree as obviously you are another internet blogger CEO - one of the many thousands we now have online with corporate business school degrees and endless babbling about profits without a single price cost for a single component of a single video card discussed under your belts.
    It's amazing how many of you tell us who can cut prices and remain profitable - when none of you have even the tiniest inkling of the cost of any component whatsoever, let alone the price it's sold at by nVidia or amd for that matter.
    I'm glad so many of you are astute and learned CEO mind masters, though.
    Reply
  • chizow - Thursday, May 10, 2012 - link

    You really don't need to be an internet blogger or CEO, you don't even need a business degree although it certainly wouldn't hurt (especially in accounting).

    Just a rudimentary understanding of financial statements and you can easily understand Nvidia's business model, then see when and why they are most successful financially by looking at the market landscape and what products were selling and for how much.

    I can tell you right now, Nvidia was at its most profitable during G80 and G92's run of success (6 straight quarters of record profits that have been unmatched since), so we know for a fact what kind of revenues, margins and ASPs for components they can succeed with by looking at historical data.
    Reply
  • CeriseCogburn - Friday, May 11, 2012 - link

    G92's were the most wide ranging selection of various cores hacks, bit width, memory config, etc- and released a enormous amount of different card versions - while this release is a flagship only tier thus far - so they don't relate at all.
    So no, you're stuck in the know exactly nothing spot I claimed you are, no matter what you spew about former releases.
    Worse than that, nVidia profit came from chipset sales and high end cards then - and getting information to show the G80 G92 G92b G94 etc profitability by itself will cost you a lot of money buying industry information.
    So you know nothing again, and tried to use a false equivalency.
    Thanks for trying though, and I certainly won't say you should change your personal stance on pricing of the "mid tier" 680, on the other hand I don't see you making a reasonable historical pricing/ performance/current prices release analysis - you haven't done that, and I've been reading all of your comments of course, and otherwise often agree with you.
    As I've said, the GTX580 was this year $499 - the 7970 released and 2.5 months later we're supposed to see the 580 killer not just at $499, but at $299 as the semi-accurate rumors and purported and unbelievable "insider anonymous information" rumors told us - that $299, since it was so unbelievable if examined at all, has become $399, or maybe $449, or $420, whatever the moaner wants it to be...
    I frankly don't buy any of it - and for good reason - this 680 came in as it did because it's a new core and they stripped it down for power/perf and that's that - and they drove amd pricing down.
    Now they're driving it down further.
    If the 680 hit at $299 like everyone claimed it was going to (bouncing off Charlie D's less than honest cranium and falling back on unquoted and anonymous "industry wide" claimed rumors or a single nVidia slide or posted trash prediction charts proven to be incorrect), then where would the 670 be priced at now ? $250 ?
    I suggest the performance increase along with the massive driver improvement bundle and keeping within the 300watt power requirements means that there is nowhere else to go right now.
    The "secret" "held back" performance is nowhere - the rumored card not here yet is a compute monster - so goodbye power/perf win and the giant PR advantage not to mention the vast body of amd fanboys standing on that alone - something nVidia NEVER planned to lead with this time - the big Kepler.
    It's not that nVidia outperformed itself, it's that their secrecy outperformed all the minds of the rabble - and all that's left is complainers who aren't getting something for nothing or something for half price as they hoped.
    Reply
  • chizow - Thursday, May 10, 2012 - link

    I don't need to run a corporation to understand good and bad business. The fact there are *OUTRAGED* GTX 680 buyers who feel *CHEATED* after seeing the GTX 670 price:performance drives the point home.

    Nvidia really needs to be careful here as they've successfully upset their high-end target market on two fronts:

    1) High-end enthusiasts like myself who are upset they decided to follow AMD's lackluster price:performance curve and market a clearly mid-range ASIC (GK104) as a high-end SKU (GTX 670, 680, 690) and charge high-end premiums for it.

    2) High-end enthusiasts who actually felt the GTX 680 was worthy of its premium price tag, paid the $500 asking price and often, more to get them. Only to see that premium completely eroded by a card that performs within a few % points, yet costs 20% less and is readily available on the market.

    Talk about losing insane value overnight, you don't need to run a business to understand the kind of anger and angst that can cause.
    Reply
  • CeriseCogburn - Friday, May 11, 2012 - link

    Well, the $$ BURN $$ is still less than the $$ BURN $$ the amd flagship cost - $130 + and that's the same card, not a need to be overclocked lower shader cut version.
    So as far as angry dollar burning, yeah, except amd has done worse in dollar costs than nvidia, and with the same card.
    Nice to know, hopefully your theory has a lot fo strong teeth, then the high end buyers can hold back and drive the price down...
    ( seems a dream doesn't it )
    Reply
  • CeriseCogburn - Friday, May 11, 2012 - link

    Let's not forget there rage guy, that 7970 burn of $130+ bucks just turned into a $180 or $200 burn.

    Yet, CURRENTLY, all GTX680 owners can unload for upwards of $500... LOL

    Not so for 7970 owners, they are already perma burned.

    I guess you just didn't think it through, it was more important to share a falsity and rage against nVidia.
    Nice try, you've failed.
    Reply
  • chizow - Sunday, May 13, 2012 - link

    Yes I've said from Day 1 the 7970 was horribly overpriced; it was just an extension of the 40nm price:performance curve 18 months after the fact.

    But that doesn't completely let Nvidia off the hook since they obviously used AMD's weak offering as the launching point to use a mid-range ASIC as their high-end SKU.

    End result is the consumer gets the SMALLEST increase in performance for their money in the last decade of GPUs. I don't understand why this is so hard for you to understand. Look at the benchmarks, do the math and have a seat.
    Reply
