Meet The EVGA GeForce GTX 670 Superclocked

Our second card of the day is EVGA’s GeForce GTX 670 Superclocked, which in EVGA’s hierarchy is their first tier of factory overclocked cards. EVGA is binning GTX 670s and in turn promoting some of them to this tier, which means GTX 670 Superclocked cards are equipped with generally better performing chips than the average reference card.

GeForce GTX 670 Partner Card Specification Comparison
                        EVGA GeForce GTX 670 Superclocked   GeForce GTX 670 (Ref)
CUDA Cores              1344                                1344
Texture Units           112                                 112
ROPs                    32                                  32
Base Clock              967MHz                              915MHz
Boost Clock             1046MHz                             980MHz
Memory Clock            6210MHz                             6008MHz
Memory Bus Width        256-bit                             256-bit
Frame Buffer            2GB                                 2GB
TDP                     170W                                170W
Manufacturing Process   TSMC 28nm                           TSMC 28nm
Width                   Double Slot                         Double Slot
Length                  9.5"                                9.5"
Warranty                3 Years                             N/A
Price Point             $419                                $399

For the GTX 670 SC, EVGA has given both the core clock and memory clock a moderate boost. The base clock has been increased by 52MHz (6%) to 967MHz and the boost clock by 66MHz (7%) to 1046MHz, while the memory clock has been increased by 202MHz (3%) to 6210MHz.
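For those keeping score, the percentages above are simple ratios against the reference clocks. Here’s a quick sketch of that arithmetic in Python, using only the figures from the spec table above; nothing in it is measured:

```python
# Factory overclock deltas for the EVGA GTX 670 Superclocked vs. the reference GTX 670.
# Clock figures are taken from the spec table above.

reference = {"base": 915, "boost": 980, "memory": 6008}      # MHz
superclocked = {"base": 967, "boost": 1046, "memory": 6210}  # MHz

for clock, ref_mhz in reference.items():
    sc_mhz = superclocked[clock]
    delta = sc_mhz - ref_mhz
    percent = delta / ref_mhz * 100
    print(f"{clock:>6}: +{delta}MHz ({percent:.1f}%)")

# Expected output:
#   base: +52MHz (5.7%)
#  boost: +66MHz (6.7%)
# memory: +202MHz (3.4%)
```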

Other than the clockspeed changes, the GTX 670 SC is an almost-reference card, utilizing a reference PCB with a slightly modified cooler. EVGA is fabricating their own shroud, but they’ve copied NVIDIA’s reference shroud down to almost the last detail. The only functional difference is that the diameter of the fan intake is about 5mm smaller; beyond that, EVGA has simply detailed the shroud differently than NVIDIA and used some rounded corners in place of square corners.

The only other change you’ll notice is that EVGA is using their own high flow bracket in place of NVIDIA’s bracket. The high flow bracket cuts away as much metal as possible, maximizing the area of the vents. Though based on our power and temperature readings, this doesn’t seem to have notably impacted the GTX 670 SC.

While we’re on the matter of customized cards and factory overclocks, it’s worth reiterating NVIDIA’s position on factory overclocked cards. Reference and semi-custom cards (that is, cards using the reference PCB) must adhere to NVIDIA’s power target limits. For the GTX 670 this is a 141W power target, with a maximum power target of 122% (170W). Fully custom cards with better power delivery circuitry can go higher, but not semi-custom cards. As a result the flexibility in building semi-custom cards comes down to binning. EVGA can bin better chips and use them in cards such as the Superclocked – our sample can go 17 boost bins over the base clock versus 13 bins for our reference GTX 670 – but at the end of the day stock performance is at the mercy of what can be accomplished within 141W/170W.
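To put those bin counts in perspective, here’s a minimal sketch of how boost bins translate into clockspeeds. It assumes the roughly 13MHz-per-bin granularity NVIDIA has described for Kepler’s GPU Boost; the bin size is an assumption for illustration, not something we’ve measured on these cards.

```python
# Rough illustration of GPU Boost bins, assuming ~13MHz per bin (Kepler's
# approximate granularity). The bin size is an assumption for illustration.

BIN_MHZ = 13  # assumed boost bin granularity

def peak_boost_clock(base_clock_mhz: int, bins_over_base: int) -> int:
    """Highest boost state reachable, given how many bins the card can go over base."""
    return base_clock_mhz + bins_over_base * BIN_MHZ

# Our samples: the EVGA GTX 670 SC reached 17 bins over base, the reference
# GTX 670 reached 13 bins over base.
print(peak_boost_clock(967, 17))  # ~1188MHz for the GTX 670 SC sample, under this assumption
print(peak_boost_clock(915, 13))  # ~1084MHz for the reference sample
```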

In any case, as the card is otherwise a reference GTX 670, EVGA is relying on the combination of their factory overclock, their toolset, and their strong reputation for support to carry the card. EVGA has priced the card at $419, $20 over the GTX 670 MSRP, in line with other factory overclocked cards.

On the subject of pricing and warranties, as this is the first EVGA card we’ve reviewed since April 1st, this is a good time to go over the recent warranty changes EVGA has made.

As of April 1st, EVGA has implemented what they’re calling their new Global Warranty Policy. All new EVGA cards now ship with at least a 3 year warranty, and the policy is being backdated to cards purchased on or after July 1st, 2011. For the GTX 600 series specifically, so far EVGA has only offered models with a 3 year warranty in North America, which simplifies their product lineup.

To complement the 3 year warranty and make up for the lack of longer factory warranties, EVGA is now directly selling 2 and 7 year warranty extensions, for totals of 5 and 10 years respectively. So instead of buying a card that comes with a longer warranty out of the box, you’ll simply buy the 3 year card and then buy a warranty extension to go with it. However the extended warranty requires that the card be registered and the warranty purchased within 30 days.

The second change is that the base 3 year warranty no longer requires product registration. EVGA has other ways to entice buyers into registering, but they’ll now honor all applicable cards for 3 years regardless of registration status. At the same time the base 3 year warranty is now a per-product warranty (i.e. a transferable warranty) rather than a per-user warranty, so the base warranty will transfer to 2nd hand buyers. The extended warranties, however, will not.

The third change is how EVGA is actually going to handle the warranty process. First and foremost, EVGA is now allowing cards to be sent to the nearest EVGA RMA office rather than the office for the region the card was purchased in. For example, a buyer who has moved from Europe to North America can send their card to EVGA’s North American offices rather than sending it overseas.

Finally, EVGA is now doing free cross-shipping alongside their existing Advanced RMA program: EVGA will cross-ship replacement cards to buyers at no charge. The buyer meanwhile is responsible for paying to ship the faulty card back and for putting up collateral on the new card until EVGA receives the old one.

There’s also one quick change to the step-up program that will impact some customers. With the move to purchasing extended warranties, the step-up program is only available to customers who either purchase an extended warranty or purchase an older generation card that comes with a lifetime warranty. Step-up is not available for cards covered only by the base 3 year warranty.

Moving on, along with the new warranty EVGA is bundling the latest versions of their GPU utilities, Precision X and OC Scanner X.

Precision X, as we touched upon quickly in our GTX 680 review, is the latest iteration of EVGA’s Precision overclocking & monitoring utility. It’s still based on RivaTuner, and along with adding support for the GTX 600 series features (power targets, framerate caps, etc.) it also introduces a new UI. Functionality-wise it’s still at the top of the pack along with the similarly RivaTuner-powered MSI Afterburner. Personally I’m not a fan of the new UI – circular UIs and sliders aren’t particularly easy to read – but it gets the job done.
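As a rough illustration of the kind of telemetry a utility like Precision X surfaces, here’s a minimal monitoring sketch using NVIDIA’s NVML bindings for Python (pynvml). This is a generic NVML example for illustration only – it is not Precision X, RivaTuner, or anything EVGA ships – and power readout isn’t exposed on every GPU/driver combination.

```python
# Minimal GPU monitoring sketch using NVIDIA's NVML Python bindings (pynvml).
# Generic NVML example for illustration; not Precision X or RivaTuner.
import time
from pynvml import (
    nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex,
    nvmlDeviceGetClockInfo, nvmlDeviceGetTemperature, nvmlDeviceGetPowerUsage,
    NVML_CLOCK_GRAPHICS, NVML_CLOCK_MEM, NVML_TEMPERATURE_GPU,
)

nvmlInit()
try:
    gpu = nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
    for _ in range(10):                  # sample once a second for 10 seconds
        core = nvmlDeviceGetClockInfo(gpu, NVML_CLOCK_GRAPHICS)     # MHz
        mem = nvmlDeviceGetClockInfo(gpu, NVML_CLOCK_MEM)           # MHz
        temp = nvmlDeviceGetTemperature(gpu, NVML_TEMPERATURE_GPU)  # deg C
        try:
            power = nvmlDeviceGetPowerUsage(gpu) / 1000.0  # mW -> W
        except Exception:
            power = float("nan")  # not all boards expose power via NVML
        print(f"core {core}MHz | mem {mem}MHz | {temp}C | {power:.1f}W")
        time.sleep(1)
finally:
    nvmlShutdown()
```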

Gallery: EVGA X Tools

OC Scanner X has also received a facelift and functionality upgrade of its own. Along with its basic FurMark-ish stress testing and error checking, it now also offers a basic CPU stress test and GPU benchmark.
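For the curious, the CPU side of a tool like OC Scanner X boils down to pinning every core with work while you watch clocks and temperatures. Here’s a toy sketch of that idea in Python; it is purely illustrative and not OC Scanner X’s actual test.

```python
# Toy CPU stress loop for illustration only -- not OC Scanner X's test, just
# the general idea: keep every logical core busy with pointless math.
import math
from multiprocessing import Pool, cpu_count

def burn(_):
    x = 0.0001
    for _ in range(10_000_000):
        x = math.sqrt(x) * math.sin(x) + 1.0
    return x

if __name__ == "__main__":
    with Pool(cpu_count()) as pool:   # one worker per logical core
        pool.map(burn, range(cpu_count()))
```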

Comments

  • CeriseCogburn - Friday, May 11, 2012

    Here we are treated to 5 paragraphs of attack on the 600 series; note the extreme phrasing given against the "known problem" of the GTX cards, not the "inexplicable" results that mean something is wrong other than with the amd card when it loses.

    This contrasts with the bland put downs the 670 compared to the 680 and 570 receive when they win by enormous comparative margins in the rest of the game pages.

    So the reviewer has a field day here:
    " Overall performance isn’t particularly strong either. Given the price tag of the GTX 670 the most useful resolution is likely going to be 2560x1600, where the GTX 670 can’t even cross 30fps at our enthusiast settings."

    Completely unmentioned of course after the jab at pricing just for the 670, same price as the 7950 that fares not playably better here and gets spanked the other 75% of the time, is the 5760x1200 higher resolution where the 670 achieves even higher frame rates than 30, surpassing 30 all the way up to 35.6, just below 35.8 for the 7950, two tenths of one frame.
    Somehow, that isn't mentioned, only the lower 2560 resolution with lower frame rates (for all the cards) but the 670 singled out as the only card that has peaked at "given the price".

    Later in the review completely unplayable frame rates for all cards in a test is used to attack just the 570, too, for lack of memory. Forget the fact that none of the other cards had playable frame rates.

    Eye candy was turned down at the triple monitor resolution but that has never before made 2560 most useful for reviews here, especially with lower frame rates for all the cards tested at the lower resolution settings. Only when we can cut down nVidia is such a statement useful, and it is very definitely confined to just the nVidia card then.
    So avoided is the paltry frames of the other competing cards even at "easier" 5760 settings.
    If the 670 is no good past 2560, then neither are any of the other cards at all, except the 7970? Maybe the reviewer suddenly has decided 5760 gaming is no good.

    " Even 1920x1200 isn’t looking particularly good. This is without a doubt the legitimate lowpoint of the GTX 670. "
    Well, then the 7950 doesn't look good at 1920 either, less than 1 fps difference, not to mention the 680 that is within a couple frames.
    If we take the reviewer's words with their total meaning, what we have is the unsaid statement that - only possibly the 7970 should be used for this game at 5760, no other card though.

    Now - a total examination of the Crysis Warhead gaming page fps charts reveals this:
    Every card is unplayable at every resolution except for the latest respective releases in the 1920x1200 chart.
  • BrunoLogan - Friday, May 11, 2012

    ... still unreachable for me where budget is concerned. The 660Ti is what I'm looking for, but as I saw somewhere it may be 5 or 6 months away and I don't know if I can wait that long. My old C2D needs replacement. I may just grab a 560Ti and later down the road get a 760Ti, skipping the 6xx generation... bittersweet :-\
  • shin0bi272 - Friday, May 11, 2012

    what gpu do you have now? You said you need to upgrade your core 2 cpu but didn't say what you have for a gpu.

    I'm still running a gts 250 and getting pretty good fps on everything but BF3 at pretty high specs on a 19x12 monitor. Your major issue with games today is they are made for consoles with dx9 cards in them that came out in 2006. So with some exceptions (crysis, metro 2033, and bf3 for example) you don't really need a huge card for anything other than playing all the new games at max spec. Sure everyone wants to do that but you don't necessarily NEED to. I played metro 2033 with physx on and it was easily playable in the 30-40 fps range.

    So upgrade your cpu (which btw you really only need to take to a quad core if it's a gaming rig, to get the max fps a cpu upgrade will give you) and keep your current gpu, then when money allows grab a 670 or 685 or whatever AMD has to offer in your price range.
  • BrunoLogan - Friday, May 11, 2012

    Do you really want to know? I have a 9600GT :-P Also, I can't call it an upgrade as in "adding some new parts and keeping some of the existing ones". I'm really buying a new machine, PSU and tower included. That's why I say it's bittersweet to buy a new machine with previous generation graphics.
  • shin0bi272 - Monday, May 14, 2012

    hmmm well see what you have for cash left over after buying the important parts. Honestly buying a new system now is a good idea. Ivy Bridge being released drops the prices of Sandy Bridge (which as I said before will give you the same FPS in game), and even throwing $125 at a 550Ti will be a good jump till the end of summer when the 685 comes out, though the 550 wouldn't give you the best fps so you'd still be wanting to upgrade.
  • shin0bi272 - Monday, May 14, 2012

    oh and a gts250 is a rebadged and die shrunk 8800gtx
  • medi01 - Saturday, May 12, 2012

    Hard to justify buying a 560Ti, unless you somehow decided to only buy nVidia.
    7850 consumes much less power while being ahead performance-wise.
  • CeriseCogburn - Saturday, May 12, 2012

    7850 costs more, and has the massive disadvantage of being plagued with the now featureless in comparison amd crash pack latest, 12.4, to be followed on by another disaster within a month's time.
  • medi01 - Sunday, May 13, 2012

    Why don't you kill yourself, dear nVidia zealot with a lot of time to post utter nonsense?
  • CeriseCogburn - Sunday, May 13, 2012

    LOL - hey man the facts aren't issues to be sad about.

    If I get depressed I'll let you know so you can help. :)
