We met with AMD and, among other things, one item they wanted to show us was the essentially final versions of several upcoming FreeSync displays. Overall, AMD and their partners are still on target to launch FreeSync displays this quarter, with AMD telling us that as many as 11 displays could hit the market before the end of March. For CES, AMD had several displays running, including a 28” 60Hz 4K display from Samsung, a 27” 144Hz QHD display from BenQ, and a 34” 75Hz 2560x1080 display from LG. The three displays were all running on different GPUs: an R9 285 for the BenQ, an R9 290X for the Samsung, and an A10-7850K APU for the LG UltraWide.

More important than the displays and the hardware powering them is the fact that FreeSync worked just as you’d expect. AMD had several demos running, including a tearing test demo with a large vertical block of red moving across the display, and a greatly enhanced version of their earlier windmill demo. We could enable/disable FreeSync and V-SYNC, set the target rendering speed from 40 to 55 Hz in 5Hz increments, or set it to vary (sweep) over time between 40 Hz and 55 Hz. The Samsung display was even able to show the current refresh rate in its OSD, and with FreeSync enabled we could watch the fluctuations, as can be seen here. [Update: Video of demo has been added below.]

Having seen and used G-SYNC, there was nothing particularly new being demonstrated here, but it is proof that AMD’s FreeSync solution is ready, delivering on all of AMD's feature goals, and should be available in the next few months. AMD also took a moment to briefly address the issue of minimum framerates and pixel decay over time, stating that the minimum refresh rate will vary on a per-monitor basis, depending on how quickly that monitor's pixels decay. The most common outcome is that some displays will have a minimum refresh rate of 30Hz (33.3ms), while others with pixels quicker to decay will have a 40Hz (25ms) minimum.
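The millisecond figures in parentheses are simply the frame interval corresponding to each refresh rate. As an illustrative sketch (not AMD's code), the conversion is just the reciprocal:

```python
def hz_to_frame_ms(refresh_hz: float) -> float:
    """Convert a refresh rate in Hz to its frame interval in milliseconds."""
    return 1000.0 / refresh_hz

# The minimum refresh rates AMD quoted:
print(hz_to_frame_ms(30))  # ~33.3 ms between refreshes
print(hz_to_frame_ms(40))  # 25.0 ms between refreshes
```

In other words, a panel whose pixels decay within roughly 33ms can hold a frame long enough to support a 30Hz minimum, while a faster-decaying panel must refresh at least every 25ms, giving a 40Hz floor.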

On the retail front, what remains to be seen is how much more FreeSync displays will cost on average compared to non-FreeSync displays. FreeSync is royalty free, but that doesn’t mean there are no additional costs involved in creating a display that works with FreeSync. Better panels and other components are needed, which will increase the bill of materials (BoM), a cost that will be passed on to consumers.

Perhaps the bigger question though will be how much FreeSync displays end up costing compared to G-SYNC equivalents, as well as whether Intel and others will support the standard. Meanwhile if FreeSync does gain traction, it will also be interesting to see if NVIDIA begins supporting FreeSync, or if they will remain committed to G-SYNC. Anyway, we should start to see shipping hardware in the near future, and we’ll get answers to many of the remaining questions over the coming year.

Comments

  • Gigaplex - Thursday, January 8, 2015 - link

    Amen. I'm hoping G-Sync dies out pretty quickly so we don't have two competing implementations for too long. It's not likely G-Sync will be the better option for interoperability due to NVIDIA's reluctance to license their technology.
  • chizow - Thursday, January 8, 2015 - link

    I always find this to be an interesting dichotomy. AMD fans will frequently parrot the internet meme that the world needs AMD for the sake of competition against the likes of Intel and Nvidia, or we'll all end up somehow perishing under the oppression of $10,000 CPUs and GPUs and that competition is always good and necessary for the consumer.

    But when Nvidia comes out with an innovative new technology that AMD doesn't have in an attempt to differentiate and better their products for their customers, these same AMD fans want them to fail. Because they're competing too hard?

    Just find it a bit ironic. :)
  • mickulty - Thursday, January 8, 2015 - link

    G-Sync is a proprietary standard. Freesync is just a brand name for DisplayPort Adaptive Sync. The difference is Nvidia could easily implement Freesync if they wanted to, but AMD cannot implement G-Sync. Would you rather have a single standard that only one company is allowed to use, or a single standard that everyone can use? :) If there wasn't an open alternative standard you'd have a point, nothing wrong with good technology, but the fact is a world where everyone implements adaptive sync would be a world that's better for everyone, with the possible exception of nvidia's marketing department.

    Also, I wouldn't really compare the GPU world to the CPU world. While I'm sure you'll respond to this with something pointless about the state of the market over the last couple of months, AMD STILL have the fastest single card and have come out with faster cards than Nvidia for years. Nvidia is not like the graphics version of Intel - frankly they do as much to make AMD keep competing as AMD does to make them keep competing.
  • Yojimbo - Thursday, January 8, 2015 - link

    It's DisplayPort because it was given to VESA for free by AMD and VESA accepted the IP. This was probably a matter of NVIDIA being months ahead with G-Sync and controlling over 60% of the discrete desktop graphics chip market share. AMD couldn't compete with NVIDIA on the matter so their best strategy was to be defensive and give it away to take the advantage away from NVIDIA. However as long as you have a current NVIDIA video card, FreeSync won't work with it, so if you want that feature, you'll have to get a G-Sync monitor instead. If FreeSync works just as well as G-Sync it would be nice if NVIDIA would support FreeSync, but I wouldn't count on it. They spent money developing it first and on their own, and I believe it or something similar has already been available for their Quadro cards since 2012, so it would mean supporting both standards at once for a while. Unless they are forced to, they probably don't want to incur extra cost overhead from an innovation they developed because a competitor decided to develop something similar and give it away.
  • haukionkannel - Thursday, January 8, 2015 - link

    Hmmm... Adaptive Sync is from VESA and a much older "thing" than the FreeSync rebranding made by AMD. AMD just gave a new name to already existing technology and did show a new way of using existing technology to achieve a similar effect to what can be done using G-Sync.
    All in all, an open standard is better. That is why DX12 will beat Mantle in the long run (even though DX is not open, it is used by all competitors). That is why Adaptive Sync will win in the long run. Intel will use it to make their IGP look less bad, AMD has to use it because they want to compete with Nvidia. Nvidia will wait and make money as long as they can by selling G-Sync products. When G-Sync stops selling, Nvidia will release new super drivers that allow Nvidia adaptive sync with all their products, because they don't want to use the same name as AMD, who will keep branding the exact same system as FreeSync because it sounds better than Adaptive Sync even though it is exactly the same thing...
  • chizow - Thursday, January 8, 2015 - link

    Actually, according to AMD, Nvidia can't just support FreeSync. In fact, even most of AMD's recent GPUs can't support it. Who knows if Intel can? FreeSync has an even smaller addressable market than G-Sync right now, and this is a fact.

    http://techreport.com/news/25867/amd-could-counter...
    According to AMD's Raja Koduri:
    "The exec's puzzlement over Nvidia's use of external hardware was resolved when I spoke with him again later in the day. His new theory is that the display controller in Nvidia's current GPUs simply can't support variable refresh intervals, hence the need for an external G-Sync unit."
  • medi03 - Thursday, January 8, 2015 - link

    Dear anandtech.com god.
    For the love of reader's sanity, please ban this troll.
  • dcoca - Thursday, January 8, 2015 - link

    Off the bat, I have both a 970 Asus and an r9 290 Asus: subjectively they are the same for gaming, but for OpenCL AMD kicks ass... I've had the Universe Sandbox 2 alpha for the last year and a bit (beta testing it), and the performance of AMD cards is huge when lots of shitz is happening in the simulation. In my experience, AMD is the champ for anything that uses OpenCL that isn't a benchmark... however, the 290 is a reference card that I picked up when it was released, and yes it gets much louder than the 970... both cards are great. On the CPU side I like their idea of the APU and HSA, but it needs to be faster and on par with Intel IPC before I ever go back to them
  • FlushedBubblyJock - Tuesday, February 24, 2015 - link

    Nothing hurts a rude fanboy more than the truth.
    1st amendment, try not to be a book burner.
  • ppi - Thursday, January 8, 2015 - link

    Consider that only a handful of AMD's latest cards support FreeSync.

    Nvidia will not have issues supporting it in their next generation of cards (along with G-Sync). Intel could do the same with Skylake (and mind you, this tech is best for low-performing graphics cards).

    With Samsung and LG backing this technology (and Asus having a non-branded, but probably compatible monitor as well), this is bound to get some serious market share. I mean, it looks to me like the FreeSync monitor offering is going to be wider than G-Sync's, even though that one had a one-year head start.
