We met with AMD, and among other things, one item they wanted to show us was the essentially final versions of several upcoming FreeSync displays. Overall, AMD and their partners are still on target to launch FreeSync displays this quarter, with AMD telling us that as many as 11 displays could hit the market before the end of March. For CES, AMD had several displays running, including a 28” 60Hz 4K display from Samsung, a 27” 144Hz QHD display from BenQ, and a 34” 75Hz 2560x1080 UltraWide from LG. The three displays were each running on a different GPU: an R9 285 for the BenQ, an R9 290X for the Samsung, and an A10-7850K APU for the LG UltraWide.

More important than the displays and hardware powering them is the fact that FreeSync worked just as you’d expect. AMD had several demos running, including a tearing test demo with a large vertical block of red moving across the display, and a greatly enhanced version of their earlier windmill demo. We could enable/disable FreeSync and V-SYNC, set the target rendering speed from 40 to 55Hz in 5Hz increments, or let it vary (sweep) over time between 40Hz and 55Hz. The Samsung display meanwhile was even able to show the current refresh rate in its OSD, and with FreeSync enabled we could watch the fluctuations, as can be seen here. [Update: Video of demo has been added below.]

Having seen and used G-SYNC, there was nothing particularly new being demonstrated here, but it is proof that AMD’s FreeSync solution is ready and delivering on all of AMD's feature goals, and it should be available in the next few months. Meanwhile AMD also took a moment to briefly address the issue of minimum framerates and pixel decay over time, stating that the minimum supported refresh rate will vary on a per-monitor basis, depending on how quickly that monitor's pixels decay. The most common outcome is that some displays will have a minimum refresh rate of 30Hz (a 33.3ms maximum frame time), while others with pixels that decay more quickly will have a 40Hz (25ms) minimum.
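The refresh-rate floors above map directly to maximum frame-hold times before a panel must self-refresh; a quick sketch of the arithmetic (the function name here is ours, purely for illustration):

```python
# Illustrative only: the relation between a panel's minimum refresh rate
# and the longest a frame can be held before pixels decay noticeably,
# using the 30Hz/40Hz floors mentioned in the article.

def max_hold_time_ms(min_refresh_hz: float) -> float:
    """Longest interval (in ms) between refreshes at a given minimum rate."""
    return 1000.0 / min_refresh_hz

for hz in (30, 40):
    print(f"{hz}Hz minimum -> {max_hold_time_ms(hz):.1f}ms max frame hold")
```

In other words, a panel with faster-decaying pixels must be refreshed more often, so its variable refresh window starts at a higher floor.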

On the retail front, what remains to be seen is just how much more FreeSync displays will cost on average compared to non-FreeSync displays. FreeSync is royalty free, but that doesn’t mean there are no additional costs involved in creating a display that works with FreeSync. Better panels and other components are needed, which will increase the BoM (Bill of Materials), a cost that will be passed on to consumers.

Perhaps the bigger question though will be how much FreeSync displays end up costing compared to G-SYNC equivalents, as well as whether Intel and others will support the standard. Meanwhile if FreeSync does gain traction, it will also be interesting to see if NVIDIA begins supporting FreeSync, or if they will remain committed to G-SYNC. Anyway, we should start to see shipping hardware in the near future, and we’ll get answers to many of the remaining questions over the coming year.


  • eanazag - Thursday, January 8, 2015 - link

    Also, Nvidia has been more successful selling business cards due to the effort they have made, specifically in virtualization features. AMD has been playing catch-up in enterprise video cards for years.

    Previous gen NVidia cards had poor OpenCL performance too.
  • chizow - Thursday, January 8, 2015 - link

    Yay, more nonsense. I guess Nvidia doesn't care about creating new, innovative technologies that didn't pre-exist, simply because everyone would love them more for just waiting for some standards board to invent it years later?

    And almost everyone who writes for CUDA hates it? What kind of nonsense is this? People love and use CUDA because IT WORKS and makes THEIR JOBS, and therefore their LIVES, easier. I administer and support a number of users and machines that use CUDA and Nvidia solutions and at no point have I ever heard any of this gibberish; in fact, they make it a point that the solution MUST be Nvidia CUDA-based for their needs.
  • TheJian - Thursday, January 8, 2015 - link

    Cuda works best because of the ~$7 billion invested in it, ~8 years of building the ecosystem for it, 500+ schools teaching it, every popular app for content creation etc. using it, etc. etc. The reason OpenCL sucks is there is no company backing it like NV backed Cuda for years. You get what you pay for (mostly) and NV paid and paid and paid for Cuda, and they're getting what they paid for now year after year.

    Please show a test where Cuda loses to OpenCL. Find the best OpenCL AMD app and pit it against the best NV Cuda enabled app that does the same thing. Cuda will win pretty much always. There really is a REASON Cuda owns 80% or so of the workstation market. People who are risking money (IE, their business) don't buy crap for tools on purpose. I could hate NV/Cuda and I'd still buy it today due to nothing being better. If you don't, the guy who does do that will blow your doors off until you go bankrupt.

    As long as Cuda is faster, NV has no reason to do anything but develop new versions of cuda. No point in helping the enemy catch up. That would be stupid business and cause me to drop their stock like a hot rock...LOL.
  • eanazag - Thursday, January 8, 2015 - link

    I think Intel may weigh in here and if their integrated graphics support FreeSync (since it is in DisplayPort spec), then Freesync will dominate.

    I don't care for vendor lock-in because I do go back and forth between vendors. I have multiple machines in the house. Currently my house is AMD R9 290, but it was Nvidia GTX 660. Integrated and discrete in laptops. Generally discrete GPUs in laptops I lean towards Nvidia because I don't want to play the Enduro "does it work?" game.
  • ppi - Thursday, January 8, 2015 - link

    They will not abandon G-Sync. But in case there are a lot of FreeSync monitors in the wild (and with Samsung and LG backing, there will be), they can just easily enable FreeSync compatibility on their cards. So they will support both.
  • Antronman - Saturday, January 10, 2015 - link

    But if FreeSync works just as well, why would consumers buy ~$800 monitors when they could get the same from ~$150 monitors?
  • chizow - Sunday, January 11, 2015 - link

    @Antronman,

    Where do you get $150 monitors being the same as $800 monitors? Excluding G-Sync on any panels in the market, do you think you "get the same" on a $150 monitor as a $600 non-G-Sync equivalent?

    I know you're trying to perpetuate the hopeful myth that all monitors will be FreeSync compatible simply because Adaptive Sync is an optional component of the DP1.2a standard, but let's be real here, not all monitors that support DP1.2a are going to be Adaptive/FreeSync compatible.
  • Antronman - Sunday, January 11, 2015 - link

    There's a lot of DP 1.2 monitors that have Adaptive Sync.

    Freesync is software that enables GPUs to utilize it for variable refresh rates. G-sync is hardware that enables GPUs to use it.

    Haven't you read the articles? Freesync is a standard feature of DP 1.2a and onwards. Every DP 1.2a and onwards monitor will have it.
  • chizow - Monday, January 12, 2015 - link

    No, there's not a single DP 1.2 monitor that supports Adaptive Sync or FreeSync, or more specifically, variable refresh and dynamic frame rates. No firmware or software update on the planet is going to replace the missing hardware necessary to make FreeSync a reality.

    Haven't you read the articles? DP1.2a was ratified in May 2014 and is an OPTIONAL part of the standard that is not backward compatible with existing displays because that OPTIONAL functionality requires new scaler ASICs that were not developed until after the spec was ratified, with news of them sometime in September 2014, with pre-production samples finally being displayed this week at CES.

    http://www.amd.com/en-us/press-releases/Pages/supp...

    I don't blame you for being confused on this though, I completely understand all the noise regarding FreeSync's development has been very misleading, which is why I think most companies choose to develop the tech and begin production in working devices before making a ton of unsubstantiated claims about it. :D
  • FlushedBubblyJock - Tuesday, February 24, 2015 - link

    We will hear for YEARS the prior falsehoods the AMD fanatics just spewed in comments over the last 2 pages, and it will become their always proclaimed belief system, despite the facts.
    The chip on their shoulders will grow, they will never have any valid counterargument, and of course the rest of us will have to put up with it.
    When freesync doesn't pan out properly, is less effective with less flexible features, has problems working with many games, and requires hacks and has driver issues with AMD, the same crowd will blame nVidia for not "supporting it" and "causing AMD problems".
    They will then deeply desire a huge lawsuit and a payment from nVidia directly to AMD, while having exactly the OPPOSITE stance when it comes to AMD's Mantle.
