
Why 'Watch Dogs' Is Bad News For AMD Users -- And Potentially The Entire PC Gaming Ecosystem - Forbes



I believe Hallock isn't just offering up lip service here. AMD's "FreeSync" technology aims to improve the working relationship between display and GPU by tapping into the open "Adaptive Sync" specification, which will soon be standard on all DisplayPort-enabled monitors. Nvidia's solution, G-Sync, is proprietary and involves custom hardware built into standard monitors. (I haven't seen FreeSync in action, and I admittedly love what G-Sync offers. But that doesn't change the facts surrounding the technologies.) AMD's Mantle, a low-level API, doesn't require the company's GCN architecture to function properly; AMD says it will work equally well on Nvidia cards. The company clearly waves a banner of open-source development and ideals.
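To make the display/GPU handshake concrete, here is a minimal, hypothetical Python simulation comparing how long a finished frame waits for the screen on a fixed 60 Hz vsync display versus an adaptive-sync display. This is not vendor code; the refresh bounds and frame times are made-up assumptions chosen only to illustrate the idea behind both FreeSync and G-Sync.

```python
# Illustrative sketch (not vendor code): why variable refresh helps.
# Assumes a fixed 60 Hz display vs. an adaptive-sync display that can
# refresh up to 144 Hz. Frame times are made-up values around 20 ms,
# mimicking a GPU that can't quite hold 60 fps.

import random

random.seed(1)
FIXED_INTERVAL = 1000 / 60   # ms between refreshes at 60 Hz
MIN_INTERVAL = 1000 / 144    # adaptive panel can't refresh faster than 144 Hz
# (Panels also have a minimum refresh, e.g. 30 Hz, below which the old
# frame is redrawn; omitted here since simulated frames never get that slow.)

frame_times = [random.uniform(14.0, 24.0) for _ in range(1000)]  # ms per frame

def fixed_refresh_waits(times):
    """With vsync on a fixed display, a finished frame waits for the next
    refresh tick: a 17 ms frame still can't display until the 33 ms tick.
    Simplified: the next frame starts after the previous one displays."""
    clock, waits = 0.0, []
    for t in times:
        clock += t
        next_tick = ((clock // FIXED_INTERVAL) + 1) * FIXED_INTERVAL
        waits.append(next_tick - clock)
        clock = next_tick
    return waits

def adaptive_refresh_waits(times):
    """With adaptive sync, the display refreshes the moment a frame is
    ready, as long as the interval stays within the panel's range."""
    clock, last_refresh, waits = 0.0, 0.0, []
    for t in times:
        clock += t
        earliest = last_refresh + MIN_INTERVAL  # respect the 144 Hz cap
        shown = max(clock, earliest)
        waits.append(shown - clock)
        last_refresh = shown
        clock = shown
    return waits

for name, waits in [("fixed 60 Hz", fixed_refresh_waits(frame_times)),
                    ("adaptive", adaptive_refresh_waits(frame_times))]:
    print(f"{name}: avg wait {sum(waits)/len(waits):.2f} ms, "
          f"worst {max(waits):.2f} ms")
```

In this toy model the fixed display forces every frame to idle until the next vsync tick, while the adaptive display's wait drops to essentially zero; that eliminated wait is the smoothness both technologies are selling.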

With that admittedly verbose background out of the way, let's dig into Watch Dogs specifically. I've been testing it over the weekend on a variety of newer AMD and Nvidia graphics cards, and the results have been simultaneously fascinating and frustrating. It's evident that Watch Dogs is optimized for Nvidia hardware, but it's staggering just how poorly optimized it is on AMD hardware. I guarantee that when the game is released, a swarm of upset gamers will point fingers at AMD for the sub-par performance. Their anger would be misplaced.
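For context on what this kind of testing involves, here is a hedged sketch of per-frame analysis, assuming a FRAPS-style capture file with cumulative millisecond timestamps, one per frame. The file name and column layout are hypothetical; the point is that average FPS alone hides the frame-time spikes players feel as stutter.

```python
# Hedged sketch of per-frame benchmark analysis (file name and column
# layout are hypothetical, modeled on a FRAPS-style "frametimes" CSV:
# a header row, then one cumulative millisecond timestamp per frame).

import csv

def load_frame_times(path):
    """Return per-frame durations in ms from cumulative timestamps."""
    with open(path, newline="") as f:
        stamps = [float(row[1]) for row in csv.reader(f)
                  if row and row[0].strip().isdigit()]  # skip header/blank rows
    return [b - a for a, b in zip(stamps, stamps[1:])]

def summarize(durations_ms):
    durations = sorted(durations_ms)
    avg = sum(durations) / len(durations)
    p99 = durations[int(0.99 * (len(durations) - 1))]
    return {
        "avg_fps": 1000.0 / avg,
        "avg_frame_ms": avg,
        "p99_frame_ms": p99,  # spikes here read as stutter even when avg FPS looks fine
    }

if __name__ == "__main__":
    # Hypothetical capture from one of the test cards.
    stats = summarize(load_frame_times("watchdogs_r9_290x_frametimes.csv"))
    print(stats)
```

A game can post a respectable average FPS on a given card while its 99th-percentile frame time balloons, which is exactly the fascinating-but-frustrating pattern poor optimization tends to produce.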
