This game runs like ass on everything older than the 700 series. That gets brushed off because people assume "well yeah, those cards are old," but it's not really acceptable. Here's why:
Far Cry 4 on low settings looks worse than Far Cry 3 on low settings, and it has straight-up graphical errors on low. Look at a tree in the sun with anti-aliasing off and you'll see what I mean. Why can Far Cry 3 run at Medium-High on a 560 Ti at around 40 to 60fps, while Far Cry 4 can barely manage Low at 30fps? Far Cry 4 is not that graphically superior to its predecessor, especially on PC.
Yet everywhere I go, people talk about how well optimized it is. PCGamingWiki praises it for running well "even on older GPUs." Are people just relieved that it's not as poorly optimized as Assassin's Creed was, and letting it slide because of that? This is the first game where GeForce Experience has recommended I downscale my resolution to get a reasonable framerate (30fps is reasonable for PC now, apparently).
I plan to get a new GPU soon, but this game really dropped the ball on optimization for older hardware. The market share for 500 and 600 series cards (and older) is still pretty damn high, so the fact that it runs this poorly is disappointing. Especially when the minimum specs list a 400 series card, which I seriously doubt can manage a common HD resolution like 720p, let alone 1080p. Do minimum requirements basically mean "you can launch the game and look at colors" now?