@rowr: If you want this discussion to be dead serious: yes, the GTX 690 would have been the "cheaper" choice. If one wanted the "kick-ass gaming rig" where "money doesn't matter", one would have bought two 680s with 4 GB of VRAM and run them in SLI; such a setup would easily eat Watch Dogs. Multi-GPU cards are a pointless waste of money and power; unless you want to go quadruple or higher SLI/CF, there is absolutely no reason to buy these overpriced monstrosities. These cards are more about prestige than actual performance or price/performance ratio.
SLI/CF support has always been notoriously dodgy. People who buy these cards (or SLI/CF two single cards) and expect "double the performance" didn't do their homework, and thus shouldn't be allowed to waste such obscene amounts of money on hardware.
2 GB cards won't be "worthless" now, but don't expect to run "Ultra" settings or anti-aliasing at any HD resolution with tolerable FPS in any newer releases. People have gotten too used to "just cranking it up to max" without any regard for what their rig is actually capable of; now that this won't work anymore, people simply start blaming software for their own cluelessness. How many of the people complaining actually checked where their hardware is bottlenecking? Is the CPU too slow? The GPU? Does any memory fill up too much? Is the GPU throttling due to thermal issues? What kind of medium is the game running from, SSD or HDD?
Nobody gives a crap or checks for these things, even though it's exactly those things that tell the true story about the performance of the game and what's responsible for it performing badly. Knowing these things helps one make the right upgrade choices and tweak the right settings to get the game running at desirable framerates with the best possible look.
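To make that checklist concrete: the triage logic people skip could be sketched roughly like this. This is purely illustrative, not from any game or tool; the function name and the thresholds are my own made-up assumptions, and the input numbers are whatever you read off a monitor like MSI Afterburner.

```python
# Hypothetical bottleneck triage: feed in utilization numbers read off a
# monitoring tool and get a first guess at what is holding the game back.
# Thresholds are illustrative, not gospel.

def guess_bottleneck(cpu_pct, gpu_pct, vram_used_mb, vram_total_mb, gpu_temp_c):
    if gpu_temp_c >= 90:
        return "thermal throttling"          # GPU downclocks itself when hot
    if vram_used_mb >= 0.95 * vram_total_mb:
        return "vram full"                   # assets spill to system RAM/pagefile
    if cpu_pct >= 90 and gpu_pct < 70:
        return "cpu bound"                   # GPU sits idle waiting on the CPU
    if gpu_pct >= 90:
        return "gpu bound"                   # raw shading power is the limit
    return "inconclusive"                    # check disk, drivers, background load

# Example: a 2 GB card nearly maxed out on "Ultra" textures
print(guess_bottleneck(cpu_pct=55, gpu_pct=80,
                       vram_used_mb=1990, vram_total_mb=2048,
                       gpu_temp_c=74))      # vram full
```

The point isn't the exact numbers; it's that each complaint about "bad performance" maps to a different, checkable cause, and each cause implies a different fix.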
@nethlem: I don't agree with your assessment at all. I think games like Watch_Dogs run terribly on PC because Ubisoft built it with consoles in mind and the PC port was bad, not because the new consoles are in any way better than modern, top-end gaming rigs that run Watch Dogs like... dogs. I have a machine that far exceeds the recommended specs for Watch Dogs; I mean, for God's sake, my video card is two entire generations newer than the one recommended, and I can't even get a stable 60fps at a measly 1920x1080. This has nothing to do with the size of textures and everything to do with how poorly it was programmed.
In no way did I imply that the new consoles are "more powerful than top-end gaming rigs" (even though the PS4 has a certain edge with its 8 GB of unified GDDR5 memory); I merely pointed out that they are way more powerful than the previous-gen hardware, the Xbox 360/PS3, especially in terms of memory.
People have gotten too used to the performance ceiling those old consoles imposed on the majority of games. These past 5 years you could basically max out any game even with a modest mid-tier gaming rig; that's been the result of the last console gen having been around for so long.
Heck, I've used an HD 5870 1GB for the past 4 years and gotten along pretty nicely in most games with mostly high settings and tolerable FPS. It's been a week since I upgraded to an R9 280X 3GB (got one from eBay for 180€), because Titanfall wouldn't run too nicely and I knew Watch Dogs would end up with very high VRAM requirements (like all open-world games). But my HD 5870 would still run these games on medium settings with tolerable FPS, a 4-year-old graphics card!
My 3-year-old 2600K @ 4.4 GHz will most likely be beefy enough for games this whole console gen; people who opted for lower-end i5s are going to have to get a new CPU somewhere down the line.
A decade ago, a 4-year gap between GPU upgrades would usually mean not being able to play the "newest releases" at all, and don't get me started on the times when new CPU generations meant quadrupled power every 2 years.
This might sound harsh, but there are way too many clueless people complaining around here; statements like "My graphics card is 2 generations newer than the recommended one!" ooze with ignorance about the way PC gaming hardware works. Your graphics card could be twenty generations newer for all I care; if you bought a budget model with low memory bandwidth, a small memory size and shitty clock speeds, that won't help you much running any game on demanding settings, because these past few generations of graphics cards have mostly been simple rebrandings of old architectures by Nvidia and AMD.
Only a few people around here actually troubleshoot where their stuttering issues are coming from, by using tools like MSI Afterburner to check VRAM usage or simply monitoring temps to make sure nothing gets throttled down. And surprise, surprise: these people do not complain about performance issues.
The thing Ubisoft fucked up is the pagefile check: it looks like the game checked the state of the pagefile too often before actually writing to it when memory is full, resulting in even worse performance in situations where performance is already shitty due to full memory. That issue can simply be worked around with the -disablepagefilecheck start parameter. It's also likely that they have a memory leak somewhere, leading to decreased performance when the game has been running for extended periods and especially bad performance when memory is filled up; but these kinds of memory leaks are pretty common, especially among open-world games with lots of assets and complex systems/interactions.
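Why a redundant check hurts most when memory is already full is easy to picture. This is a made-up sketch of the general pattern (nothing here is Ubisoft's actual code; all names are hypothetical): re-running an expensive state query before every single write, versus caching the result and refreshing it occasionally.

```python
# Hypothetical sketch: counting how often an expensive state check runs when
# it is done per-write vs. cached. The counter stands in for a slow OS query.

checks = 0

def pagefile_ok():
    global checks
    checks += 1              # each call represents one slow OS query
    return True

def write_naive(n):
    for _ in range(n):
        pagefile_ok()        # checked before EVERY write

def write_cached(n, refresh_every=100):
    ok = pagefile_ok()       # check once up front...
    for i in range(1, n):
        if i % refresh_every == 0:
            ok = pagefile_ok()   # ...and refresh only occasionally

write_naive(1000)
naive_checks, checks = checks, 0
write_cached(1000)
print(naive_checks, checks)  # 1000 10
```

Under memory pressure the game writes constantly, so the naive pattern multiplies the overhead exactly when the frame budget is already blown; that matches the symptom of the stutter being worst when memory fills up.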
GTA IV on PC has similarly insane VRAM requirements for that very same reason; especially if you wanted to use the custom high-res mods, you'd better hope you had at least something around 4 GB of VRAM in your system.
But GTA IV looks like shit compared to Watch Dogs on PC, and even GTA V only has aesthetics over Watch Dogs (the GTA V world is simply built more carefully, with a lot more love for detail). In terms of what's "going on under the hood", Watch Dogs is actually all kinds of impressive; the game just doesn't show it off that well very often. A lot of the environment is destructible in a very impressive way, but players hardly notice it because they just sneak past the action, or the action is happening so fast (driving) that a lot of the details get lost in the frenzy. For example: in the first hideout you start in, the motel, try getting a 5-star cop rating and watch how the place falls apart from gunfire over time.
Watch Dogs is not a game that looks that impressive in screenshots; it's the moving action with all the particle effects that makes the game look great, on the right settings.
With my above-mentioned setup (R9 280X, 2600K @ 4.4 GHz, 8 GB RAM, SSD) I can run the game with Ultra textures, temporal SMAA, Ultra LoD and everything else on high (except water, which is on medium; that setting is also responsible for a lot of strain on the hardware) at 30-60 fps, mostly 60 on foot and 30 while driving.
The game looks worlds apart from the PS4 version, and the PS4 never even comes close to 60 fps, so much for that.