Unilad's forum posts

Unilad

" why does it not do its job of presenting the truth to consumers?"

Answer, too much money at stake.

This. I've had to change my video game purchasing style. I remember the days when I used to get excited for a game. Wait for it's release and enjoy it.

Now, I see a trailer/announcement and I know that there is a high chance this game will be overblown/overhyped and poorly produced.

Unilad

@nethlem said:

@unilad said:

@nethlem said:

Please enlighten us about the specs of your PC, otherwise your comment only adds to the pointless and clueless drivel, of which we already have more than enough around here.

My guess: Low vram, no HT on CPU (or AMD CPU) and/or Multi-GPU setup, right?

All this is kind of sad considering how simple it actually is. Affordable, 60 fps, Ultra details: you can have two of those, but not all three at the same time, and even the latter two are never really guaranteed; they're more of a gamble, due to crappy multi-GPU support throughout the industry. Btw, you should blame AMD and Nvidia for that, not game developers: the drivers are responsible for proper support of multi-GPU setups, as it should be. Outsourcing that responsibility to game developers has never worked and never will, because barely any development studio has the budget to get themselves all kinds of high-end multi-GPU setups just to optimize for them. And even if money weren't an issue, supply often ends up being the real problem with cards that have been built in comparatively limited numbers.

Christ, you're a condescending motherfucker.

GTX 770 4GB, 3770K, 16GB RAM, SSD.

I wouldn't complain unless I knew I had grounds to. Fine, I might not be able to play it on all Ultra, but it should STILL run better than this.

Still "run better than this" based on what? You have to realize that you are throwing highly subjective terms around here. How did it run on what settings? Please define "STILL run better" in an way that's actually meaningful, otherwise your are sadly just writing words for the sake of writing words. Did you honestly expect any current gen open world game to easily run on constant 60 fps, while not looking like complete shit?

And yes, I might be condescending, but this whole "drama" is just so hilarious... one side of the web is complaining that Watch Dogs favors Nvidia too much, due to shitty performance on AMD cards, while over here people with Nvidia cards are complaining about "not enough performance".

When in reality the issue simply boils down to people running the game with too high a texture resolution and/or too much anti-aliasing while not having enough vram to support those settings. A lot of the people with "massive performance issues" would simply need to drop texture quality and/or AA one notch to bring the game from "unplayable" to "rock solid". Of course, you can ignore all technical realities and pretend that "the game is badly optimized" after you tried to cram Ultra textures with AA into the 2 GB of vram on your GPU, which leads to the game swapping assets out to the swapfile on the hard drive (which ends up being a super slow HDD for a lot of people) once the vram is full, and that's what kills performance so heavily for a lot of people. Or, to quote Sebastien Viard, the game's Graphics Technical Director:

“Making an open world run on [next-generation] and [current-generation] consoles plus supporting PC is an incredibly complex task.” He goes on to say that Watch Dogs can use 3GB or more of RAM on next-gen consoles for graphics, and that “your PC GPU needs enough Video Ram for Ultra options due to the lack of unified memory.” Indeed, Video RAM requirements are hefty on PC, especially when cranking up the resolution beyond 1080p.

That last part is especially important, even though way too many people believe it doesn't apply to them: "especially when cranking up the resolution beyond 1080p" also applies when you enable anti-aliasing while running at 1080p, because in essence AA is a resolution boost. That's also the reason AA is among the first things you disable or tone down when something runs like shit: it's seriously resource-heavy on the GPU and vram. But too many people refuse to tone down the AA because they've gotten too used to Xbox 360/PS3 era games, where PC tech went heavily unchallenged and people could just "crank it up to the max" without many issues.
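
(To put rough numbers on that, here's a back-of-the-envelope sketch in Python of why MSAA behaves like a resolution boost: it multiplies the per-pixel sample count, so render-target memory grows with it. It only counts a single color and depth buffer, ignoring textures, G-buffers, compression and driver overhead, so the figures are illustrative only, not what Watch Dogs actually allocates.)

    # Rough render-target memory estimate: MSAA stores multiple
    # samples per pixel, so memory scales with the sample count,
    # exactly like rendering at a higher resolution would.
    def render_target_mib(width, height, msaa_samples=1,
                          bytes_color=4, bytes_depth=4):
        samples = width * height * msaa_samples
        return samples * (bytes_color + bytes_depth) / 2**20

    for label, (w, h, aa) in {
        "1080p, no AA":   (1920, 1080, 1),
        "1080p, 4x MSAA": (1920, 1080, 4),
        "1440p, no AA":   (2560, 1440, 1),
    }.items():
        print(f"{label}: ~{render_target_mib(w, h, aa):.0f} MiB")

(At 1080p that's roughly 16 MiB without AA versus 63 MiB with 4x MSAA, before a single texture has been loaded: the same quadrupling you'd get from rendering at four times the resolution.)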

But it's not really that complicated...

Just get Fraps to keep track of your fps and MSI Afterburner to keep track of your computer's resources; once those two are running, you start looking for the settings that match your setup. Is your GPU not at 100% load? Crank up some visual detail! Is your vram filling to the brim? Tone down some of those details (textures, resolution and AA all fill up vram)! Is your CPU actually busy doing something, or is your GPU bottlenecking it? All these things matter for the gaming performance of an individual rig.
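
(If you'd rather have hard numbers than eyeball the overlay, Fraps' benchmark mode logs frame times to CSV, and a few lines of Python can summarize them. A minimal sketch: the file name is hypothetical, and the assumed layout, a header row followed by frame index and a cumulative timestamp in milliseconds, should be checked against your own log.)

    import csv
    import statistics

    # Summarize a Fraps-style frametimes CSV: average fps plus
    # "1% low" fps, the average over the slowest 1% of frames,
    # where stutter shows up long before it moves the average.
    def summarize(path):
        with open(path, newline="") as f:
            reader = csv.reader(f)
            next(reader)  # skip the header row
            stamps = [float(row[1]) for row in reader]
        # Per-frame durations (ms) from consecutive timestamps.
        frames = sorted((b - a for a, b in zip(stamps, stamps[1:])),
                        reverse=True)  # worst frames first
        avg_fps = 1000 / statistics.mean(frames)
        low_fps = 1000 / statistics.mean(frames[:max(1, len(frames) // 100)])
        print(f"avg: {avg_fps:.1f} fps, 1% low: {low_fps:.1f} fps")

    summarize("watch_dogs frametimes.csv")  # hypothetical file name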

I can run this game with a 3-year-old mid-range CPU and a current-gen mid-range AMD GPU on nearly "max settings" while still hitting 60 fps (depending on the scene), so please tell me more about that "poor performance", because I certainly can't witness anything like it over here. Is it the best-optimized game ever? Nope, not really; open world games rarely are. But too many people are making a mountain out of a molehill, because they simply refuse to tone down their visual settings just one notch.

These people won't have much fun in the years to come. For the next 1-2 years they might still be able to blame "bad optimization/console ports", but sooner or later the 8 GB of unified GDDR5 memory inside the PS4 will take its toll on the requirements for future PC versions of these games. It's called progress, and I'm damn happy we're finally having some again.

Or have people already forgotten that even Epic wanted more memory in the next-gen consoles? The last console gen had been heavily bottlenecked by its memory, so in turn nobody really made great use of all the extra memory present in most PCs. This gen isn't about raw computing power (we didn't have super large jumps in that department); it's all about memory size (there we had large jumps, because memory chips have become cheap).

Watch Dogs isn't Crysis.

It isn't groundbreaking. It's a game meant for mass-market appeal, and it should run great on most PCs. You can have a niche game with bad optimisation, but there is NO excuse for creating a game, knowingly marketing it widely, and then selling a shitty unoptimised PC version.

It's a disgrace. I cannot believe people give Ubisoft a break.

Unilad

Fucking up on Rome: Total War 2, though, is just unforgivable. That's a PC-centric game. For fuck's sake.

Haha.

Unilad

@humanity said:

@unilad: Unfortunately it's a problem with the code. I have a similar setup to yours with only a 2GB video card, and a Radeon at that, and the game runs pretty much flawlessly on High when I put the AA down lower. It's a shame since Watch Dogs really is a fun game, but poor performance can ruin that real quick.

It's true, though, that most games from the Ubisoft stable I've played recently have run way worse than they had any right to.

Especially for how much these games cost!

Unilad

@mb said:

@nethlem: I don't agree with your assessment at all. I think games like Watch_Dogs run terribly on PC because Ubisoft built it with consoles in mind and the PC port was bad, not because the new consoles are in any way better than modern, top-end gaming rigs that run Watch Dogs like... dogs. I have a machine that far exceeds the recommended specs for Watch Dogs (for god's sake, my video card is two entire generations newer than the one recommended) and I can't even get a stable 60 fps at a measly 1920x1080. This has nothing to do with the size of textures and everything to do with how poorly it was programmed.

Exactly.

Unilad

You're letting a bunch of bad apples spoil the whole bunch. PC ports, for the most part, have been better than their console counterparts.

WATCH DOGS IS A BIG APPLE. Overhyped or not, that was a genuinely big game that people had been looking forward to for a long time.



#7 Edited By Unilad

@nethlem said:

@unilad said:

I uninstalled that piece of shit.

Ubisoft clearly didn't give a shit about optimising this game, so why should I give two shits about playing it? It's not even a good game. My PC goes above and beyond the system requirements for Ultra.

Please enlighten us about the specs of your PC, otherwise your comment only adds to the pointless and clueless drivel, of which we already have more than enough around here.

My guess: Low vram, no HT on CPU (or AMD CPU) and/or Multi-GPU setup, right?

All this is kind of sad considering how simple it actually is. Affordable, 60 fps, Ultra details: you can have two of those, but not all three at the same time, and even the latter two are never really guaranteed; they're more of a gamble, due to crappy multi-GPU support throughout the industry. Btw, you should blame AMD and Nvidia for that, not game developers: the drivers are responsible for proper support of multi-GPU setups, as it should be. Outsourcing that responsibility to game developers has never worked and never will, because barely any development studio has the budget to get themselves all kinds of high-end multi-GPU setups just to optimize for them. And even if money weren't an issue, supply often ends up being the real problem with cards that have been built in comparatively limited numbers.

GTX 770 4GB, 3770K, 16GB RAM, SSD.

I wouldn't complain unless I knew I had grounds to. Fine, I might not be able to play it on all Ultra, but it should STILL run better than this.

Unilad

I'm becoming a tad tired of PC games shipping poorly optimised. Perhaps I'm in a bad mood (no, I know I'm in a bad mood), but it's fucking ridiculous how these games ship and run like shit.

BF4 - Initially busted as hell. I stuck with it and it got better: patches made the online playable, and now it runs and looks awesome.

Watch Dogs - Uninstalled this piece of shit. So poorly optimised it hurts. Runs like shit, and to be honest, doesn't play much better.

Rome: Total War 2 - don't even get me started. *sigh*

Crysis 3 - ......

Do you agree that PC games should be optimised from day 1? Is it okay that we now expect a day-one (or week/month later) patch to solve these issues? Should this happen?

I'd be interested in your thoughts. Remember how much these damn things cost!

Unilad

I uninstalled that piece of shit.

Ubisoft clearly didn't give a shit about optimising this game, so why should I give two shits about playing it? It's not even a good game. My PC goes above and beyond the system requirements for Ultra.

Unilad

Live in Central London, will definitely come.