
Nethlem's forum posts


#1  Edited By Nethlem

http://en.wikipedia.org/wiki/Glu_Mobile

http://www.glu.com/careers

http://ggnbb.glu.com/forumdisplay.php?162-Kim-Kardashian-Hollywood

The thing that people around here might know Glu for is acquiring GameSpy two years ago and subsequently being responsible for the unilateral shutdown of the service. But really that's not the bulk of what makes Glu worth looking at. It's their history, their size, and particularly their global spread, just looking at the jobs page.

That and out of curiosity I looked at the Kim Kardashian: Hollywood subforum. It's mostly a lot of people asking for help about how to progress or get around an apparent bug. Still, I think after Patrick's fascinating article about the game itself, it might be worth digging in further yourselves for your own benefit.


That jobs page actually paints a pretty negative picture of that company.

This "global spread" you are seeing, it's actually them recruiting cheap creative talent from India, China and Russia (the actual grunt force of developing), while ignoring talent in their own country (the US). Only their "leads" and "managers" are getting recruited from the US or Canada. Kinda like "white supremacy" as an business model..

To put it in one sentence: that company is basically a "freemium sweatshop". They not only exploit their potential customers by building freemium Skinner boxes, they also outsource all the actual work to cheaper countries instead of giving young talent from their own country a chance.

Companies like these are a reason the US economy is actually struggling: they reap huge profits while not offering any meaningful employment.

It's the same situation in many Western countries: most young creative talent is stuck without a job because companies would rather recruit people from developing countries who can do the same work at a fraction of the cost, thanks to the lower cost of living there.

Globalization, ain't it a cool thing?



#2  Edited By Nethlem

@spraynardtatum said:

I definitely agree with him that people are dicks about Kim Kardashian. They totally are. She gets under peoples skin.

Because that's her whole intention! Ever heard the saying "There is no such thing as bad PR"? This especially applies to people like Kim Kardashian.

Her sex tape was not an "exploitation" of her; it was calculated media exposure to boost publicity for her reality show. It's cheap and it's shallow, and people like this exist in pretty much any society that has any kind of mass media.

It's sad that people consider it an actual "feat" to "win" at the "celebrity game" (whatever that is supposed to be), when in reality these people are merely a mirror for how sick most societies have become in modern times. They are famous only for their narcissism; they add nothing to people's lives besides shallow entertainment based on envy, faked companionship, and the ancient principle of "sex sells".

Meanwhile, billions of other people do genuinely good and useful things for others on a daily basis, yet they never get the attention or reward they deserve for acting in such selfless ways.


#3  Edited By Nethlem

@mister_v said:

@nethlem: I'm allowed an opinion dude, no need to be a dick about it.

Sorry, my intention was not to be a dick! :(

I merely tried to point out the differences between those games; I felt obliged to do so after the "gotta play it to get it" part.


@mister_v said:

Hmm.. can't say i'm too bothered. I don't really see the appeal of Evolve. It just seems like a scaled down Natural Selection. Maybe it's one of those games you need to play to "get it"

Will be interesting to see what they do with the extra time.

A scaled-down Natural Selection? Either you don't understand what Evolve is actually about or you never played Natural Selection. These games have pretty much nothing in common, except for sharing the "melee vs. ranged" theme in a multiplayer FPS.

There are no PvE elements in Natural Selection and no RTS elements in Evolve, yet both of these are defining features of their respective games.

For all intents and purposes, Evolve is actually the "more asymmetric" competitive FPS, because its asymmetry goes as far as having different team sizes, while Natural Selection is only asymmetric in terms of each side's abilities, not player numbers.



@colourful_hippie: No performance fix will get "Ultra" textures running at a decent framerate on those 770s with their 2 GB of VRAM. Sorry, but there is nothing Ubisoft could possibly "patch" about you simply not having enough memory.

@unilad: You are right that Watch Dogs is no Crysis, as the games don't even share a genre, but it is a "Crysis" in the sense that it makes use of 3+ GB VRAM setups and CPUs with Hyper-Threading.

What made Crysis so "special" at release was that it basically killed every GPU you threw at it (sound familiar?), and you had better bring a multi-core CPU along for the ride if you wanted to have any decent fun.

The game works well on medium settings on most setups; that's where the "mass market" is actually located on the performance spectrum. But when the "mass market" expects to max out any new game regardless of what a setup can actually do, then the "mass market" has gotten pretty damn stupid and clueless.

@mb said:

@nethlem: I think you're overestimating the impact the PS4's GDDR5 is going to have on games...the new consoles are SOC's. No amount of GDDR5 in the world is going to make up for having a weak SOC GPU that needs to run on low power and be ultra quiet versus a discrete, fully powered graphics card.

Not overestimating at all. By the end of this current console gen (PS4/Xbox One) the average VRAM on gaming GPUs will be around 6+ GB; I'm willing to bet money on that (and will most likely lose, as this console gen could also end up crashing and burning pretty soon). I already quoted Epic, and Sebastien Viard, in the article linked above, also goes into detail about this: it's all about streaming from memory.

An SoC might never be able to compete with the raw computing power of a dedicated GPU, as the SoC has to spend part of its budget on tasks that are handled by a separate CPU in a PC.

But what you ignore is that an SoC removes another bottleneck, the one between CPU and GPU, and that's also why the large unified memory is so important. The PS4 basically preloads all the required assets into memory, without having to move or "compute" them at the moment it needs them.

And while a gaming PC might have 8+ GB of dedicated system memory, it's only DDR3, which is slow compared to GDDR5. In that regard a PC architecture also has many more possible bottlenecks (system RAM -> CPU -> GPU -> VRAM, with all kinds of bridges between them, vs. SoC -> unified RAM/VRAM). That's why it's easy to underestimate how much actual "CPU power" SoCs can deliver while still keeping a lot of performance free for GPU tasks.
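To put rough numbers on that bandwidth gap, here's a back-of-the-envelope sketch (the figures are the commonly quoted theoretical peaks for dual-channel DDR3-1600 and the PS4's GDDR5; real-world throughput is lower on both):

```python
# Rough theoretical peak memory bandwidth comparison (GB/s).
# These are quoted theoretical maximums, not measured throughput.

def bandwidth_gb_s(transfers_per_s, bus_width_bits):
    """Peak bandwidth = transfer rate * bus width in bytes."""
    return transfers_per_s * (bus_width_bits / 8) / 1e9

# Typical gaming PC: dual-channel DDR3-1600 -> 2 x 64-bit channels at 1600 MT/s
ddr3 = bandwidth_gb_s(1600e6, 128)        # ~25.6 GB/s

# PS4: unified GDDR5 on a 256-bit bus at 5500 MT/s effective
ps4_gddr5 = bandwidth_gb_s(5500e6, 256)   # ~176 GB/s

print(f"Dual-channel DDR3-1600: {ddr3:.1f} GB/s")
print(f"PS4 unified GDDR5:      {ps4_gddr5:.1f} GB/s")
print(f"Ratio: ~{ps4_gddr5 / ddr3:.1f}x")
```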

Look, it's not like I'm claiming something unthinkable or never suggested before: the long-lasting 360/PS3 console gen kept PC hardware requirements bottlenecked for quite a while, which is also why PC gaming got especially "cheap" these past years. Games that made full use, out of the box, of the available high-end hardware just for "shiny stuff" have been very few in recent years. Sure, you can always crush your hardware by throwing absurd amounts of anti-aliasing at any game, but that's not really a useful benchmark for the actual performance increases new hardware has delivered over this period.

This new console gen is far more "PC-like" than many think. That's why we get higher PC requirements: the "base console version" is more demanding from the very start, so a proper PC version ends up even more demanding than 360/PS3 ports were (reminder: 256 MB of VRAM). It's the only logical course of things that the "base performance" of gaming PCs has to rise over time until it hits another pseudo-imposed "console ceiling".

@rowr said:

Seriously.

Condescending is one word for it.

Mr fucking know it all is pretty happy to talk all day telling us things we know and that there's obviously a single upgrade path we all should of taken, when the simple matter is like you say, that is should run better than what it is.

If I'm telling you things that "we know", how come you chose a shitty upgrade path? There also isn't "a single upgrade path we all should have taken"; there were simply choices made in the past that turned out to be the wrong ones. Look, I also frequent some hardware-related forums, and over there it's always been the same story in "Need help with my gaming build!" threads: they turn into discussions about how much VRAM a setup needs, and people always skimp on the extra VRAM because "no game ever uses it".

Now is the time when mainstream AAA games actually start using that extra VRAM, without any third-party mods, and people with the cheaper, smaller-VRAM versions get angry at the software for filling their memory too fast, while people with the extra VRAM are happy they can finally fill it with something like ultra textures. It's all kind of ironic.


#6  Edited By Nethlem

@bonbolapti said:

The thing that kind of bothers me, is that Aiden is not really a hacker. Well is he? He’s a fixer right? and what a fixer is… is a hacker that spends more time outside than a typical hacker, who probably just goes to the rainymood website to get their fill of environment. But one day after a tragic accident, he’s started using his fixer powers for good and became the city’s vigilante. ( They call him Fox. And he looks a little bit like Matthew Fox. So good on the fictional people of Chicago for making that connection.)

So here comes the part where I stop believing he is a hacker of any sort. ctOS is a city wide security system for the mystical land of Chicago. It is a singular operating system for EVERYTHING. It’s in your webcams, it’s in the traffic lights… It’s even in your grenades and waterpipes.

It’s “complex system” actually seems overly simplified. And with so much “hackers” in Chicago, I’m more inclined to believe that BLUME (the company behind ctOS) designed it to be open source, and the Apple app store is riddled with hundreds of apps for your phone that let you do anything you want in the city at the click of a button.

Maybe BLUME shouldn’t have attached a NFC to your plumbing.

Man pile.

So Aiden runs around the city staring at nothing but his smartphone to accomplish all of his goals. Hard for cash? There’s 30 people in a 10 foot radius that have all the money that you don’t even need. Push a button. Trying to walk through this door? Ghost Trick your way through security cameras, until you see the box that's locking it. Push a button.

It’s these simple maneuvers that make ctOS incredibly flawed and you really start to wonder why Chicago would even agree to have it run the entire city.

So what exactly is a "hacker" to you, if Aiden doesn't qualify because "he goes out too much"? I believe you are influenced by a bad stereotype here: the basement-dwelling nerd hacker who has no people skills and instead relies on superior technical knowledge. That stereotype is not only dated, it's also horribly wrong and cliché.

Just take a look at the original hacker manifesto: it doesn't describe a skill set or dictate any rules. Instead it describes a certain mindset, that of the curious explorer and tinkerer who is not bound by realities, a mindset that's in many ways very similar to that of gamers.

Being a "hacker" is not all about "hacking code into your keyboard", it's a general tinkerer mindset that extends to everything and even everyone, in essence Conman have always been the original hackers.

That's also the reason many modern "hacks" do not rely on actual coding/IT skills but rather on people-manipulation skills. Far more Xbox Live and Apple accounts have been compromised by "gaming" people, like the account owners and the companies' support hotlines, than through actual technical intrusion into hardware. Why go through all the effort of sniffing out systems, writing tools, and finding vulnerable targets when you can simply talk a person on the support hotline into giving you access to a stranger's account, armed with nothing but the most basic data about that stranger?

Recycling is a great thing in that regard: having all the paper trash in a separate bin concentrates the worthwhile information in a single (not so dirty) place, just waiting for some curious person to sort through it and find all kinds of interesting and potentially compromising information, without a single line of code being written. It's as simple as that; it's just that not many people realize it, because they fail to think outside the "day-to-day routine" box that most of us live in most of the time.

It's something Watch Dogs pays a little homage to when Aiden explains how he got to know Damien: he says that Damien taught him the "code skills" while he taught Damien the "people skills" (or something along those lines).

About the ctOS system being incredibly flawed: I have no doubt that's exactly how it would play out in reality; it's actually how things are already playing out. How many people use baby monitors without being aware of their flaws? And those are not rare freak accidents where some company cheaped out on something; lack of proper security on all things networked is a pretty common theme. Got a smart TV with a camera for Skyping? Well, it may already be sending out what you are doing for everybody to see, and it's only a question of time before anybody with the right kind of knowledge will be able to use that smart TV's camera.

People do not realize how vulnerable they already are, because right now barely anybody bothers to exploit those vulnerabilities and combine them, but when somebody actually tries, the results are usually pretty impressive. Like stalking a whole city:

[embedded video]

Did you know, for example, that even the cheapest SD cards contain a microcontroller? Yes, there is a small "CPU" inside nearly every one of your SD cards!

Not just that: you can actually execute code on those cards or, in essence, have them execute code against any machine they are connected to:

[embedded video]

The world is a crazy place like that, even before accounting for the NSA actively sabotaging privacy and security on the Internet or inventing new ways to reach the world outside of it.

From power plants (even nuclear ones) to banks, police databases, and military systems, we've seen all of them get compromised already, and those were only the cases we actually noticed; nobody really knows the true number of unreported cases. After all, the smartest crimes are the ones that never get noticed, and the same logic applies to compromising systems.

So with a little bit of imagination, the "reality" of Watch Dogs, with its Chicago-wide ctOS system being so "hackable", does not really seem that implausible. In essence it's just a seriously streamlined power fantasy built on things that are already possible; right now they just require a lot more effort and preparation.


All that aside: I agree with quite a few of your points. The "hacking" mostly boils down to a "use at range" button, with mostly entertaining results. Aiden as a character seems inconsistent in quite a few places for the sake of gameplay, and the whole "revenge for the killed niece" arc feels underwhelming due to its presentation and how the player is introduced to the setup.

I still enjoy the game for its looks and action and for just messing around with parts of the city. The Jordi character barely manages to be entertaining for now (I'm not through the story yet), but overall the game feels just a bit too "regular" in a few too many places.


#7  Edited By Nethlem

@unilad said:

@nethlem said:

Please enlighten us about the specs of your PC, otherwise your comment only adds to the point and clueless drivel, of which we already have more than enough around here.

My guess: Low vram, no HT on CPU (or AMD CPU) and/or Multi-GPU setup, right?

All this is kind of sad considering how simple it actually is: Affordable, 60 fps, Ultra details, you can only have two of those, but not all 3 at the same time and even the latter two are never really guaranteed and more of an gamble, due to crappy multi-GPU support throughout the industry. Btw you should blame AMD and NVidia for that and not game developers, the drivers are responsible for proper support of multi-GPU setups, as it should be. Outsourcing that responsibility to game developers has never worked and will never work, because barely any development studio has the budget to get themselves all kinds of high-end multi-GPU setups just to optimize for them, even if money wouldn't be an issue, supply often ends up being the real issue with these cards that have been build in comparable limited numbers.

Christ, you're a condescending motherfucker.

GTX 770 4GB, 3770k, 16GB Ram. SSD.

I wouldn't complain unless I knew I had grounds too. Fine, I might not be able to play it on all ultra, but it should STILL run better than this.

Still "run better than this" based on what? You have to realize that you are throwing highly subjective terms around here. How did it run on what settings? Please define "STILL run better" in an way that's actually meaningful, otherwise your are sadly just writing words for the sake of writing words. Did you honestly expect any current gen open world game to easily run on constant 60 fps, while not looking like complete shit?

And yes, I might be condescending, but this whole "drama" is just so hilarious... one side of the web is complaining that Watch Dogs favors Nvidia too much, due to shitty performance on AMD cards, while over here people with Nvidia cards are complaining about "not enough performance".

In reality the issue simply boils down to people running the game with too high a texture resolution and/or anti-aliasing setting while not having enough VRAM to support it. A lot of the people with "massive performance issues" would simply need to drop texture quality and/or AA one notch to bring the game from "unplayable" to "rock solid". Of course, you can ignore all technical realities and pretend "the game is badly optimized" after you tried to cram ultra textures plus AA into the 2 GB of VRAM on your GPU, which leads to the game spilling assets over to the swap file on the hard drive (a painfully slow HDD for a lot of people) once the VRAM is full, and that is what kills performance so heavily for so many. Or, to quote Sebastien Viard, the game's Graphics Technical Director:

“Making an open world run on [next-generation] and [current-generation] consoles plus supporting PC is an incredibly complex task.” He goes on to say that Watch Dogs can use 3GB or more of RAM on next-gen consoles for graphics, and that “your PC GPU needs enough Video Ram for Ultra options due to the lack of unified memory.” Indeed, Video RAM requirements are hefty on PC, especially when cranking up the resolution beyond 1080p.

That last part is especially important, even though way too many people believe it does not apply to them: "especially when cranking up the resolution beyond 1080p" also applies when you enable anti-aliasing at 1080p; in essence, AA is a resolution boost. That's also why AA is among the first things you disable or tone down when something runs extremely badly; it's seriously heavy on the GPU and VRAM. But too many people refuse to tone down the AA because they've gotten used to Xbox 360/PS3-era games, where current PC tech was largely unchallenged and people could just crank everything to the max without many issues.
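To illustrate why AA behaves like a resolution boost, here is a back-of-the-envelope sketch of raw render-target memory (color plus depth only; it ignores textures, G-buffers, post-processing buffers, and driver overhead, so treat the absolute numbers as loose lower bounds and look at how they scale):

```python
# Back-of-the-envelope render-target memory: 32-bit color + 32-bit depth/stencil,
# i.e. 8 bytes per sample. Textures, G-buffers and driver overhead are ignored,
# so real VRAM usage is far higher; the point is how memory scales with samples.

BYTES_PER_SAMPLE = 4 + 4  # RGBA8 color + D24S8 depth/stencil

def render_target_mb(width, height, samples_per_pixel=1):
    return width * height * samples_per_pixel * BYTES_PER_SAMPLE / (1024 ** 2)

configs = [
    ("1080p, no AA",             1920, 1080, 1),
    ("1080p, 4x MSAA",           1920, 1080, 4),
    ("1080p, 2x2 supersampling", 3840, 2160, 1),  # rendered internally at 4K
    ("Triple 1080p (5760x1080)", 5760, 1080, 1),
]

baseline = render_target_mb(1920, 1080)
for name, w, h, s in configs:
    mb = render_target_mb(w, h, s)
    print(f"{name:28s} ~{mb:5.0f} MB  ({mb / baseline:.1f}x the 1080p baseline)")
```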

But it's not really that complicated...

Just get Fraps to keep track of your fps and MSI Afterburner to keep track of your computer's resources; once those two are running, you start looking for the settings that match your setup. Is your GPU not at 100% load? Crank up some visual detail! Is your VRAM filling to the brim? Tone down some of those details (textures, resolution, and AA fill up VRAM)! Is your CPU actually busy doing something, or is your GPU bottlenecking it? All of these things matter for the gaming performance of an individual rig.
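If you'd rather log those numbers than eyeball an overlay, something like this does the job too (a minimal sketch that assumes an Nvidia card with nvidia-smi on the PATH; AMD users would need a different query tool):

```python
# Minimal GPU load / VRAM / temperature logger using nvidia-smi.
# Assumes an Nvidia GPU with nvidia-smi available on the PATH.
import subprocess
import time

QUERY = [
    "nvidia-smi",
    "--query-gpu=utilization.gpu,memory.used,memory.total,temperature.gpu",
    "--format=csv,noheader,nounits",
]

def sample():
    out = subprocess.check_output(QUERY, text=True).strip()
    first_gpu = out.splitlines()[0]
    util, mem_used, mem_total, temp = (v.strip() for v in first_gpu.split(","))
    return int(util), int(mem_used), int(mem_total), int(temp)

if __name__ == "__main__":
    # Log once per second while the game runs; watch for VRAM sitting at the cap
    # (time to lower textures/AA) or GPU load well below 100% (CPU-bound or capped).
    while True:
        util, used, total, temp = sample()
        print(f"GPU {util:3d}%  VRAM {used}/{total} MiB  {temp}°C")
        time.sleep(1)
```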

I can run this game with a 3-year-old mid-range CPU and a current-gen mid-range AMD GPU on nearly "max" settings while still reaching 60 fps (depending on the scene), so please tell me more about that "poor performance", because I certainly can't witness anything like it over here. Is it the best-optimized game ever? Nope, not really; open-world games rarely are. But too many people are making a mountain out of a molehill because they simply refuse to tone down their visual settings just one notch.

These people won't have much fun in the years to come. For the next 1-2 years they might still be able to blame "bad optimization/console ports", but sooner or later the 8 GB of unified GDDR5 memory inside the PS4 will take its toll on the requirements of future PC versions of these games. It's called progress, and I'm damn happy we are finally having some again.

Or have people already forgotten that even Epic wanted more memory in the next-gen consoles? The last console gen was heavily bottlenecked by its memory, so nobody really made great use of all the extra memory present in most PCs. This gen isn't about raw computing power (we didn't have huge jumps in that department); it's all about memory size (where we did see large jumps, because memory chips have become cheap).


#8  Edited By Nethlem

@unilad said:

I uninstalled that piece of shit.

Ubisoft clearly didn't give a shit about optimising this game, why should I give two shits about playing it. It's not even a good game. My PC goes above and beyond the system requirements for ultra.

Please enlighten us about the specs of your PC; otherwise your comment only adds to the pointless and clueless drivel, of which we already have more than enough around here.
My guess: low VRAM, no Hyper-Threading on the CPU (or an AMD CPU), and/or a multi-GPU setup, right?

All this is kind of sad considering how simple it actually is: affordable, 60 fps, ultra details: you can have two of those, but not all three at the same time, and even the latter two are never really guaranteed and more of a gamble, due to crappy multi-GPU support throughout the industry. By the way, you should blame AMD and Nvidia for that and not game developers; the drivers are responsible for proper multi-GPU support, as they should be. Outsourcing that responsibility to game developers has never worked and never will, because barely any studio has the budget to buy all kinds of high-end multi-GPU setups just to optimize for them, and even if money weren't an issue, supply often is, since these cards are built in comparatively limited numbers.


#9  Edited By Nethlem

@rowr said:

I did my homework, at the time i bought it the single 690 was the better option than two 680's. I don't know how you came to the conclusion its the cheaper option since it would of cost me almost exactly the same amount. I'm pretty sure it beat it out in benchmarks and there were a few aesthetic reasons i forget now such as noise. Obviously there was nothing pushing two gb vram enough that it would be a worry then, and i still don't feel there is anything legitimate that is now.

I'm realistic about the performance i expect to get, especially since im running across triple monitors. So quit making shitty assumptions. I'm not getting my panties in a knot over the fact my machine doesn't destroy this. But this game doesn't look nearly as good to justify any of the performance hit and it's goddam fact across the board that this isn't running as well as expected given what they released as requirements and what the game defaults to.

All this talk about the cpu and ssd bottlenecking or whatever else, i understand your obviously annoyed at people who just throw money at a system and don't know how it works, but in this case it's been narrowed down to the few specific issues with vram usage.

You seem to be intent to make this an issue with peoples expectations to having cheaper rigs and i guess thats a fair assumption to make and for a large percentage that might be true. But the fact is this game has performance issues across all of the newest hardware completely out of line with what you are actually getting thus why these threads actually exist and why websites have gone as far as publishing stories regarding poor performance, and why ubisoft has come out and recognised it as an issue.

Your reply was to a guy asking if he needed to worry about his 690 or upgrade. The answer is fucking no, don't be so fucking ridiculous to argue it and put ideas in someones head that they need anything more than a gtx 690 right at the minute.

You did not do your homework, and you are still not even trying to!

Two 680s with 4 GB (on each card) end up being more expensive than a single 690, because the 4 GB versions of those cards cost a small premium. Sure, if you compare the 690 to two 2 GB 680s, the price ends up being around the same, but 2 GB of VRAM has been a questionable choice for a while, especially for full-HD+ resolutions, multi-monitor setups, and/or anti-aliasing; all of these take a heavy toll on VRAM, even more so if you combine them.

You are in no way being "realistic" here with statements like "You don't need much VRAM!" while running a multi-monitor setup. Previously you even admitted that you didn't know the 690's "4 GB of VRAM" only counts as 2 GB until after you bought the card: bad homework, from which you seem to have learned nothing. One can never have enough VRAM, especially if one plans on making actual use of PC features like 4K resolution, multi-monitor setups, or anti-aliasing, provided the GPU is actually fast enough to make use of that VRAM.

Just because somebody spent a lot of money on that 690 doesn't mean it was the best (i.e., most future-proof) choice. It's not the amount of money that matters but what you spend it on; you don't need to buy the most expensive "high-end" models. The performance tier (one level below that) usually delivers the most bang for the buck while still being pretty future-proof.

It also doesn't matter "what the game defaults to"; if the defaults don't fit, you are free to change the settings and customize how the game looks and runs to your liking. Welcome to PC gaming. It's also not a "fact" that the game "isn't running well across the board"; it simply isn't running well for people who had the wrong expectations of their hardware (like a "4 GB" 690 with perfect SLI scaling, yeah...) and are clueless about hardware in general. I know this sounds offensive to a lot of people who consider themselves "master race PC gamers" because they spent 2000+ bucks on their rigs, but being offensive is not my intention; my intention is to be educational.

Because there is solid proof that a lot of the issues boil down to a lack of VRAM and Hyper-Threading in a ton of setups: Watch Dogs' performance profits immensely from a beefy CPU with HT and a GPU with plenty of VRAM. PC Games Hardware (considered among the most competent outlets when it comes to PC gaming hardware) has a lengthy, in-depth article on Watch Dogs' performance and how different hardware combinations fare. Sadly it's in German, and a lot of the actual conclusions from the benchmarks are written in the text. People often just look at the graphs and interpret them the wrong way ("Uhh, look at the shitty fps even a Titan gets!") while ignoring that the benchmark was run with supersampling enabled, representing the worst-case scenario. Looking at individual graphs without seeing them in their overall context is a pretty weak way to judge the performance of hardware or software.

I've been building my own gaming rigs for close to 20 years now; I do not simply talk out of my ass with the stuff I'm writing here. Hyper-Threading and VRAM are the two things a lot of people cheaped out on when they upgraded their rigs these past years, because back then everybody would say, "No game needs that much VRAM, no game supports HT, save the money, don't buy it!", and a lot of people give this very same advice to this day. But that advice has been wrong (for the past 3 years) and will be even more wrong the more "next-gen" games get released. A lot of them will support HT, leading to nice performance boosts, while also requiring a lot of VRAM for high-res textures.

Just to put this "jump" into a comparable number: The PS3 had 256 MB of vram, the PS4 can, theoretically, go up to 8 GB of vram, this increase in performance now also tickles down on the requirements for the PC versions of games, which is kind of nice considering that PC gaming has become kind of stale these past years. Barely any games had been really "demanding" enough for the hardware, with the exception of running them in Ultra HD resolutions and/or anti-aliasing and in those scenarios a lot of the GPU's end up being simply overwhelmed.

But telling people that 2 GB of VRAM is, right now, in any way, shape, or form "future-proof" is simply lying to them, especially if those people are planning on using "Ultra" texture settings or AA in any newer release.


@rowr: If you want this discussion to be dead serious: yes, the GTX 690 was the "cheaper" choice. If one wanted the "kick-ass gaming rig" where "money doesn't matter", one would have bought two 680s with 4 GB of VRAM each and run them in SLI; such a setup would easily eat Watch Dogs. Dual-GPU cards are a pointless waste of money and power; unless you want to go quadruple or higher SLI/CF there is absolutely no reason to buy these overpriced monstrosities, as they are more about prestige than actual performance or price/performance ratio.

SLI/CF support has always been notoriously dodgy; people who buy these cards (or SLI/CF two single cards) and expect "double the performance" didn't do their homework and thus shouldn't be allowed to waste such obscene amounts of money on hardware.

2 GB cards won't be "worthless" now, but don't expect to run "Ultra" settings or anti-aliasing at HD resolutions with tolerable fps in newer releases. People have gotten too used to just cranking everything to max without caring what their rig is actually capable of, and now that this no longer works, they simply start blaming the software for their own cluelessness. How many of the people complaining actually checked where their hardware is bottlenecking? Is the CPU too slow? The GPU? Does some memory fill up too much? Is the GPU throttling due to thermal issues? What kind of medium is the game running from, SSD or HDD?

Nobody gives a crap or checks for these things, even though it's exactly those things that tell the true story about the game's performance and what's responsible when it performs badly. Knowing them helps you make the right upgrade choices and tweak the right settings to get the game running at desirable framerates with the best possible look.
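For the CPU part of that checklist, here's a quick sketch (it assumes the third-party psutil package; the point is simply that one pegged core while the rest sit idle usually means the game's main thread is the limit, not the CPU as a whole):

```python
# Quick per-core CPU check while the game is running.
# One core pinned near 100% while the others idle usually means the game's main
# thread is the bottleneck; all cores loaded evenly points elsewhere (GPU, I/O).
# Requires the third-party psutil package: pip install psutil
import psutil

def sample_cores(interval=1.0):
    """Return per-core CPU utilization in percent, averaged over `interval` seconds."""
    return psutil.cpu_percent(interval=interval, percpu=True)

if __name__ == "__main__":
    for _ in range(30):  # roughly 30 seconds of samples
        per_core = sample_cores()
        bar = "  ".join(f"{c:5.1f}%" for c in per_core)
        pegged = max(per_core) > 95 and sum(per_core) / len(per_core) < 60
        print(bar + ("  <- one core pegged?" if pegged else ""))
```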

@mb said:

@nethlem: I don't agree with your assessment at all. I think games like Watch_Dogs run terribly on PC because Ubisoft built it with consoles in mind and the PC port was bad - not because the new consoles are in any way better than modern, top end gaming rigs that run Watch Dogs like...dogs. I have a machine that far exceeds the recommended specs for Watch Dogs, I mean for gods sake my video card is two entire generations newer than the one recommended, and I can't even get a stable 60fps at a measly 1920x1080. This has nothing to do with the size of textures and everything to do with how poorly it was programmed.

In no way did I imply that the new consoles are "more powerful than top-end gaming rigs" (even though the PS4 has a certain edge with its 8 GB of unified GDDR5 memory); I merely pointed out that they are far more powerful than the previous-gen Xbox 360/PS3 hardware, especially in terms of memory.

People have gotten too used to the performance ceiling those old consoles imposed on the majority of games. These past 5 years you could basically max out any game with even a modest mid-tier gaming rig; that's the result of the last console gen having been around for so long.

Heck, I used a 1 GB HD 5870 for the past 4 years and got along pretty nicely in most games on mostly high settings with tolerable fps. It's been a week since I upgraded to a 3 GB R9 280X (got one from eBay for 180€), because Titanfall wouldn't run too nicely and I knew Watch Dogs would end up with very high VRAM requirements (like all open-world games). But my HD 5870 would still run these games on medium settings with tolerable fps, a 4-year-old graphics card!

My 3-year-old 2600K at 4.4 GHz will most likely be beefy enough for games throughout this whole console gen; people who opted for lower-end i5s are going to have to get a new CPU somewhere down the line.

A decade ago, a 4-year gap between GPU upgrades would usually mean not being able to play the newest releases at all, and don't get me started on the times when new CPU generations meant quadrupled power every 2 years.

This might sound harsh, but there are way too many clueless people complaining around here; statements like "My graphics card is 2 generations newer than the recommended one!" ooze ignorance about the way PC gaming hardware works. Your graphics card could be twenty generations newer for all I care: if you bought a budget model with low memory bandwidth, small memory size, and shitty clock speeds, that won't help you much when running any game on demanding settings, because these past few generations of graphics cards have mostly been simple rebrandings of old architectures by Nvidia and AMD.

Only a few people around here actually troubleshoot where their stuttering issues are coming from, by using tools like MSI Afterburner to check VRAM usage or simply monitoring temps to make sure nothing gets throttled. And surprise, surprise: those people do not complain about performance issues.

The thing Ubisoft fucked up was the pagefile check: it looks like the game checked the state of the pagefile too often before actually writing to it when memory was full, resulting in even worse performance in situations where performance is already shitty due to full memory. That issue can simply be worked around with the -disablepagefilecheck start parameter. It's also likely that they have a memory leak somewhere, leading to decreased performance when the game runs for extended periods and especially bad performance once memory fills up, but those kinds of memory leaks are pretty common, especially in open-world games with lots of assets and complex systems/interactions.

GTA IV on PC has similarly insane VRAM requirements for that very same reason; if you wanted to use the custom high-res mods, you had better hope you had something around 4 GB of VRAM in your system.


But GTA IV looks like shit compared to Watch Dogs on PC, and even GTA V only has aesthetics over Watch Dogs (the GTA V world is simply built more carefully, with a lot more love for detail). In terms of what's going on under the hood, Watch Dogs is actually all kinds of impressive; the game just doesn't often show it off that well. A lot of the environment is destructible in a very impressive way, but players hardly notice because they either sneak past the action or the action happens so fast (driving) that a lot of the details get lost in the frenzy. For example: in the first hideout you start in, the motel, try getting a 5-star cop rating and watch how the place falls apart from gunfire over time.

Watch Dogs is not a game that looks that impressive in screenshots; it's the moving action with all the particle effects that makes the game look great, on the right settings.
With my above-mentioned setup (R9 280X, 2600K at 4.4 GHz, 8 GB RAM, SSD) I can run the game with Ultra textures, temporal SMAA, and LoD on Ultra, and everything else on high (except water on medium, which is also responsible for a lot of strain on the hardware), at 30-60 fps: mostly 60 on foot and 30 while driving.

The game looks worlds apart from the PS4 version, and the PS4 never even comes close to 60 fps. So much for that.