Rowr's forum posts

#1 Posted by Rowr (5475 posts) -

Yes, but no, Borderlands 2 sucks. Gearbox are a junk developer.

I could, and probably would, play it off my Shield from my PC if I was interested.

#2 Posted by Rowr (5475 posts) -

I don't understand why it's so hard for them to do wireless on the PC. Is it just laziness?

Gotta sell those accessories yo.

#3 Posted by Rowr (5475 posts) -

I'm not a collectible guy, but that statue looks pretty fucking badass.

I thought someone said there would be a Cyberpunk showing. Did I miss it?

#4 Posted by Rowr (5475 posts) -

Alrighty. As soon as they make a Batarang Xbone controller, I'm in.

Until then, does anyone know anything about when the PS4 controller will get support, or whether that's even worth waiting for?

#5 Posted by Rowr (5475 posts) -

@humanity said:

@dussck said:

@rowr said:

It always feels like Ubisoft have the resources and talent to do some really interesting stuff with their triple-A titles, but they end up going the really safe route in a "this focus-tested better with the masses" kind of way. I've been finishing AC4, and while I think it's pretty good, everything about that game from the puzzles to the combat is so terrified of scaring anyone away that it's completely devoid of any challenge.

Watch Dogs, I feel, is much the same in that practically all the hacking is literally one button press. Surely they could have crafted a system with at least some interesting skill-based mechanic attached. It's too bad, because they often do really well at establishing interesting settings.

Well, in their defense: there are hacking objectives which require you to complete a little puzzle. It's not that difficult most of the time, but some of them can activate a timer, which makes it a lot more exciting (I think when it runs out you get either ctOS or the cops after you).

About Ubisoft taking the 'safe route' with their games: hell yeah. Can't really blame them either; those games are just too big to take a lot of risks.

Some of the "Dead Maus" boss hacking puzzles in the late game are actually kinda challenging. Those grids get pretty big. It was also really interesting to go up against another hacker and see all those tricks you've been using turned against you. Pretty awesome mission and it's funny that they used something from basically the very end of the game to demo at E3.

Ahh, fair enough, yeah, I'm not that far into the game. Made a few assumptions from what I've seen so far.

Getting back to the thread title, I think if they were to make this game without guns, the basic hacking would have to be way more interesting.

#6 Posted by Rowr (5475 posts) -

It always feels like Ubisoft have the resources and talent to do some really interesting stuff with their triple-A titles, but they end up going the really safe route in a "this focus-tested better with the masses" kind of way. I've been finishing AC4, and while I think it's pretty good, everything about that game from the puzzles to the combat is so terrified of scaring anyone away that it's completely devoid of any challenge.

Watch Dogs, I feel, is much the same in that practically all the hacking is literally one button press. Surely they could have crafted a system with at least some interesting skill-based mechanic attached. It's too bad, because they often do really well at establishing interesting settings.

#7 Posted by Rowr (5475 posts) -

@nethlem said:

@colourful_hippie: No performance fix will give you "Ultra textures" running at a decent framerate with those 770s and their 2 GB of vram. Sorry, but there is nothing Ubisoft could possibly "patch" about you simply not having enough memory.

@unilad: You are right that Watch Dogs is no Crysis, as the games don't even share a genre, but it is a "Crysis" in the sense that it makes use of 3+ GB vram setups and HT CPUs.

What made Crysis so "special" at its release was that it basically killed every GPU you threw at it (sound familiar?), and you'd best have brought a multi-core CPU along for the ride if you wanted any decent fun.

The game works well on medium settings on most setups; that's where the "mass market" actually sits on the performance spectrum. But when the "mass market" expects to max out any new game, regardless of what their setup can actually do, then the "mass market" has gotten pretty damn stupid and clueless.

@mb said:

@nethlem: I think you're overestimating the impact the PS4's GDDR5 is going to have on games... the new consoles are SoCs. No amount of GDDR5 in the world is going to make up for having a weak SoC GPU that needs to run on low power and stay ultra quiet versus a discrete, fully powered graphics card.

Not overestimating at all. By the end of this current console gen (PS4/Xbox One) the average vram on gamer GPUs will be around 6+ GB; I'm willing to bet money on that (and will most likely lose, as this console gen could also end up crashing and burning pretty soon). I already quoted Epic, and Sebastien Vierd also goes into detail about this in the article linked above: it's all about streaming from memory.

An SoC might never be able to compete with the raw computing power of a dedicated GPU, as the SoC has to spend some of its performance on tasks that are usually handled by a dedicated CPU in a PC.

But what you're ignoring is that an SoC removes another bottleneck, the one between CPU and GPU; that's also why the large unified memory is so important. The PS4 basically preloads all the required assets into memory, without needing to "compute" them at the exact moment it needs them.

And while a gaming PC might have 8+ GB of dedicated system memory, it's only DDR3, which is pretty slow compared to GDDR5. In that regard a PC architecture also has a lot more potential bottlenecks (system RAM -> CPU -> GPU -> vram, with all kinds of bridges between them, vs SoC -> system RAM/vram). That's why it's easy to underestimate how much actual "CPU power" SoCs can deliver while still keeping plenty of performance free for GPU tasks.
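To put some rough ballpark numbers on that (illustrative figures only, nothing exact): dual-channel DDR3-1600 tops out around 25 GB/s, the PCIe 3.0 x16 link between CPU and GPU around 16 GB/s each way, while the PS4's single unified GDDR5 pool is rated at roughly 176 GB/s shared by everything. The "slow" links in the PC chain are an order of magnitude behind the unified pool.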

Look, it's not like I'm claiming something unthinkable or never suggested before: the long-lasting 360/PS3 console gen kept PC gaming hardware requirements bottlenecked for quite a while, which is also why PC gaming got especially "cheap" these past years. Games that made "full use" of the available high-end hardware out of the box, just for "shiny stuff", have been very few in recent years. Sure, you can always crush your hardware by throwing impossible amounts of anti-aliasing at any game to kill your GPU, but that's not really a useful benchmark for the actual performance increases (in terms of how much better new hardware has actually gotten) we've had these past years.

This new console gen is way more "PC-like" than many think, which is why we in turn get higher PC requirements: the "base console version" will be more demanding from the very start, so an appropriate PC version will end up even more demanding compared to 360/PS3 ports (reminder: 256 MB of vram). It's the only logical course of things that the "base performance" of gaming PCs has to rise over time until it hits another pseudo-imposed "console ceiling".

@rowr said:

Seriously.

Condescending is one word for it.

Mr. fucking know-it-all is pretty happy to talk all day, telling us things we already know and that there's obviously a single upgrade path we all should have taken, when the simple matter is, like you say, that it should run better than it does.

If I'm telling you things that "we know", how come you chose a shitty upgrade path? There also is no "single upgrade path we all should have taken"; there were simply choices made in the past that turned out to be the wrong ones. Look, I also frequent some hardware-related forums, and over there it's always the same story in "Need help with gaming build!" threads: they turn into discussions over how much vram a setup needs, and people always skimp on the extra vram because "no game ever uses it".

Now is the time when mainstream AAA games actually start using that extra vram, without any third-party mods, and people with the cheaper, smaller-vram versions get angry at the software for filling their smaller memory too fast, while the people with the extra vram are happy they can finally fill it up with something like ultra textures. It's all kind of ironic.

Oh my fucking god, will you listen to yourself.

#8 Edited by Rowr (5475 posts) -

Rome was pretty disappointing, as was Company of Heroes 2.

I skipped on BF4 so I don't know.

Otherwise, aside from Watch Dogs, I don't feel like it's been that bad of late, much better than it was a few years ago.

Crysis 3 was poorly optimised though?

Although that game might have been hard on systems, it looks fucking amazing, so it seems fine that it would require some meat to run on higher settings. As opposed to something like Watch Dogs, which looks worse than AC4 but runs like shit.

I mean, take a look at the first Crysis. You wouldn't say it was poorly optimised; it threw everything it could at the time into the game to take advantage of systems in the future.

There is a difference between being poorly optimised and just having heavy system requirements because they threw in some really high-end options for graphics settings.

@extomar said:

Yes, but no one opined "Man, what is with these unoptimized games on the PS3?" did they? Or if they did, it was in the context of the idiotic console wars.

Let's re-frame the issue: why is no one going "Man, what is with unoptimized games on the Xbox One?" The "wave" has been hitting that thing for a while now, but people just seem to accept it on a more expensive and more complex piece of hardware.

So whatever. I fully expect in a couple of months someone is going to ask this again while ignoring the problems with games "looking real rough" on whatever console, and no one will blink. It turns out a game ends up "unoptimized" because it is a complex piece of software.

This is good news, I might head back and check that game out again.

#10 Edited by Rowr (5475 posts) -

@nethlem said:

@rowr: If you want this discussion to be dead serious: yes, the GTX 690 was the "cheaper" choice. If one wanted the "kick-ass gaming rig" where "money doesn't matter", one would have bought two 680s with 4 GB of vram and run them in SLI; such a setup would easily eat Watch Dogs. Multi-GPU cards are a pointless waste of money and power; unless you want to go quadruple or higher SLI/CF there is absolutely no reason at all to buy these overpriced monstrosities. These cards are more about prestige than actual performance or price/performance ratio.

SLI/CF support has always been notoriously dodgy; people who buy these cards (or SLI/CF two single cards) and expect "double the performance" didn't do their homework, and thus shouldn't be allowed to waste such obscene amounts of money on hardware.

2 GB cards won't be "worthless" now, but don't expect to run "Ultra" settings or anti-aliasing at HD resolutions with tolerable FPS in newer releases. People have gotten too used to just cranking everything to max without caring what their rig is actually capable of, and now that this doesn't work anymore they simply start blaming the software for their own cluelessness. How many of the people complaining actually checked where their hardware is bottlenecking? Is the CPU too slow? The GPU? Is any memory filling up too much? Is the GPU throttling due to thermal issues? What kind of medium is the game running from, SSD or HDD?

Nobody gives a crap or checks these things, even though it's exactly those things that tell the true story about the performance of the game and what's responsible for it performing badly. Knowing them helps one make the right upgrade choices and tweak the right settings to get the game running at desirable framerates with the best possible look.
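If anyone actually wants to check, here's a rough sketch of the idea (assuming an Nvidia card with nvidia-smi on the PATH; it just polls the card once a second while the game runs):

    import subprocess, time

    # log GPU load, vram usage and temperature once a second while the game runs
    query = "utilization.gpu,memory.used,memory.total,temperature.gpu"
    while True:
        out = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=" + query, "--format=csv,noheader"]
        ).decode().strip()
        print(out)  # e.g. "99 %, 1980 MiB, 2048 MiB, 83" -> vram nearly full, GPU pegged and hot
        time.sleep(1)

If memory.used sits right at the card's limit while the GPU itself isn't pegged, it's the vram holding you back, not raw GPU power.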

I did my homework; at the time I bought it, the single 690 was a better option than two 680s. I don't know how you came to the conclusion that it's the cheaper option, since it would have cost me almost exactly the same amount. I'm pretty sure it beat them out in benchmarks, and there were a few other reasons I forget now, such as noise. Obviously there was nothing pushing 2 GB of vram hard enough to be a worry then, and I still don't feel there is anything legitimate that is now.

I'm realistic about the performance I expect to get, especially since I'm running across triple monitors, so quit making shitty assumptions. I'm not getting my panties in a knot over the fact my machine doesn't destroy this. But this game doesn't look nearly good enough to justify the performance hit, and it's a goddamn fact across the board that it isn't running as well as expected, given what they released as requirements and what the game defaults to.

All this talk about the CPU and SSD bottlenecking or whatever else: I understand you're obviously annoyed at people who just throw money at a system and don't know how it works, but in this case it's been narrowed down to a few specific issues with vram usage.

You seem intent on making this an issue of people's expectations for their cheaper rigs, and I guess that's a fair assumption to make; for a large percentage it might even be true. But the fact is this game has performance issues across all of the newest hardware, completely out of line with what you're actually getting, which is why these threads exist, why websites have gone as far as publishing stories about the poor performance, and why Ubisoft has come out and recognised it as an issue.

Your reply was to a guy asking if he needed to worry about his 690 or upgrade. The answer is fucking no. Don't be so fucking ridiculous as to argue it and put ideas in someone's head that they need anything more than a GTX 690 right at the minute.