AC Unity barely above 30 FPS on a GTX 780

ASilentProtagonist

A user on NeoGAF by the name "Eziocroft" managed to snag this game early on PC, and he ran into some questionable performance. He was barely keeping above 30 FPS maxed out at 1080p with an i7-2600K and a GTX 780, and that's with V-Sync turned off... yes, you read that right: V-Sync is off and it still runs that poorly. Disabling TXAA with the new drivers led to increased performance of 45-50 FPS, which is good, but V-Sync was still off, which could be a big problem for people with more sensitive monitors. The view distance isn't any different than on consoles and suffers from very noticeable pop-in even on PC...
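For context on what those frame rates mean in practice, a frame rate target is really a per-frame time budget. A trivial sketch (nothing game-specific, just the arithmetic behind the numbers in this thread):

```python
def frame_budget_ms(fps):
    """Milliseconds the renderer has to finish each frame at a target rate."""
    return 1000.0 / fps

# At 30 FPS a frame may take ~33.3 ms; at 60 FPS only ~16.7 ms.
# So "45-50 FPS" means each frame is finishing in roughly 20-22 ms.
```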

Link to his forum discussion - http://www.neogaf.com/forum/showthread.php?t=930145


deactivated-64162a4f80e83

That's all well and good, but the game is probably going to get patched and drivers are probably going to be released.


#3  Edited By hassun

Well the game did have obscene system requirements. It would be pretty funny if they were actually completely legit.


#4  Edited By Fear_the_Booboo

The only game my PC cannot run maxed out at 60fps is Watch_Dogs and it does not look the part at ALL.

Unity actually looks super good, so if the optimization is as shitty as it was for Watch_Dogs, no wonder it takes an insane PC.

crithon

Sigh, even AC1 ran horribly at PC launch. Ubisoft has the worst track record with PC releases; we're just used to getting the game on sale 8 months later, when it runs better.


#6  Edited By Evilsbane

I don't think much can be said till actual launch; day-one GPU patches make a huge difference. Take The Evil Within: pre-launch-day patch, it was a mess.

Not saying it's right or that it won't run like crap, but I'll wait till tomorrow.

mike

Why am I not surprised.

korwin

TXAA is a piece of shit, always has been. It absolutely slays performance for very little return. Current versions of SMAA are far more effective, often have better visual quality and cost only a fraction of the overall framerate by comparison.


#9  Edited By Mcfart

Tell him to play it on Xbone/PS4 then. People need to realize that their 4 year old Sandy Bridge PC ain't enough anymore for modern PC Games.

korwin

I will also reiterate what I've mentioned in other threads: I played a near-final build of this game at PAX Aus the weekend before last on Xbox, and the framerate frequently dropped below 30, which means the experience is no more consistent on the console platforms (maybe the PS4 will run it better, since the game has been locked to the same spec as the Xbox).

mike

@mcfart said:

Tell him to play it on Xbone/PS4 then. People need to realize that their 4 year old Sandy Bridge PC ain't enough anymore for modern PC Games.

His i7-2600k and the recommended i7-3770k are essentially identical in terms of gaming performance.

hassun

People are playing it on Alexis Gallisa's twitch.tv channel right now. I'm seeing 30s/40s in combat and high 50s/low 60s in the open world with a GTX 970 at 1080p.

Not sure about the settings though.

Hunkulese

"Edit: I definitely suggest reading a bit more than the OP (my posts in the thread), tl;dr - game runs fine."

SomeJerk

"Maxed out".

ie "duh".


#15  Edited By I_Stay_Puft

I just want to remind folks that AC IV on some beefy machines ran poorly as well on launch. Looks like the main way to play this is on the lowest possible settings or grabbing it on today's gen. consoles

hassun

@hunkulese: Or filter out everything except Durante's posts.

Brendan

It'll be interesting to see how it runs on release day. If it still struggles, I wonder to what extent the touted global illumination has to do with performance.

Memu

It's all in how you say it: "AC Unity stays above 30 FPS even when maxed out!"

I love me some 60 FPS, but I'm willing to give up a few graphics cycles if they go into better AI and gameplay. Just sayin' this doesn't tell the whole story.


#19  Edited By PrivodOtmenit

@mcfart said:

Tell him to play it on Xbone/PS4 then. People need to realize that their 4 year old Sandy Bridge PC ain't enough anymore for modern PC Games.

Except a Sandy Bridge i5 or i7 gets near-identical performance (under 5% difference) to the latest-generation Haswell i5 or i7 when it comes to gaming. Heck, even the generation before Sandy (not sure what they were called) is still up there with the latest generations, and this is still the case on ports of current-gen console games.

If you are going to make such claims, at least get your information right... Assassin's Creed games have always been pretty bad on PC. Ubisoft can make good PC versions (Far Cry 3, Blacklist), but AC has had performance oddities for a long time, if memory serves.

Honestly, it's very hard, basically impossible (so far), for a game to hit the ceiling of a CPU's performance. That's why there are few benchmarks where state-of-the-art CPUs pull ahead of the average PC gamer's model (such as the i5 4670 or i7 4770K). I don't think any game has ever put the amount of demand on a CPU that rendering video or running Folding@home does. All you need is an efficient processor (higher clock speed doesn't equal better, after all) and it will last years; it's the reason the i5 and i7 are so heralded for their gaming prowess: they last a long time and put out great gaming performance.

http://abload.de/img/acu2014-11-1021-27-11tps97.png

http://abload.de/img/acu2014-11-1021-29-115ss0h.png

http://abload.de/img/acu2014-11-1021-29-23dhs4i.png

PC at max settings.

I'm not too impressed with how it looks; the lighting is really nice, but it's masking a lot of rough edges. I feel a bit misled by the E3 showing because it looked much better there.

Edit: You know what, I think the lighting is similar to Mafia 2, which in its own right was a very pretty game.

Hunkulese

I just want to remind folks that AC IV on some beefy machines ran poorly as well on launch. Looks like the main way to play this is on the lowest possible settings or grabbing it on today's gen. consoles

I really don't understand this point of view. If you have a decent PC there's zero reason to get it for PS4. The PS4 is not running the game maxed at 1080p and 60 fps. It's 30 fps and 900p. You don't need a high end PC to beat that.


#21  Edited By conmulligan

@hunkulese said:

If you have a decent PC there's zero reason to get it for PS4.

That's not true at all. I have a high-end PC, and I'm still planning on getting a console version of Unity, mostly because I prefer playing those games on a TV with an audio receiver, and it's a pain in the ass to run HDMI and SPDIF cables from my PC to my media centre. Plus, I'm much more likely to wrangle a couple of friends into doing the co-op stuff on a PS4 or Xbox than I am on PC.

FritzDude

Playing Ubisoft games on release for the PC is like playing a beta or an unfinished game. Five game updates and two new video drivers later, I'm sure it will be a tiny bit better. Even though my Haswell/Maxwell combo could probably run it at an acceptable level as-is, I will never buy their games at release. A shame too, because I enjoy most of their games.

korwin

@i_stay_puft said:

I just want to remind folks that AC IV on some beefy machines ran poorly as well on launch. Looks like the main way to play this is on the lowest possible settings or grabbing it on today's gen. consoles

I really don't understand this point of view. If you have a decent PC there's zero reason to get it for PS4. The PS4 is not running the game maxed at 1080p and 60 fps. It's 30 fps and 900p. You don't need a high end PC to beat that.

Indeed, the logic leap it takes to say "oh it's not immaculate on PC, better off playing a version that runs equal to or worse with lower visual quality" is quite strange.


#24  Edited By AthleticShark

Probably has to do with the fact it isn't released yet. It uses Ubisoft's DRM and online services, so that probably has something to do with it.

mike

Probably has to do with the fact it isn't released yet. It uses the ubisoft DRM and online so that probably has to do with it

If the DRM was the problem (Uplay) then he wouldn't be playing the game at all because it wouldn't be unlocked.


#26  Edited By PrivodOtmenit

@korwin said:

@hunkulese said:

@i_stay_puft said:

I just want to remind folks that AC IV on some beefy machines ran poorly as well on launch. Looks like the main way to play this is on the lowest possible settings or grabbing it on today's gen. consoles

I really don't understand this point of view. If you have a decent PC there's zero reason to get it for PS4. The PS4 is not running the game maxed at 1080p and 60 fps. It's 30 fps and 900p. You don't need a high end PC to beat that.

Indeed, the logic leap it takes to say "oh it's not immaculate on PC, better off playing a version that runs equal to or worse with lower visual quality" is quite strange.

It's a bit weird. If you are happy enough to play PC games at medium settings and 30 fps once the tech advances too far for your machine to do high/60, it will still look as good as the console versions (in most cases), and your PC will last a very long time by accepting that. BF4 on low still looks much better than the last-generation version did, and medium is about the same as PS4.

If you have a GTX 770 now, you can probably make it last 6 years by accepting that eventually you will have to start turning down some settings, and you are still getting an experience equal to the consoles. I made a GTX 480 last almost 4 years easily (until it died this year); it could still run most games on high or ultra at 40-60 fps at 1080p with some small sacrifices, like AA and HBAO turned off, tessellation too of course.

I have a PS4, and even if I couldn't run a game on ultra I'd probably buy it on PC, because it's cheaper and playing on medium or high would still likely look as good if not better (and maybe I'll also squeeze 60 fps and a higher resolution out of it too).

snakeitachi

Wow......

korwin

It also turns out that the guy running the benchmarks is now running 4x MSAA, which is also a high-framerate-impact option. Unfortunately the only other option is FXAA; they've removed the SMAA support they had in 4 (although SMAA in 4 was busted in Windows 8.1 because they used an old version).

deactivated-601df795ee52f

Ubisoft games on PC

lol


#30  Edited By AMyggen


#31  Edited By koobz

It should be noted that the OP in that NeoGAF thread DOESN'T have updated GPU drivers (a new driver specifically optimizing AC:U performance came out today) and DOESN'T have the two release-day patches for the game, both of which also (supposedly) improve PC performance.

If you move deeper into the thread and read posts from people not just reacting to the OP, you find that most people with good-to-great gaming PCs say they get around 50 fps at 1080p with maxed settings once patched, and that it's at least playable at 4K.

Also, it should be noted the PS4 version runs at 900p and maxes out at around 30 fps, usually dipping into the 20s when there's a lot going on. No version of the game seems to have stellar performance.

I don't know if it's true or not, but people are also implying the studio that handled the PC port, Ubi Kiev, is not particularly... great. It could just be a bunch of grumpy troglodytes on a games forum lashing out, but be aware that Ubi Kiev is also handling the PC port of Far Cry 4, so brace yourself if you only buy games to get 60 fps.

ASilentProtagonist

@korwin said:

It also turns out that the guy running the benchmarks is now running 4xMSAA, also a higher framerate impact option. Unfortunately the only other option is FXAA, they've removed SMAA support which they had in 4 (although SMAA in 4 was busted in Windows 8.1 because they used an old version).

What is the difference between MSAA and FXAA? Which looks better and performs better?


#33  Edited By Raven10

Running a game as complex as AC: Unity maxed out at 1080p is no easy feat, not even for a 780. Turn the AA off and just force SMAA into the game using an injector, and his problem will be mostly solved. That's the solution I used for the AC games after Brotherhood, when running MSAA just gave too big a hit to performance. With MSAA at 2x in Revelations, for example, I would regularly fluctuate between 20 and 50 fps. When I switched to injecting SMAA, I got nearly as good aliasing coverage with framerates that never dropped below 50 and ran at 60 fps 90% of the time. If he's still running into problems with MSAA turned off, then he has a right to complain. But MSAA kills framerates in every game using a modern deferred-lighting engine. In the Battlefield games you can as much as double your framerate going from 4x MSAA to no MSAA. If he still isn't getting a solid 60 fps, he can turn off AO as well and his problem will almost certainly be solved.


#34  Edited By monetarydread

Keep in mind the OP mentioned the term "max settings." Most of those settings are not designed for average systems; their only purpose is to give marginal improvements to graphical fidelity for people with extra hardware.

Someone on GAF mentioned that he thinks PC games' "max" settings should be hidden in .ini files to prevent this kind of confusion. I tend to agree with him, because when I look at this forum's reactions to hardware specs, all I can think about is how nobody understands PC gaming at all.

Edit: Durante (the man who fixed Dark Souls on PC) talks about "high" graphics options on the NeoGAF forums.

Judging game performance at "max settings" is enormously counterproductive

Performance at "max" settings, without context and deep understanding what these settings entail, is completely irrelevant for judging the technical quality of a game, and it's highly damaging how often it seems to be used to evaluate the same. I've wanted to make a thread about this for a while, and seeing how there is right now another one on the front page with "max settings" in the title it seems as good a time as ever.

These days, many people seem to judge the "optimization" (a broadly misunderstood term if I ever saw one!) of games on how they run at "max" settings. What does this mean in practise? Let's say I'm porting a game to PC, and I'm trying to decide which options to include. I could easily add the option of rendering shadow depth buffers at 32 bit precision and up to 4096x4096 instead of the 16 bit and 1024² default. But what would this actually cause to happen? Basically, it will improve IQ and image stability, especially at very high resolution. However, let's assume for the sake of argument that it also halves the framerate of my port, when enabled.

In the prevailing simplistic mindset, I just went from a "great, optimized port" to a "piece of shit port showing how my company is disrespectful of PC gamers" merely by adding an option to my game.

I hope everyone can see how fucking insane this is. As a developer aware of this, I basically have 2 options:

    1. Only allow access to higher-end settings via some ini file or other method which is not easily accessible.
    2. Simply don't bother with higher-end settings at all.

The first point wouldn't be too bad, but it seems like the much more rare choice. If the prevailing opinion of my game's technical quality actually goes down by including high-end options, then why bother at all?

Of course, gamers are not to blame for this exclusively. Review sites got into the habit of benchmarking only "max" settings, especially during the latter part of the PS360 generation, simply because GPUs wouldn't be challenged at all in the vast majority of games otherwise.
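Durante's shadow-map example above is easy to put numbers on. A quick sketch (the helper function is hypothetical, just illustrating the arithmetic implied by his quote): a 4096x4096 buffer at 32-bit precision takes 32x the memory of the 1024x1024 16-bit default.

```python
def shadow_buffer_bytes(resolution, bits):
    """Memory for one square shadow depth buffer at the given precision."""
    return resolution * resolution * bits // 8

default_mib = shadow_buffer_bytes(1024, 16) / 2**20  # 2.0 MiB
maxed_mib = shadow_buffer_bytes(4096, 32) / 2**20    # 64.0 MiB
```

Same game, same scene: one settings toggle and the buffer is 32 times larger, which is exactly the kind of "max settings" cost Durante is describing.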


#35  Edited By korwin

@asilentprotagonist said:

@korwin said:

It also turns out that the guy running the benchmarks is now running 4xMSAA, also a higher framerate impact option. Unfortunately the only other option is FXAA, they've removed SMAA support which they had in 4 (although SMAA in 4 was busted in Windows 8.1 because they used an old version).

What is the difference between MSAA and FXAA? What look's better and performances better?

MSAA = Multisample Anti-Aliasing. It was introduced as a higher-performance alternative to supersampling, though it's still reasonably resource-intensive. It's essentially a method of analysing each triangle in a frame and sampling the surrounding pixels (either 2, 4, 8 or even 16 times if you're feeling saucy). Those samples are then used to soften/blend the edges of geometry to reduce visible aliasing; the more samples you take, the better the results, but the higher the performance impact (from both a compute standpoint and a GPU memory standpoint).

The downfall is that standard multisampling doesn't perform passes on things layered on top of meshes (textures, normals) or on shader effects. This can be worked around by introducing transparency MSAA, which lets you perform similar work on things like textures, but it sucks up more grunt. A lot of people still tend to prefer this kind of AA, however, because it's generally believed to provide better image quality.

FXAA = Fast Approximate Anti-Aliasing. FXAA, along with a lot of other modern shader-based solutions (SMAA, MLAA), is a post-processing shader applied to what is essentially the final rendered scene. The shader analyses each frame as it comes in, finds what it determines to be the edges of all objects in the scene, and applies a blur filter to those detected edges. Due to the nature of how this is performed, FXAA is basically "free": the post-process is quick and of minimal complexity, and as it is performed on the fly, its memory footprint is almost non-existent (at most you might lose 1 or 2 fps). Additionally, because the post-process is applied to the whole scene, it has the advantage of being able to detect and anti-alias everything: geometry, textures, normals, speculars and shader effects, without additional overhead.

The downside of FXAA, however, is what most people will identify as a blurry image with a loss of texture detail. Developers can do additional work to reduce the overall loss of quality in things like fine details and text, but it's never perfect. Because of that damage to fine detail, a lot of people prefer leaving FXAA disabled and instead put up with aliasing. However, the quality and advantages of FXAA increase substantially at ultra-high resolutions (4K, for example). At those resolutions (with textures to match) on standard aspect ratios (triple-head doesn't really count here), the sheer size of the image means the actual edges in the scene are painted on a much larger canvas, so the impact to fine details starts to disappear, as it's less likely that small details will be crushed out by the blur filter.

TL;DR - MSAA costs a lot more but looks nicer at more common resolutions; FXAA is quick and covers everything, but it can look like Vaseline on a camera lens at lower/common resolutions.

Additional: SMAA is where it's at. It performs much like FXAA but doesn't take a sledgehammer to your fine details; it's a happy medium between the two.
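The post-process approach described above (find high-contrast edges in the finished frame, then blur only those pixels) can be sketched in a few lines. To be clear, this is a toy approximation with NumPy, not NVIDIA's actual FXAA shader; the contrast threshold and the plain box blur are simplifications of what the real filter does:

```python
import numpy as np

def luma(rgb):
    # Perceptual luminance weights (Rec. 601 approximation)
    return rgb @ np.array([0.299, 0.587, 0.114])

def fxaa_like(img, contrast_threshold=0.1):
    """Toy FXAA-style pass: blur pixels whose local luma contrast is high."""
    lum = luma(img)
    # Neighbour lumas (np.roll wraps at the borders; fine for a demo)
    up = np.roll(lum, -1, axis=0)
    down = np.roll(lum, 1, axis=0)
    left = np.roll(lum, 1, axis=1)
    right = np.roll(lum, -1, axis=1)
    lmin = np.minimum.reduce([lum, up, down, left, right])
    lmax = np.maximum.reduce([lum, up, down, left, right])
    edge = (lmax - lmin) > contrast_threshold  # local contrast test
    # 3x3 box blur, applied only where an edge was detected
    blurred = sum(np.roll(np.roll(img, dy, 0), dx, 1)
                  for dy in (-1, 0, 1) for dx in (-1, 0, 1)) / 9.0
    out = img.copy()
    out[edge] = blurred[edge]
    return out
```

Because the whole thing is one cheap pass over the final image, it softens everything (geometry edges, texture detail, text alike), which is both why FXAA is nearly free and why it blurs fine detail.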

korwin

@monetarydread said:

Keep in mind the OP mentioned the terms, "max settings." Most of those settings are not designed for average systems, their only purpose is to give marginal improvements to graphical fidelity for those people with extra hardware.

[...]

Of course, gamers are not to blame for this exclusively. Review sites got into the habit of benchmarking only "max" settings, especially during the latter part of the PS360 generation, simply because GPUs wouldn't be challenged at all in the vast majority of games otherwise.

This is where Digital Foundry really adds value, in my opinion. They often do a breakdown and look at what the equivalent feature set on PC is to get the same level of fidelity as the lower-powered platforms. A good recent example is where they point out that the ambient occlusion being used on PS4/Xbox is a standard SSAO method versus the PC's ability to use the more accurate HBAO+, along with the fact that, for whatever reason, the texture filtering on those platforms is basically non-existent (which seems crazy to me, since anisotropic filtering costs practically nothing).

Hunkulese

By all accounts it's a pretty solid PC port.

Humanity

At some point you gotta realize that Ubisoft just aren't that good at making PC ports or they don't care - whichever it is, you're better off getting the console version.


#39  Edited By tuxfool

@asilentprotagonist: Alas, the option with the best performance-to-looks ratio, SMAA 2Tx, isn't available. It is strange that they couldn't even include plain SMAA, as it has been in all their games for the past 3-4 years. I also remember them crowing about how amazing their SMAA implementation was in Black Flag.

Bah, SMAA can always be injected, but those temporal filters make a hell of a difference to aliasing flicker in motion.

korwin

@tuxfool said:

@asilentprotagonist: Alas, what has the best performance to looks ratio isn't available, which is SMAA 2Tx. It is strange that they couldn't even include plain SMAA as it has been in all their games for the past 3~4 years. I also remember they were crowing in Black Flag that their SMAA implementation was amazing.

Bah, SMAA can always be injected, but those temporal filters make a hell of a difference to aliasing flicker when in motion.

Black Flag's SMAA implementation was based on an earlier version, which unfortunately meant it didn't work in Windows 8.1 once DX11.2 came in. Odds are it was removed because they didn't want to update it.

doctordonkey

Fine by me, I've gotten used to locking new PC games to 30 fps to avoid fluctuating framerates (besides a select few well-optimised games, like Shadow of Mordor).

tuxfool

@korwin: That is interesting. They could have also cribbed it from Watch_Dogs, which did have the aforementioned impressive SMAA 2Tx as well as regular old SMAA.

onarum

Well, seems like if I give up TXAA and that fancy new NVIDIA shadow thing I'll be able to run it at 1080p/60, so I'm fine with that.


#44  Edited By Hunkulese

@humanity said:

At some point you gotta realize that Ubisoft just aren't that good at making PC ports or they don't care - whichever it is, you're better off getting the console version.

You're a crazy person. Has there been anything that makes this look like a bad or lazy port? There's solid evidence showing otherwise.


#45  Edited By tuxfool

@hunkulese: While I think the PC performance is in line with the console versions (in the sense that it scales up from them), it appears the game isn't very scalable via PC settings: there isn't a massive difference (beyond lighting and AA) between the low setting and the high setting.

Then there are other issues which are globally not so good, such as the insane NPC pop-in and LOD weirdness. These aren't limited to the PC, but in this area a PC should theoretically do better.

We'll see if patches improve this...


#46  Edited By CByrne
hodor

I don't even think it's limited to PC. Ubisoft is just bad at optimization. Didn't someone email Brad about Unity running at 9 fps half a year ago?

deactivated-5f9398c1300c7

I'm going to run it on Medium or Low, then. Problem solved.

mike

I think everyone should take a deep breath and just wait for the game to actually release, get those driver updates out there along with the inevitable day one patch and then see what Digital Foundry has to say about it.


#50  Edited By slyspider

I'm waiting to see what it actually performs like. I expect a hot mess that runs like a college freshman ported it for fun, shit like most Ubi games, but I'm hoping to be surprised. Hoping.