Been running on an HD 7970 Platinum Deluxe and I get a stable 60fps running mostly on high, and it still looks absolutely breathtaking. I turned off the hair tech as it was hitting my fps badly; plus, it's an Nvidia feature, so I'm guessing it isn't designed for AMD cards. That, and the feature seems pretty pointless to me.
Wild Hunt PC Performance Thread
@machofantastico: It's hitting Nvidia cards pretty hard, too. But HairWorks on monsters makes them look so much better, you really should try it to see what you're missing. On the other hand, I think HairWorks makes Geralt's hair look like a plate of spaghetti with bad physics.
My specs are :
- Intel i7 920 at 3.2 GHz
- Nvidia GTX 770
- 12 GB of RAM
- Installed on a regular 7200 RPM HDD
- Running at 1920x1200
I have most things on Very High/Ultra with bells and whistles on, except for HairWorks, V-sync and anti-aliasing. Performance is pretty good. Framerate is usually in the 40s; it sometimes dips into the 30s but rarely lower. I'm happy with it so far.
Huh that is really interesting, my system:
- Intel i7 920 at 3.2 GHz
- Nvidia GTX 780 FTW 3GB
- 12GB RAM
- On an SSD
- Running at 1920x1080
I have everything on Ultra except vegetation distance set to High and Nvidia HairWorks off; everything else is maxed and I am getting a smooth 60FPS. My 780 is OC'ed, which has to be what is doing it.
I'm getting essentially the same performance as you are:
- i7-4790k at 4.5 on Maximus VII Hero w/ Noctua NH-D14
- Asus GTX 780 Ti 3GB significantly overclocked
- 16 GB DDR3, very low latency and tight timings.
- Samsung Pro EVO SSD, a fast one.
There was a note from an Nvidia employee yesterday in one of their comments sections on Nvidia.com that they were currently looking into performance issues in Wild Hunt on Kepler cards. Good news for all of us with 600 & 700-series discrete Nvidia GPUs.
I wish there was a way to run Nvidia HairWorks on monsters but not on Geralt's hair. Maybe we'll see a patch or mod soon that allows that.
Update: with my setup I'm getting pretty much 60 fps (I'm not using a frame counter, but it's smooth) with most things on medium and a few things on high (like texture quality). Only played 15 minutes, tho. So yeah... things could change.
It looks nice so far. Pretty much what I'd expect with my setup, at least. So if you're not on the cutting edge and don't mind a lack of perfection, it seems to be fine.
@shivermetimbers: Are you using motion blur? That can make guessing one's actual frame rate pretty hit and miss. Why not just use MSI Afterburner and toggle the overlay on and off when you need it, so you know for sure what kind of performance you're getting?
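For anyone who'd rather log numbers than eyeball smoothness: tools like MSI Afterburner or PresentMon can dump per-frame render times to a file, and average FPS plus "1% lows" fall out of that directly. A minimal sketch in Python; the frame times here are made-up sample data, not measurements:

```python
# Average FPS and "1% low" from a list of per-frame render times.
# Frame times are in milliseconds, the unit Afterburner/PresentMon logs use.

def fps_stats(frame_times_ms):
    n = len(frame_times_ms)
    avg_fps = 1000.0 * n / sum(frame_times_ms)
    # "1% low": average FPS over the slowest 1% of frames (at least one frame).
    worst = sorted(frame_times_ms, reverse=True)[:max(1, n // 100)]
    low_1pct_fps = 1000.0 * len(worst) / sum(worst)
    return avg_fps, low_1pct_fps

# Made-up sample: mostly ~16.7 ms frames (60fps) with a few 33 ms hitches.
sample = [16.7] * 97 + [33.3] * 3
avg, low = fps_stats(sample)
print(f"avg: {avg:.1f} fps, 1% low: {low:.1f} fps")  # avg: 58.1 fps, 1% low: 30.0 fps
```

The point of the 1% low is exactly the "it feels smooth" debate above: a run can average near 60 while the occasional hitch drags the worst frames down to console territory.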
I'm kinda tempted to overclock my 980 so I can have HairWorks on but also keep the 60 frames. Not sure if that will bridge the 10-frame gap that the setting creates, though.
Well, the issues are the same as on the AMD cards: tessellation running at a crazy 64x.
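For reference, the workaround that circulated in community tweak guides at the time (the key name and default are taken from those guides, so treat them as assumptions) was to lower HairWorks' dedicated MSAA level in the game's Rendering.ini, and, on AMD, to cap tessellation at 8x or 16x via the Catalyst Control Center override:

```ini
; bin\config\base\Rendering.ini -- back the file up before editing.
; HairWorksAALevel is anti-aliasing applied only to HairWorks geometry;
; guides reported dropping it from the default 8 to 4 (or 0) recovers several fps.
HairWorksAALevel=4
```

This doesn't touch the 64x tessellation itself on Nvidia cards, but it cuts the cost of shading all those hair strands.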
@mb: I might wait next year for a card. I'm not into the idea of spending another wad of cash so soon after blowing 800 bucks on a monitor. It's funny to think that as a PC gamer my video card is the weakest link.
At least I can run The Witcher 3 just fine with 40 Chrome tabs open.
I have to know what monitor you got that cost $800. My $100 monitor from around 3 years ago is still trucking along and looks great except for a couple off colored pixels around the edge. I can't even fathom spending that much on a monitor as opposed to upgrading something more substantial, like getting a couple 970s or something. Anyway, I'm still curious what it might be.
My monitor was $800, I have the Asus ROG Swift PG278Q. 27" G-Sync, 2560x1440, 144Hz hotness. I'm sure $100 1080p monitors look fine until you compare them side by side to a high end monitor such as the type I'm running. Sometimes not knowing is better, though...I used to think I didn't need a good monitor until I saw a ROG Swift in person.
- AMD Phenom II x4 965 Black Edition (3.4GHz)
- AMD Sapphire Radeon 7870XT 2GB GDDR5
- 8GB DDR3 1600MHz
Runs the game great at 50+ fps with water detail on low and everything else on medium to high. Hairworks off, of course; who needs that junk?
@maginnovision: It's great, I have no complaints about it. I've heard the newer Acer XB270HU is even better, though: 1440p/144Hz with G-Sync and an IPS panel.
Had to disable HairWorks since x8 tessellation made him look bald with anything but the default haircut. Also, I haven't tested it yet, but I've been having bad NPC pop-in, and I think it has something to do with tessellation at low settings. Either that or it's the latest patch.
I'm seriously thinking about doing an upgrade, but it seems nothing on the AMD side is running this game perfectly, even at 1080p. Not dropping another 3-5 hundred for that, lol.
I got the Acer XB270HU. It's like the Swift, except it's also IPS. And I certainly thought my cheap monitors looked fine until I saw this one. I got it because I wanted a third monitor, and I figured I could extend the life of my 680 with a monitor that makes sub-60 fps look good and buy enough time for a really substantial card to come out.
I've been out of the country visiting family so I had to make do with my MSI GT72 laptop:
- Intel i7-4710HQ
- 24GB DDR3-1600 ram
- 2x 500GB Crucial MX200 SSDs in RAID 0
- Nvidia GTX 970M
I'm currently running a mixture of high and ultra settings (mostly high) at 1080p with most of the post-processing turned on (including DOF and HBAO+), and I'm able to get around 40-60 fps: usually 50-60 fps in the wilderness and 40-50 fps in towns. It's better performance than I expected, since the GTX 970M is supposed to be below a desktop GTX 770 in terms of raw performance.
Nope, I don't use motion blur. And here's the thing: I can get a fluctuating 60 FPS with everything turned to medium-to-high, OR I can turn everything to high and cap it at 30, which I kinda like better (fluctuating framerates give me a headache, even if it's minor). So it's running basically better than I expected. Oh, and I have AA on because those jaggies kinda give me a headache to look at, but that's about the only thing I have on in the effects menu.
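The "capped 30 feels better than fluctuating 60" preference has a numeric side: an uncapped framerate has far more frame-to-frame variation in frame time than a locked cap, even though its average is higher. A toy illustration with synthetic numbers (not measurements from the game):

```python
import statistics

# Per-frame times in milliseconds for two synthetic scenarios.
fluctuating_60 = [14, 16, 22, 15, 25, 16, 14, 20, 17, 15]  # uncapped, bouncing around
locked_30 = [33.3] * 10                                    # capped at a steady 30fps

def describe(name, times):
    avg_fps = 1000 / statistics.mean(times)
    jitter = statistics.pstdev(times)  # frame-to-frame spread in ms
    print(f"{name}: ~{avg_fps:.0f} fps average, {jitter:.1f} ms jitter")

describe("uncapped", fluctuating_60)   # higher average, visible jitter
describe("capped 30", locked_30)       # lower average, zero jitter
```

The capped run delivers every frame at the same interval, which is exactly the consistency that reads as "smooth" even at half the average framerate.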
I have a question less about the pure performance and more about the look of some of the graphical options, what are people using and liking?
For example, I settled on the low Sharpening setting because I like how it makes things stand out a bit more, but I really dislike and am not using the high Sharpening setting, because it borderlines (ha) on cel-shading-style outlines for everything, making everything look weird to me, even though it costs basically no performance.
Kind of undecided on the AA they use in the game. The jaggies are only visible on far-away objects for me because the general level of detail is on Ultra etc., and if it's on it does make them go away, but at the same time just that little bit of extra blur is not a trade-off I like, so I am leaving it off for now.
That is aside from me turning off every type of blur, motion and/or depth of field, in almost every game ever, where it's possible.
The HairWorks stuff looks really fucking nice on the 'do I am using on Geralt currently (shaved sides + full beard), but it's still a significant resource hog even after 2 patches, taking me from a relatively steady 60fps to a very noticeably unsteady 50fps that's hard to look at because of the fluctuations.
If any gaming laptop users are interested, the 980M performs better than I expected. Really pleased with it; getting a steady (99%-ish) 60fps on a high/ultra mix of settings at 1920x1080. With HairWorks off, obv.
Welp, first crash in 5 hours of playtime. I've heard disabling GeForce Experience and changing to borderless windowed helps, so I might try that.
I'm playing at 4K... and have a few Titan Xs... but the SLI support isn't great. Each card is only being used about 60%. It is enough to get about 60fps with everything maxed, but it should be 150+ if the cards were fully used. Hopefully they fix the SLI support.
In the meantime I'm being a sofa jockey and playing it on the HTPC and 65" Panny plasma. I have 2 Titan Xs there and am running ultra + hair with zero problems. Downside: it is too sunny during the day to play on the TV... but come night... call the dog to my side... and Witcher time!
I hope nobody is buying Titan Xs for this game. Also, the new AMD and Nvidia cards may make them seem even more expensive than they are now, due to the boost in performance.
Would you mind rephrasing that, or perhaps fleshing out what you were trying to say a bit more? I'm sorry, but I just went back and re-read that sentence a few times and I'm still not sure what you are getting at.
@tennmuerti: I have sharpening turned off completely. Put up with it for a day or two but even on Low it was too artificial looking for me.
@maginnovision: I'll try a GTX 780 as a PhysX card tomorrow with my 780 Ti and let you know if there is any kind of improvement.
@maginnovision: the GTX 980 Ti, if it indeed will be called that, looks to be just a Titan X but with less VRAM. So performance will probably be the same. As for the price, well it could be in the $700-750 range (since that was how the 780 Ti was priced). AMD's upcoming cards will likely also be similar in performance and price if the rumors are to be believed. So while the price will likely be a little lower than the Titan X, don't expect any increase in performance. Any significant performance increase is probably gonna only happen with a die shrink.
Before people with SLI throw in a PhysX card, check to see if the cards are being fully used. My SLI setup has low GPU usage, so hair on or off has no impact, because there is headroom left in the cards.
Here is a random video I made with everything turned on... yes... boobs are in there (that wasn't planned).
@rethla: Because the SLI driver is failing to use the power of all the cards (it is easy to check: use Afterburner or Precision and monitor GPU usage), turning on PhysX has no impact on FPS. Because the game isn't using up all the cards' power, there is headroom for them to run PhysX. That said, if you check GPU usage and the cards are maxed out, PhysX will have an impact.
GPU power has to come from some place. If you aren't using SLI (or SLI is working the way it should and maxing out the cards), using a PhysX card would help (if the game will use it).
@zurv: From what I have read, The Witcher 3 uses the CPU for PhysX, and HairWorks is tessellation-based and hence uses the normal graphics render cycles. So disabling HairWorks should increase performance even if you've got headroom, and a dedicated PhysX card won't help with either PhysX or HairWorks.
I'm wondering because I just upgraded to a GTX 980 from a GTX 780 and I'm thinking about using the 780 as a PhysX card, but it's really big, hot and power-hungry, so if it doesn't make any difference it's not worth the hassle.
@mb: re: ROG: *drools* I like my QX2710 fine, especially for $300, but a tiny part of me wishes I'd waited for one of those (think I need a FreeSync model with AMD, though). May try OC'ing it, but apparently it typically tops out at 96Hz, and I'm a little afraid of blowing it out or something.
Intel Core i5-4690K
Asus AMD Radeon R9 290 DirectCU II OC Graphics Card (4GB, GDDR5, PCI Express 3.0)
HyperX FURY Series 16 GB
Gigabyte Z97X-SLI Intel LGA1150 Z97 ATX Motherboard
Corsair Builder Series CXM 750W PSU
I can run the game maxed out (I turn off hair effects) and it runs at 60fps. However, whenever I play any demanding game, the screen will go black with music playing in the background. Keyboard and mouse function fine. I've been trying to fix this problem for days but can't work out why. I put this PC together last week, and all parts are brand new.
RAM, CPU and GPU tests were done and came back fine. I cleared all AMD drivers, then installed fresh. It worked perfectly for one day; after turning the PC on the following day, the issue came back. My only idea now is a faulty PSU. I have a new one due tomorrow.
Any suggestions, guys?
@thesecondagent: I had some wacky problems with sound. I have a fancy AVR which can do 32-bit studio sound. The game does not like that. I changed it (in Windows) to boring 16-bit and the problem went away. Also make sure you are playing in full screen (vs. borderless windowed mode).
Anyway, now I have a slightly overclocked GTX 980 and I'm getting an acceptable framerate with everything turned on, including HairWorks and HBAO+. I'm getting about 60fps with some dips down to 40-50.
Everything maxed except V-sync off, no sharpening and no motion blur, for aesthetic reasons.
i7-4770K @ 3.5GHz, 16GB RAM, GTX 980 @ 1266MHz
@mb: Sure, I gave it a try, but I'm not immediately noticing anything different in either the performance or how it looks. I'm also just fooling around and loosely watching the FPS counter in the corner, so it's not the most precise analysis. With an FPS graph readout I could probably spot an improvement. Since it looks just the same, I'm leaving it at 4; maybe it will even out some of the FPS dips.
Edit: Fun fact, I'm getting slightly lower FPS in The Witcher 2 with übersampling enabled :)
On my setup, I disabled HairWorks and instantly got 10-15 FPS extra. Kind of ridiculous how much of a performance drain that setting is. Now I have an almost constant 60 FPS at 3440x1440 with everything at maximum, apart from turning motion blur off and sharpening to low.
Too bad I suck at this game. The three wraiths you face while escorting the witch while finding Ciri just destroy me. Freaking gas mushrooms. Just have to focus on... everything.
http://www.gog.com/forum/the_witcher_3_wild_hunt/witcher_3_performanceultra_tweaks_for_both_amd_and_nv
So far I feel like a few of these settings stopped the pop-in I started getting after the latest patches; haven't tested extensively yet, though. I didn't do all of these, only the ones relating to texture streaming and card performance.
I wonder if AMD is actually going to release a driver before everybody finishes the game, wtf.
*edit* Nope, still getting crazy character pop-in; seems like it's patch-related. I've set everything back to normal, as editing the texture settings was causing the game to reset my textures to low on every start-up.
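If you're trying tweaks like the ones in that thread, it's worth scripting the backup and edit so a bad value is easy to revert (and easy to spot if the game resets it, as happened above). A sketch in Python: user.settings is ini-like, so configparser mostly handles it, though it may choke on some of the file's quirks. The example path and the TextureMemoryBudget key come from community tweak guides, so treat both as assumptions, not official settings:

```python
import configparser
import shutil

def set_ini_value(path, section, key, value):
    """Back up an ini-style settings file, then set [section] key=value."""
    shutil.copy2(path, path + ".bak")  # keep a restore point for easy revert
    cfg = configparser.ConfigParser()
    cfg.optionxform = str  # don't lowercase keys; the game's file is case-sensitive
    cfg.read(path)
    if not cfg.has_section(section):
        cfg.add_section(section)
    cfg.set(section, key, str(value))
    with open(path, "w") as f:
        cfg.write(f)

# Hypothetical example call (path and key are assumptions from tweak guides):
# set_ini_value(r"C:\Users\you\Documents\The Witcher 3\user.settings",
#               "Rendering", "TextureMemoryBudget", 800)
```

Keeping the .bak copy next to the original means one file copy undoes everything, instead of hunting through the thread for default values.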