Worth buying an expensive video card? (long-term)

#1 Posted by Caustic_Fox (112 posts) -

I was just thinking the other day about whether it's really worth buying an expensive graphics card for long-term use (2-3 years). Theoretically speaking, let's say I decide to buy an nVIDIA GTX 690 tomorrow so I can play all of my games at rather high settings from now until a few years out. Since technology is changing every day at an alarming rate, more advanced features will keep being implemented as we move forward.

This card runs on a DirectX 11 hardware spec. How many years do you think it will be until DirectX 12 is out? Probably 2-3 at most. The main point is, even though this card will still have no problem playing the latest and greatest games in the future (at lower settings this time around), once DX 12 hits the market as a minimum-spec requirement, the graphics card I busted an arm n' nut to get 2-3 years earlier will be as good as trash since it does not support DX 12. It will cease to function at the mainstream level.

What are your thoughts on this?

#2 Posted by c0l0nelp0c0rn1 (1798 posts) -

@caustic_fox: Most games nowadays are still using DX9. I think you're probably alright.

#3 Posted by RollingZeppelin (1908 posts) -

The newest DX version is never the minimum requirement; games will still use DX11 for many years to come.

#4 Posted by Caustic_Fox (112 posts) -

Well, tell that to Crytek, who made Crysis 3. That is one game that requires DX 11 in order to run. Many folks (myself included, with a GTX 260) who still use aging DX 10 graphics cards are SOL at this point and will have to spring for an upgrade.

#5 Edited by mellotronrules (1171 posts) -

generally speaking with pc parts- the sweet spot is the mid range. in the case of gpu's, if you're interested in running a consistent level of quality across the span of a decade, you're better served buying 4x $250 cards spread out every 2.5 years, as opposed to one single $1000 card. or perhaps more realistically- 3x $300 cards over a 6-year period rather than one single $1000 card. not to mention the lower-priced cards are more likely to be energy efficient, so there's more savings there as well. top-of-the-line is great for those who can afford it, but value-for-money drops off pretty steeply in gpu's as you approach the higher end. not to mention technology and standards change- it would be a bummer to drop a lot of money on a card only to have it superseded by the next DX, pci-e, or GDDR standard.
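
for what it's worth, here's a quick back-of-the-envelope sketch of that math (the dollar figures are just the hypothetical ones above, not real prices):

# rough cost-per-year comparison using the hypothetical prices above
single_flagship_cost = 1000   # one $1000 card kept for the whole stretch
staggered_cost = 3 * 300      # three $300 cards, one roughly every two years
years = 6

print(single_flagship_cost / years)  # ~166.7 dollars/year, performance frozen at year-0 levels
print(staggered_cost / years)        # 150.0 dollars/year, and the hardware refreshes twice along the way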

#6 Posted by Stonyman65 (2569 posts) -

I think it depends on what resolution you are using. If you are running 1920x1080, probably not, but if you are running at 2560x1600 or something big, then having a beefy card is going to do a lot for you.

Either way, you are going to have to get a new card every 2 or 3 years anyways, so getting the mid-high end $300 card vs the high-end $500 one isn't going to matter much in the long run.
What you need to ask yourself is: is the extra few hundred bucks worth the 10-15% performance increase?

#7 Posted by envane (1159 posts) -

its not worth buying that particular card, but i would say something like a gtx titan would last you longer. maybe not the same performance etc, but its acoustic performance and temperature management seem amazing and a no brainer for a long term card that you dont need to mess with.. the only thing i have against the 690 is that dual gpus still always have issues until drivers are updated, so getting the most out of it requires more effort on your part, plus from experience of owning sli 580's the heat and noise is fucking insane.. im looking at a titan purely because it will be far quieter and not struggle as much under the temps that my 580s get (the titan is designed to run at 80c, whereas my cards get more likely to blue screen/driver reset when its over 70c, which is almost all the time in a demanding game like farcry 3)

#8 Edited by jdh5153 (1034 posts) -

Buy an Xbox 8. It'll last you 8 years at least, much longer than any video card or PC.

#9 Posted by TheRealMoot (335 posts) -

No. Do not buy the infamously expensive video card of legend!

Wait... is that card really $1200? $999? Am I reading that right?

Either way, here's what I do for video cards: Get the $100 to $200 one. Get one that is tried and true, research it. (Unless you have A** loads of money then jump the hell in man!) Find what works, make sure it fits in your pc and is compatible with some games you have/want.

Yeah, not the best advice from me. Just don't buy the crazy expensive cards. Had a friend who bought a video card for more than $1000 a few years back. It was one of the first HD, Blu-ray compatible cards made. Piece of crap. Overheated, didn't run any games past or present, wonky resolutions and busted firmware. This might have been an edge case, but the whole PC collapsed because the card was crap.

#10 Posted by AlisterCat (5469 posts) -

I feel like the best idea is to wait and see how AMD and Nvidia react to the next generation of consoles. The new Maxwell line of cards isn't launching until 2014, and it will be partially a reaction to the console specs.

Get a GTX 460 or a 660 Ti for now. Nice lowish-price mid-range cards.

#11 Edited by Sidewalkchalk (122 posts) -

As a rule of thumb, a ~$200 card will usually last you two years. It is NOT worth buying the latest/newest generation of cards. The $1k Titan? It doesn't outperform a GeForce GTX 680 at 1080p for any game on the market.

@jdh5153 If you're talking about the xbox that should be dropping this year, you're dreaming. Top of the line PC's will provide better performance within the first year of the console's life, and within 2-3 years, a ~$200 second-gen card will absolutely demolish the new console(s) provided that you also have 8+GB of DDR3 RAM and a modern CPU. I'm not trying to be a dick, but you're very wrong.

EDIT: Let me clarify in terms of long-term value. A $400 card and a $999 card will both sputter out at about the same time down the road in terms of performance. I had a Radeon HD 6870 for about a year and a half, and I bought it for probably $230 or less. I just bought a 7970 for $200 and change, and I expect to upgrade in another year and a half. If I bought a GeForce GTX 680 for $440-480, it would probably last me 2-2.5 years for most games except the ones pushing tech boundaries late in that cycle. Part of it comes down to personal preference. I enjoy upgrading my card every year and a half or so. For ~$400, I get 3 years of staying on top of the curve, whereas if I bought the near top-of-the-line GPU, I'd pay $450-1000 for the same 3 years.

#12 Edited by Corvak (831 posts) -

Pricing

Your best price/performance long term will almost always be between $300 and $400, and almost always it's an AMD GPU. The "nVidia tax" is frequently not worth paying when it comes to performance, unless you have a deep love for its driver interface.

nVidia cards suffer less from "frame time stutter" in a multi-card setup, but I don't think multi-GPU setups are as concerned with cost - a more powerful single card will trump a multi-card setup when it comes to price, every time. Though on the topic of multi-GPU systems, adding a second card later, when the prices drop following a new generation, is a relatively inexpensive way to get 60-80% more GPU performance.
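
As a loose illustration of what that scaling means (the 60-80% figure is just the rough range above, not a benchmark):

# loose sketch: a second card rarely doubles performance, it adds a fraction of the first
base_fps = 60.0
for scaling in (0.6, 0.8):              # the rough 60-80% range mentioned above
    print(base_fps * (1 + scaling))     # 96.0 and 108.0 fps, not 120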

The GTX 680 and Radeon HD 7970 are essentially the same, performance-wise - and the 7970 costs about $100-150 less. But your best bets right now are the cards one step below the flagships - the GTX 670 or the Radeon HD 7950. Better still, wait until the next generation hits in Q4 and take advantage of the inevitable price drops.

#13 Posted by Mirado (983 posts) -

@caustic_fox: It depends. Is that expensive GPU the only way to reach your performance goals right now? Forget the long term for a moment, as too many variables can move and shift between now and then. When I built my system, there was no single card on the market that could get me 60FPS @1920x1200 with everything turned up in the games that I wanted to play. The GTX 580 was the best consumer board out there, and it wouldn't do it. So I nabbed two 6950s, flashed them to become 6970s (back when that was possible), and didn't look back. That $600 investment has served me well, but now, just two years or so later, I am straining at the bounds of what is acceptable for me, again.

Now, I have what some would consider to be crazy standards for performance, but by the same token some would consider my constraints to be lax (mainly those with 30" monitors). You need to decide where you fall; if you are good with 40FPS and a resolution at 1080p or less, you can be a bit more flexible when it comes to your purchases, and only then can you get into the issue of long term investment.

What you need to ask yourself is:

is the extra few hundred bucks worth the 10-15% performance increase?

This is a salient point, but it takes on a more specific meaning when you look at it through the lens of what I am saying. Is it worth it? Yes, absolutely, if that 10-15% pushes you to your performance goal at the moment. You can realistically expect a card which performs adequately (assuming your standards are less rigid than mine) to do so for a few years. Most people can turn the AA down a bit, mess with a few sliders, or accept a sub 60 FPS within reason. However, and this is the bit that makes giving out advice so hard, if you are like me and balk when you see your frame rate start to dip, it becomes imperative to buy above where you normally would, almost in contrary to what I said above, otherwise you'll be turning cards over year after year, which isn't...healthy. It's an issue with no "one size fits all" answer.

Decide what your performance goals are, look at the minimum GPU to reach those goals, and go from there. If you need a $400 card to get what you want, you stop looking at the next step up as a $600 card but more as a $200 investment, since you'd take nothing less than that $400 GPU anyway. That's where the real guessing game begins. If you can be happy with a $200 GPU, a Titan makes little sense at $1000. But if you were going to spend $800 on two GPUs anyway...
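
To put rough numbers on that framing (the dollar amounts here are just the hypothetical figures from this post, not real card prices):

# incremental-cost framing: once a $400 card is the minimum that meets your goals,
# a $600 card is really a $200 upgrade decision, not a $600 one
baseline = 400                    # cheapest card that hits your performance target
candidates = [400, 600, 1000]     # hypothetical options you might be weighing

for price in candidates:
    extra = price - baseline
    print(f"${price} card -> ${extra} above the minimum you'd buy anyway")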

It's not "is it worth buying an expensive video card," but rather "is a $X investment above base worth it?" And that is a much more personal and specific question.

#14 Posted by Corvak (831 posts) -

@mirado: Interestingly, flashing can still be done on some AMD cards. A reference 7970 can be flashed into a 7970 Ghz Edition, for example.

#15 Edited by Mirado (983 posts) -

@corvak: Indeed? I thought they had learned their lesson about giving people free upgrades, since they disabled the ability to do so later on in the 6950's life. Well, that's a nice added bonus for base 7970 owners.

#16 Posted by Corvak (831 posts) -

@mirado: Yeah. The Ghz Edition isn't a different reference model, but a factory overclock, really. You can and will burn out your card if you flash it onto one with inadequate cooling. And of course you void your warranty in the process.

#17 Posted by Mirado (983 posts) -

@corvak: That's a bit different from the 6950/70 flash, where a 6950 was just a 6970 with some disabled SIMD units and a missing (but apparently unneeded) 8-pin PCI power connector. Does the 7970 have a BIOS backup? On the 6950 you could flip a switch to a protected, read-only BIOS in case the flash went bad.

#18 Posted by Mr402 (135 posts) -

I love these threads. If you spend more than 500 dollars on a video card you are a fool. The only time you need cards of the caliber of 690s or Titans is when you are attempting to run multi-monitor, ultra-high-resolution setups. For myself, I purchased a Sapphire 7970 OC edition for a little over 300 bucks, which was complete overkill for my use since I run this on my television with a maximum res of 1080p. Still, I have begun to play more and more games in stereo 3D, so the extra juice helps keep a smooth framerate. That being said, look at your use. If you are going for some insane setup then I guess go for it. I still think that extra dough could go to plenty of other great upgrades like water cooling, a large solid state drive or a shit ton of games.

#19 Edited by EXTomar (4444 posts) -

I don't know about that, since I paid ~$500 for my Nvidia 580 when it was brand new, and it has more than made up for the cost and still performs very well today.

#20 Edited by Hunkulese (2624 posts) -

A 680 is probably going to be more than enough for the next two years. The next line of nvidia cards is just going to be a refresh, and the line after that is when they'll be implementing all the new technology. There's really no reason to buy a 690 unless you're running at some astronomically high resolution, and even then a 680 will be fine for 90% of games. Put the money you're saving aside for a PS4.

#21 Edited by RobotHamster (4169 posts) -

I'll probably be upgrading my video card later this year. I'm running a 6870 which does alright but if I find a good sale I'll upgrade now and be good for the next few years.

#22 Posted by Cameron (595 posts) -

I'm with @mellotronrules on this. Even a GTX660 (just over $200) can run anything out today at 1920x1080 and high-ultra settings (maybe not Crysis 3). That may change when the new consoles come out, but then you can buy the new $200 card a year and a half from now and still have spent less than buying one 690 now.

#23 Posted by deathstriker666 (1337 posts) -

Spending anything above $400 for a card is ridiculous. High-end GPUs get cheaper and more powerful every year. For example, the upcoming ATI 8970 will offer slightly better performance than the ATI 7970 for $300-$400 less. I'd recommend grabbing a 7870 if you're looking for a fantastic card that'll play anything on high/ultra at an awesome price.

#24 Posted by myketuna (1644 posts) -

I don't mean to hijack this thread, but maybe this will add to the discussion. I currently have a GTX 570 that I bought about a year ago. Looking at benchmarks of some newer games running at 1080p (Tomb Raider, Crysis 3, etc.), I see that it's not as powerful as it used to be. I would like to run games at 60fps with medium-high settings at the very least.

With that in mind, I'm planning on upgrading my card around the time the new consoles come out (maybe 5-6 months after launch?), giving my 570 as a hand-me-down to my younger brother, who plays at 1680x1050, and building him a new system around it (he's currently running a 9600GT). About how long do you guys think that 570 will last him from 2014 onward? 2 years max or something? My brother is even less of a stickler for graphics than I am.

Basically, what I'm asking is: would it be worth it to hand the 570 down to him, or should I just bite the bullet and get a new graphics card along with a new system, whenever that ends up being?

#25 Edited by envane (1159 posts) -
@contrarygravitas said:

As a rule of thumb, spending ~$200 on a card will usually last you two years. It is NOT worth buying the latest/newest generation of cards. The $1k Titan? It doesn't outperform a GeForce GTX 680 at 1080p for any game on the market.

@jdh5153 If you're talking about the xbox that should be dropping this year, you're dreaming. Top of the line PC's will provide better performance within the first year of the console's life, and within 2-3 years, a ~$200 second-gen card will absolutely demolish the new console(s) provided that you also have 8+GB of DDR3 RAM and a modern CPU. I'm not trying to be a dick, but you're very wrong.

EDIT: Let me clarify in terms of longterm value. A $400 card and a $999 card will both sputter out at about the same time down the road in terms of performance. I had a Radeon HD 6870 for about a year and a half, and I bought it for probably $230 or less. I just bought a 7970 for $200 and change, and I expect to upgrade in another year and a half. If I bought a GeForce GTX 680 for $440-480, it would probably last me for 2-2.5 years for most games except the ones pushing tech boundaries late in that cycle. Part of it comes down to personal preference. I enjoy upgrading my card every year and a half or so. For ~$400, I get 3 years of staying on top of the curve, whereas if I bought the near top of the line GPU, I'd pay $450-1000 for the same 3 years.

is nobody seriously gonna call out this shit? the titan doesnt outperform a gtx 680 at 1080p for ANY GAME on the market? it may be splitting hairs, but im pretty sure any results youve seen with 680s beating the titan were with the 680s in sli. even nvidia admits that this is the case, since theres currently very little in the way of updated drivers for the titan, and sli scaling is still getting better.. so watch the benchmarks as they start to improve.

I agree with all the arguments for value, but theres almost a stockholm syndrome going on with those of you who claim that whatever budget card you get can RUN ANYTHING AT MAX SETTINGS (except its not max, because you cant run MAX aa and youre only just squeezing in ~45fps at 1080p)..

so the OP's question still remains poorly answered. if they have the money to spend, the recommendation of a single 680, or a titan instead of a 690, still stands for me.. if you dont have the money to spend, you wouldnt (or shouldnt) be asking if a 690 is worth it.

im sure the rest of the advice is great for anyone who ASSUMES you need the best card to play call of duty etc.. i made it through crysis 3 fine with a single 580 (i have 2 in sli but i turned one off because, just like farcry 3, it was causing them both to heat up so much it wasnt worth risking setting shit on fire).. but i did get these 580's when they were brand spanking new at $650AUD a pop. that hurt my wallet, but theyve lasted me 2+ years at the TOP, and the only real trouble ive had is basically now, after a long hot summer, when the sound of the fans drives me insane, so im generally avoiding heating them up too much at all.. which makes me realise the titan is perfect for me, but ive got dental work that needs doing so i wont be getting one any time soon. lets hope gpu boost 2 and the thermal target make it over to the 7 series.

and yeah... im clearly an nvidia boy. i am aware that there are some exceedingly good amd cards out there now, but my fanboyism causes me to be lazy and not even mention them.. but im pretty sure as far as bang for the buck goes right now, amd are the winners.

#26 Edited by Corvak (831 posts) -

@envane: Correct. AMD has good cards and generally wins the price/performance battle. nVidia wins the driver support and power consumption battle, and nVidia cards also run a bit cooler. Generally, pick your favourite; there is no "winner". They're so close to each other right now that it's just splitting hairs.

Titan is probably worth what you get (The GTX 690 beats it by a bit, but at the price of roaring fast fans).

I consider the ambient temperature of North America, not Australia - so 20-25C or thereabouts. Overclocked cards (even factory overclocks) quickly overheat when you raise that number (say, to 30-40 if you've got no air conditioning in a tropical summer). In that situation, a cooler-running Titan is vastly better than the fan-heavy GTX 690 or two GTX 680s. AMD cards, while great, are also more dependent on air cooling.

Also I don't know where the 680 beating Titan comes from - every bench i've seen has had the Titan on top, except perhaps 680s in SLI - which is not a better option when you consider air cooling in a hot climate.

a Titan will probably last just as long as two 680s without needing to be replaced.

#27 Edited by Devildoll (876 posts) -

I would probably go with a 7970 or 680 today, their younger siblings are totally fine as well.

Don't worry about DX12. It seems it won't be coming to Windows 7, and a lot of people still think the Metro UI of Win 8 is demented (even though the OS itself has an awesome task manager and a slight performance increase).

It's similar to developers not making Vista exclusives.

You should probably stay away from multi-gpu solutions until you are informed and experienced enough in hardware to make that decision without asking anyone else.

@corvak: The 7970 GHz Edition is pretty much just a 7970 with a new bios, like you said.

ive been running my own 7970 @ 1150 MHz since launch day on reference cooling, and thats without increasing the voltage, which means the temp difference is minimal.

most if not all 7970s will do 1 GHz with ease.

@mirado:

Yeah, 7970's have a backup bios, but even if they didn't, you could just do it like in the olden days: drop in another graphics card to give you eyes so you can type in the commands to flash back to the stock bios, or if you're really hardcore, just type them in blind.

As long as the card didn't fry from the bad bios, that'll work, and if it did fry, a backup read-only bios wouldn't do much good anyway.

@alistercat: Nvidia and AMD aren't reacting to the new consoles; they are constantly pushing the envelope. There isn't any higher chance of a huge leap or new tech getting discovered compared to any other year in the business.

After all, a lot of the PS4 is AMD chips, which are PC parts that have been in R&D for a lot of years and then adjusted for consoles. It's not like AMD is going to be blindsided by its own tech.

@mr402:

well, some situations in BF3 make my 7970 seem lacking even at 1080p, but that's just because i want it silky smooth. everyone has their own thresholds.

@deathstriker666: do you have any actual specs on the upcoming 8000 series, or are you using it as a typical example? all ive heard about the 8000 series so far is that there will only be an oem series initially, which is a complete rebrand of the 7000 series, but since its oem it wont be sold at retail at all, only as part of branded computers.

@myketuna:

do you really have to read benchmarks to know that your card is too weak?

what about playing an actual game and just feeling that it is stuttering and inadequate?

If you still think games run great, there is no need to upgrade just because a benchmark said otherwise.

#28 Edited by myketuna (1644 posts) -

@devildoll: I get what you're saying. The only issue I have with that is I'd have to actually buy the game to see how the game runs on my particular setup. But then if it doesn't work, I'd have to upgrade and let the game sit worthlessly until I do so. Unless of course the game has an official benchmark that is separate from the actual game. I really wish PC demos were still a thing. Hell, I wish demos period were still a thing. Then, I could do exactly what you said every time before spending money just to see if I need to spend more money. Thanks for the answer though.

#29 Edited by Devildoll (876 posts) -

@myketuna: i buy quite a few games, so i never get any hard swings in pc performance.
you'll notice one game dipping down to 55fps, the next to 52, etc, and i try to keep enough change around that i can upgrade on the spot if it is a good idea.

I can see it being harder if you buy games less often however.
And about the demo, i feel no shame in getting the "demo" from the bay if the developers don't provide one themselves.

If i play the whole game without paying, then I'm an ass for sure, but if i just boot it to get an idea of whether it stutters or not, i see no harm in that.

#30 Edited by MordeaniisChaos (5730 posts) -

DX12 will take a very long time to become "needed." Very few games use DX11 even today, and it's been around for quite a while.

That said, if you want to go all out, you might consider a single Titan. It'll avoid the problems of SLI, it'll be quieter by a long shot as well as more efficient, and it might actually be better for stuff like compute-based effects, which we will be seeing a lot more of (and which I'm a huge fan of).

But really, you should just get the high end of either the 7000 series or the 600 series single GPUs. SLI can be a double edged sword, and my 680 runs everything but ArmA flawlessly with AA. And ArmA is a special case, that would probably benefit from my overclocking my 3770K, but I need to get back to work so I can get some better cooling on that bitch first.

Dishonored looks awesome with a shit ton of AA and no HUD at a very solid and smooth 60fps.

Unless you're like me and do stuff that benefit from a crazy amount of VRAM or power (I dabble in 3D stuff, Blender loves it some good GPU power, and more complex scenes require a lot of memory), or you play on multiple high resolution screens, you don't need more than a 680 or whatever the AMD equivalent is. You'll be able to run everything, with real AA (not that awful FXAA shit), at 60FPS assuming it isn't running a mess of an engine, or isn't Metro 2033. And even that is totally playable at max settings.

I spent a ton of money on my PC, still plan on spending more (that Titan is reallllllly tempting), and would never take it back. It's awesome having a top of the line PC.

#31 Posted by myketuna (1644 posts) -

@devildoll: Yeah. I used to do the bay thing as well, but I'm trying to cut down. Not to mention, I still don't feel like downloading gigs to see if a game works. Like I said, regular demos and benchmarks would end this problem entirely. I'll probably just upgrade around the time I said anyway (early to mid-2014) and see what happens. I'm sure I'll have bought a recent game by then to see how my 570 is holding up.

#32 Edited by Kidavenger (3485 posts) -

Whatever version of DirectX ends up being in the next Xbox will be the standard for PC games for the next 6-7 years, and it's going to be either DX11.1 or DX12. Seeing how DX11.1 will be over a year old by the time the next Xbox comes out, I'd say there is a very good chance DX12 will be out and will be the standard used, and knowing Microsoft they will use it to get people to upgrade to Windows 8.

Buying an ultra-high-end video card for future-proofing is not the way to go. Sure, it will last you longer, but by buying high-end cards and replacing them every 2-3 years you'll get more mileage for less money, and you'll probably end up with a better card at the end. I wouldn't buy a 690 or a Titan unless you will actually take advantage of what it can do today, i.e. play games at extremely high settings/resolutions.

#33 Posted by Subject2Change (2966 posts) -

GTX 670, and replace every 2-3 years. The sweet spot is usually the $250-400 range for a card.

#34 Posted by Caustic_Fox (112 posts) -

Thanks for the input, guys. Since we're still on the topic of graphics cards, I might as well ask anyways. My current system runs on DDR2 RAM. How much of a performance loss am I looking at if I were to upgrade my graphics card to, let's say, a Radeon 7850/7870/7950? Also, my board has something called Hybrid CrossFire support. Will this help compensate for the speed loss at all? Here are my listed specs:

Mobo: MSI 785GT E-63

Processor: AMD Phenom II 960T (Stock @ 3.0 GHz X4) Overclocked & Unlocked to 3.6 GHz X6

Processor Cooler: Cooler Master Hyper 212+

RAM: 4.6 GB DDR 2

Graphics card: nVIDIA GTX 260 Overclocked

Current resolution I play my games: 1440 X 900 (Wish that these 16:10 monitors weren't so damn expensive.)

#35 Edited by Devildoll (876 posts) -

4.6 GBs? that's a pretty stupid sum, what kind of individual sizes do you have in there?

DDR2 by itself wouldn't be that much of a problem. The cpu is far from top notch, but since you've unlocked and overclocked it, it should be able to run even a 7970 without choking it terribly in a cpu-heavy game like BF3.

but that would be pretty overkill at that resolution; a 7870 would probably do pretty okay.

#36 Posted by Caustic_Fox (112 posts) -

Why 4.6 GB's? When I built my system I used most of the OEM parts that I had from before.

Originally it came from a HP OEM system that had 3.0GB DDR 2. (2x 1GB and 2x 512MB @ lousy 533MHz speed)

So I decided to throw in a 2GB 667MHz stick in there while switching out one of the 512MB sticks.

Yeah, I know that this isn't the proper way of doing it, as I'm running it in Unganged Mode currently.

Speaking of BF3, my GTX 260 chokes enough so that I have severe frame-rate drops whenever I zoom into a sniper scope. The only game that it cannot physically run currently is Crysis 3.

#37 Edited by Devildoll (876 posts) -

@caustic_fox: yeah a 260 is quite a lot weaker than a 680 or 7970.

buy another 2 GB 667 stick, and then run the two 1 GB sticks along with the two 2 GB sticks; it'd cost the same as a pizza :P

#38 Edited by Raven10 (1696 posts) -

Okay here are some of my thoughts.

1. Anyone saying that their mid range card can run any game at max settings at 60 fps in 1080p or above is simply lying. I always keep fraps on during my gaming sessions. I have a Geforce 560 ti, which is not that much worse than the 660 or 660 ti people are saying gives them top performance. I almost never get smooth framerates at 1080p on max settings in demanding games. Just to name a few that have killed my performance - Batman Arkham City using either DX11 or High PhysX caused my framerate to dive to 30 or below in some sections. The Witcher 2 runs at about 40 fps on average with most settings on max, but still a fair number of high level stuff turned off. Crysis 2 ran at about 45 fps maxed out without DX11 and at about 15 fps with DX11. I haven't even bothered with Crysis 3. Both Empire and Shogun 2 Total War butcher my framerate at anywhere close to max settings. Alan Wake at max settings ran at between 45 and 50 fps. Planetside 2 ran between 25 and 55 fps. Serious Sam 3 I had to run at mostly medium settings just to get the game to run at 40 fps. Assassin's Creed Revelations required me to use SMAA to get around 50 fps whereas in the first three games I could get away with MSAA. Haven't tried 3 yet. Point is, I'm not naming the latest and greatest games here. I am not talking about Crysis 3 or Far Cry 3 or other really boundary pushing games. Even simple console ports have major issues running in 1080p at 60 fps.

Also note I say max settings but in none of these games did I have MSAA on. I either used FXAA or more recently SMAA. Turning on MSAA in any of these games would make the game unplayable. For reference I have a first gen Core i7 and 6 GB of RAM so not the most powerful stuff, but far beyond what any game requires or even recommends.

All that said, the question is, do you really need to run games at 1080p or above at 60 fps on max settings? Or are you willing to sacrifice framerate or visual fidelity to keep costs down? Personally I'm willing to drop down to 40 fps or so and drop some DX11 effects so a 560 ti has worked fine for me. I would bet with the release of next gen consoles, though, I'm going to have to do some major upgrading if I want to keep up. In the end that is my final recommendation. Wait until the next gen consoles are out and see what type of requirements the PC ports have. Right now games are limited by the need to run on current gen consoles. Once those restrictions are lifted I expect many of the minimum requirements for games will drastically increase. In the end I would love to run a tri-SLI Titan system but it really isn't worth the money if you have any sort of budget. People who buy Titans and 690's are people who have the money to blow. Neither of those cards are worth the money from a price to performance perspective. But if you always want the best and want to run games at resolutions higher than 1080p at 60 fps then by all means go for it. Don't let anyone fool you, you aren't going to run Crysis 3 anywhere close to max with a 670 or even a 680/7970.

EDIT: Also wanted to note that the 600 series of cards has terrible DirectCompute capabilities. If you are going to be doing anything that uses DirectCompute or OpenCL, then get a 7000 series card, or a 500 series card if you really want Nvidia. An example would be the TressFX effects in Tomb Raider or any number of math programs.

#39 Posted by zenmastah (870 posts) -

I have a sneaking suspicion that any card with less than 4GB of memory won't cut it going into next-gen games..

#40 Edited by Corvak (831 posts) -

@zenmastah: True. I don't see us using more than 2GB for games at 1080p, but the PC is not restricted by HD standards or the HDTV industry the way consoles are. There's demand for 60fps+ performance on 2560x1600 displays, and as their prices drop, the demand will only increase. And of course with more pixels comes a much higher memory requirement - I see 4GB/6GB becoming the standard for high-end cards.
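
Rough framebuffer math, just to show how raw render-target size scales with pixel count (textures and other assets dominate real VRAM use, so treat this as the floor, not the total):

# very rough sketch: render-target memory vs resolution
# (textures and geometry dominate real VRAM use; this only shows the scaling factor)
def render_target_mb(width, height, bytes_per_pixel=4, buffers=4):
    # assume a handful of full-screen buffers (color, depth, G-buffer, etc.)
    return width * height * bytes_per_pixel * buffers / (1024 ** 2)

print(render_target_mb(1920, 1080))   # ~31.6 MB
print(render_target_mb(2560, 1600))   # ~62.5 MB - roughly twice the pixels, twice the buffer memory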

#41 Edited by zenmastah (870 posts) -

Yeah, I was mainly thinking about the ridiculous uppage of RAM in the next-gen consoles and how that would affect GPUs and their memory on PC.

i mean you take your basic console port these days and the VRAM usage is about 0.5Gigs, so when the next gen comes what is going to be the "standard" then..

Of course there are games even now that use serious amounts of VRAM, i can cap my 2GB card in Tomb Raider alone when im downsampling with SSAA.

#43 Posted by EXTomar (4444 posts) -

For "capital purchases" it is better to spend more today. If a cheap low performance card and expensive high performance card both last the same amount of time (don't break, driver support, etc) then you'll get more value and not replace the expensive high performance one as quickly. The only time this is a problem is when accidents happen and you have to eat the cost.
