So just what will be the new standard VRAM?

AdequatelyPrepared

Hey all, I'm considering upgrading my GTX 770 2GB to a GTX 970. Normally I would not even consider such an upgrade, but I have a friend who is willing to purchase the GTX 770 off of me for about $200, meaning the net expenditure on the 970 comes to about $300, which I find acceptable. However, given the VRAM requirements of some newer games (LotR: 6GB, The Evil Within: 4GB), I'm worried that I'll be buying this card only to be completely outpaced in terms of VRAM within half a year to a year. For a while I had thought, looking at most other games, that the average VRAM requirement for maximum quality would settle at about 3-4GB for this console generation.

Are The Evil Within and LotR just outliers, or are they a sign of things to come for PC ports?

josher14

I know the new LotR Mordor game only needs 6GB if you want to install the ultra-high texture pack... Game makers and publishers definitely work with Nvidia and AMD to get their games working on their cards, so I don't see a brand new card failing to run the newest games for at least a year. As for me, I still have a GTX 670 with 4GB and am still happy with the performance after a year and a half.

You could also SLI/CrossFire down the road if you need more horsepower.

Good luck with the decision!

josher

dagas

I run Tomb Raider on my 270X with 2GB of VRAM, all settings maxed, and it looks better than the PS4 version, so I don't see why we suddenly need 4GB for games to match the PS4. I think some games just aren't optimized properly if they need 4GB of VRAM when the Steam survey says less than 2 percent of people have that. You'll always run into having to upgrade on PC, though; the only way to not have to worry about that is to get a console. The GTX 970 should do fine for some time, I think. I plan on sticking with my card for a while since it is the most powerful card I can fit in my mITX case.

Devildoll

We'll see when these games are actually playable.

MannyMAR

It looks like 3-4 gigs of VRAM is going to be the norm if you want to run high-quality textures at 1080p and above.

I guess developers figure the consumer should just throw money at the issue instead of compressing the textures in a smart way.

chumley_marchbanks

@mannymar: So how exactly do you "smartly" compress textures without compromising heavily on quality? Because I'm sure these talented developers with decades worth of research to base their technology on would dearly love to know that. Good lossy compression algorithms can only get you so far before you start introducing ugly artefacts, and lossless compression is not at all suitable for realistic looking textures. What do you expect developers to do if they want to keep pushing the bar when it comes to visual fidelity?
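
To put rough numbers on how much headroom compression actually leaves, here's a back-of-the-envelope sketch using standard GPU block-compression ratios (BC1 is roughly 8:1 and BC7/DXT5 roughly 4:1 versus raw RGBA8). The figures are generic format math, not any specific game's assets, and the function name is just for illustration:

```cpp
// Rough VRAM cost of a single 4096x4096 texture with a full mip chain.
// bytesPerPixel: RGBA8 = 4.0, BC7/DXT5 = 1.0, BC1/DXT1 = 0.5
#include <cstdio>

double textureBytes(int width, int height, double bytesPerPixel, bool mipmaps) {
    double base = double(width) * height * bytesPerPixel;
    return mipmaps ? base * 4.0 / 3.0 : base;   // mip chain adds roughly one third
}

int main() {
    const double MiB = 1024.0 * 1024.0;
    printf("4096x4096 RGBA8 + mips: %.1f MiB\n", textureBytes(4096, 4096, 4.0, true) / MiB); // ~85.3
    printf("4096x4096 BC7   + mips: %.1f MiB\n", textureBytes(4096, 4096, 1.0, true) / MiB); // ~21.3
    printf("4096x4096 BC1   + mips: %.1f MiB\n", textureBytes(4096, 4096, 0.5, true) / MiB); // ~10.7
    return 0;
}
```

Even block-compressed, a couple of hundred unique 4096-class textures resident at once adds up to several gigabytes, which is roughly why ultra texture packs blow past 2-3GB cards.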

The_Nubster

@mannymar said:

It looks like 3-4 gigs of VRAM is going to be the norm if you want to run high-quality textures at 1080p and above.

I guess developers figure the consumer should just throw money at the issue instead of compressing the textures in a smart way.

Oh, compress them in a smart way! You should take that one all the way to the top. They've probably never thought about doing it that way before.

New consoles allow them to include the ultra high-res textures with the game. Of course a more powerful GPU is going to be needed to run those. With that being said, it's relative; Medium on a game like Shadow of Mordor or Evil Within is going to look better than High on a game like Far Cry 3 or Bioshock Infinite.

The highs are going to continue to get higher in terms of the power needed to run them, but that doesn't mean the lower settings aren't also getting better along with that, and in a way that allows most people to play and enjoy the game. Only about 2% of Steam users even have 4GB of VRAM, and developers realise that and build their games in a way that makes them accessible to those with less powerful GPUs but rewards people who have built hulking monsters to play games with.

MannyMAR

@chumley_marchbanks @the_nubster

You'd be surprised how many times in game development someone proposes a good idea to reduce the memory budget and has it get shot down by superiors. I've done freelance work at a couple of studios and have heard texture artists propose different ways to reduce texture sizes without noticeably impacting image quality. Sometimes the higher-ups are willing to hear them out; other times they'll push forward to other parts of development for the sake of meeting milestones; and then you have stubborn bosses (everybody knows what they're like). It happens, especially with multiplatform games, because a technique might work well on PC and Xbox One but not play nice on the PS4, or vice versa.

mikey87144

I'm waiting until next year to get my new card. I've had mine for a year and it still works fine.

OurSin_360

@dagas said:

I run Tomb Raider on my 270X with 2GB of VRAM, all settings maxed, and it looks better than the PS4 version, so I don't see why we suddenly need 4GB for games to match the PS4. I think some games just aren't optimized properly if they need 4GB of VRAM when the Steam survey says less than 2 percent of people have that. You'll always run into having to upgrade on PC, though; the only way to not have to worry about that is to get a console. The GTX 970 should do fine for some time, I think. I plan on sticking with my card for a while since it is the most powerful card I can fit in my mITX case.

You won't ever need 4GB just to match a console; that's for "ultra" settings. High, and sometimes medium, will still look better than the PS4/Xbone versions.

colourful_hippie

Some games are asking for 3GB already, so I think the 3-4GB range sounds about right. I'm about to sell my 770 and plan on picking up the 980.

LiquidPrince

Honestly, the norm in a couple of years will be 5GB plus, because that's where the consoles will be sitting. If you want to be able to run every game with the maximum level of textures, you will probably need a card with more than 4-5GB in a few years. For now, anything north of 3GB is sufficient.
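
As a rough sanity check on the "5GB plus" figure, here's a back-of-the-envelope sketch assuming the commonly reported launch-era numbers (8GB of shared memory with roughly 3GB reserved for the OS). Those reserves are approximations rather than official specs, and the CPU/GPU split below is purely hypothetical:

```cpp
// How much of a console's shared memory pool could plausibly act as "VRAM".
// OS-reserve figure is the commonly reported launch-era number, treated as approximate.
#include <cstdio>

int main() {
    const double total_gb      = 8.0;  // shared pool (PS4 GDDR5 / Xbox One DDR3)
    const double os_reserve_gb = 3.0;  // roughly 2-3.5 GB reported reserved for the OS
    const double game_gb       = total_gb - os_reserve_gb;

    // The game's slice also holds CPU-side data (logic, audio, streaming buffers),
    // so only part of it effectively acts as texture/render memory.
    for (double gpu_share = 0.5; gpu_share <= 0.81; gpu_share += 0.15) {
        printf("GPU share %.0f%% of %.1f GB -> ~%.2f GB for graphics\n",
               gpu_share * 100.0, game_gb, game_gb * gpu_share);
    }
    return 0;
}
```

Under those assumptions the graphics slice lands closer to 3-4GB than to 5GB, for whatever that's worth.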

tourgen

I think you'll be fine with 4GB for a year or more. Who really knows, though? If you can get $200 for your 770 and get a 970 for ~$350, that sounds like a pretty good deal.

rm082e

I'm in the same boat with a 2GB 770. Planning on doing the same. Seems like the 770 is going to have a short shelf life on the used market, so I figure getting rid of it now while there is still some value left in it is the best move. I think by early next year, it's going to be almost worthless - especially if they bring out a 960 with 3GB of RAM for around $220...

mike

@liquidprince said:

Honestly, the norm in a couple of years will be 5GB plus, because that's where the consoles will be sitting. If you want to be able to run every game with the maximum level of textures, you will probably need a card with more than 4-5GB in a few years. For now, anything north of 3GB is sufficient.

I'll believe this whole "Consoles are going to mean PCs need 5+GB of VRAM" thing when I see it. In theory it sounds good, but in practice there is just no evidence to support it. Look at...any multiplatform game and how poorly it runs compared to a modest PC. Sleeping Dogs Definitive Edition is capped at 30 fps...when a 1.5GB discrete GPU from 2012 can handle it at 60. Shadow of Mordor, again, is mainly 30 fps on consoles...yet mid-range PC GPUs can run it at 1080p/60 no problem, and at higher detail settings. Similar situation with Wolfenstein: The New Order on consoles...it's locked at 60, but in order to maintain that frame rate the vertical resolution has to be dynamically reduced during gameplay. I guess optimization is going to fix all that? I'll believe that when I see it, too.

No amount of VRAM or shared RAM is going to make up for the GPU and CPU being underpowered, regardless of whether you're talking about a PC or a console.

monkeyking1969

The way I see it, when you start seeing 4GB on middle-of-the-road cards in the $250 to $350 range, you know that will be the standard soon. If I were rebuilding a system that wasn't just a low-end Steam box, I would without a doubt shoot for 4GB with a purchase now.

LiquidPrince

@mb said:

@liquidprince said:

Honestly, the norm in a couple of years will be 5GB plus, because that's where the consoles will be sitting. If you want to be able to run every game with the maximum level of textures, you will probably need a card with more than 4-5GB in a few years. For now, anything north of 3GB is sufficient.

I'll believe this whole "Consoles are going to mean PCs need 5+GB of VRAM" thing when I see it. In theory it sounds good, but in practice there is just no evidence to support it. Look at...any multiplatform game and how poorly it runs compared to a modest PC. Sleeping Dogs Definitive Edition is capped at 30 fps...when a 1.5GB discrete GPU from 2012 can handle it at 60. Shadow of Mordor, again, is mainly 30 fps on consoles...yet mid-range PC GPUs can run it at 1080p/60 no problem, and at higher detail settings. Similar situation with Wolfenstein: The New Order on consoles...it's locked at 60, but in order to maintain that frame rate the vertical resolution has to be dynamically reduced during gameplay. I guess optimization is going to fix all that? I'll believe that when I see it, too.

No amount of VRAM or shared RAM is going to make up for the GPU and CPU being underpowered, regardless of whether you're talking about a PC or a console.

But we're not talking about how such things will impact framerate. We're talking about games and their ability to load in the highest-resolution textures all at once. Didn't Shadow of Mordor recommend 6GB of VRAM? It probably doesn't need that much to run smoothly, but future games probably will need something closer to that number, especially if multiplatform games on consoles are really utilizing all of their RAM to load in the highest possible textures.

VACkillers

I'm also currently hemming and hawing about this very topic. I currently own a Zotac GTX 760 with 4GB of VRAM, and while the VRAM is good, the card itself is of course not high end. The new 9-series cards both have 4GB of VRAM too, but there is supposed to be a slew of higher-end versions of the 980 and 970 coming, expected to carry up to 8GB of VRAM, so I'm wondering if I should simply wait for those cards to be released in about a month or so. Whether 4GB is plenty for the next year seems to come down to personal preference; I think 4GB will be fine for another year or so, right up until 4K gaming becomes a lot more mainstream, and I still see that being several years away.

mike

@liquidprince: You seem to be under the assumption that consoles are using Ultra-equivalent textures, which we've seen from the games I cited above is simply not the case. So not only does the PS4's shared RAM not allow it to run Ultra level textures, it can't seem to display them at very high frame rates, either.

LiquidPrince

@mb said:

@liquidprince: You seem to be under the assumption that consoles are using Ultra-equivalent textures, which we've seen from the games I cited above is simply not the case. So not only does the PS4's shared RAM not allow it to run Ultra level textures, it can't seem to display them at very high frame rates, either.

Then that only means that PCs will need even more usable VRAM than consoles in the future. Imagine the same scene being rendered on PC and on consoles; if the consoles need at least 5GB to fully load in all the textures of that room at high settings, the PC will need at least as much, if not more, to load in the ultra settings.

mike

@liquidprince said:

@mb said:

@liquidprince: You seem to be under the assumption that consoles are using Ultra-equivalent textures, which we've seen from the games I cited above is simply not the case. So not only does the PS4's shared RAM not allow it to run Ultra level textures, it can't seem to display them at very high frame rates, either.

Then that only means that PCs will need even more usable VRAM than consoles in the future. Imagine the same scene being rendered on PC and on consoles; if the consoles need at least 5GB to fully load in all the textures of that room at high settings, the PC will need at least as much, if not more, to load in the ultra settings.

Like I said, it makes sense on paper...and like you said, "in the future" - we haven't reached the point yet where there is any direct evidence at all to support anything you're saying, there is only theory. I'm curious as to which platforms you own now that you've posted a couple of times in this topic, but I already have my guesses.

LiquidPrince

@mb said:

@liquidprince said:

@mb said:

@liquidprince: You seem to be under the assumption that consoles are using Ultra-equivalent textures, which we've seen from the games I cited above is simply not the case. So not only does the PS4's shared RAM not allow it to run Ultra level textures, it can't seem to display them at very high frame rates, either.

Then that only means that PCs will need even more usable VRAM than consoles in the future. Imagine the same scene being rendered on PC and on consoles; if the consoles need at least 5GB to fully load in all the textures of that room at high settings, the PC will need at least as much, if not more, to load in the ultra settings.

Like I said, it makes sense on paper...and like you said, "in the future" - we haven't reached the point yet where there is any direct evidence at all to support anything you're saying, there is only theory. I'm curious as to which platforms you own now that you've posted a couple of times in this topic, but I already have my guesses.

I have a PS4, X1 and a Wii U. I also have a fairly decent gaming PC that I built like a year and a half ago.

cornbredx

The way things have been going (with Watch Dogs, Mordor, DR3, etc.), I suspect it will be 3 or 4GB. 4GB will probably be the comfortable standard, which means I'll upgrade my GPU again in like 5 years, which is what I wanted, so I did something right for once.

conmulligan

@mb said:

Like I said, it makes sense on paper...and like you said, "in the future" - we haven't reached the point yet where there is any direct evidence at all to support anything you're saying, there is only theory. I'm curious as to which platforms you own now that you've posted a couple of times in this topic, but I already have my guesses.

The amount of memory in current-gen consoles isn't going to directly affect the amount of VRAM on PC video cards, but the fact that both have unified memory banks absolutely will. At this stage, high-end gaming PCs are the odd ones out with consoles, mobile devices and lower-end PCs all having SoCs with a unified memory architecture. Multi-platform engines, and especially ones that are designed to run on mobile (which is all of them, really) are going to be optimised for this setup. This inconsistency won't last for long — NVIDIA already have cards with CPU-addressable memory on their roadmap for 2016. They haven't said, and probably don't know yet, how this will affect the amount of VRAM on Pascal cards, but if it's going to be addressable by both the CPU and GPU I have to imagine it's going to be in the 8-16GB range (Pascal will also let the GPU access system RAM more quickly, but it's unclear if that means the GPU's VRAM amount will be kept in check). Until then, we're probably going to see a small regression in the overall quality of PC ports.
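
To make the unified-memory point concrete, here's a minimal sketch of CPU/GPU-shared addressing as it already exists on the PC side via CUDA's Unified Memory (CUDA 6 and later). The kernel and sizes are made up for illustration, and this is driver-managed migration over PCIe rather than the single physical pool the consoles use:

```cpp
// Minimal CUDA Unified Memory sketch: one allocation addressable by both the
// CPU and the GPU, with the runtime migrating pages instead of the program
// issuing explicit cudaMemcpy calls.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void scale(float *data, int n, float factor) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= factor;
}

int main() {
    const int n = 1 << 20;
    float *data = nullptr;

    cudaMallocManaged(&data, n * sizeof(float));    // visible to CPU and GPU

    for (int i = 0; i < n; ++i) data[i] = 1.0f;     // CPU writes directly

    scale<<<(n + 255) / 256, 256>>>(data, n, 2.0f); // GPU works on the same pointer
    cudaDeviceSynchronize();                        // wait before the CPU touches it again

    printf("data[0] = %.1f\n", data[0]);            // CPU reads the result back: 2.0
    cudaFree(data);
    return 0;
}
```

The convenience is the same single pointer on both sides; the difference is that on a discrete card the data still has to move across the bus, which is the gap Pascal-style hardware paging is meant to narrow.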

OurSin_360

My guess is this is all about "4K", so I would say for 1080p the standard should still be around 3 or 4.

mike

@oursin_360 said:

My guess is this is all about "4K", so I would say for 1080p the standard should still be around 3 or 4.

4K for all but the highest-end setups is a ways off, even for PC. Consoles probably aren't going to see 4K games until the next generation. The PS4 and Xbox One have trouble maintaining 1080p/60, or even 1080p/30 in some cases. 3840x2160 is four times the pixel count of 1920x1080; I don't see that happening for any but the simplest of games during this console cycle.

I agree though...3GB or 4GB is going to be fine at 1080p for a number of years.
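
For reference, the raw numbers behind that "four times the pixels" point, assuming a plain 32-bit color target (real engines add depth, G-buffer and post-processing targets on top, so treat these as floors, not full frame budgets):

```cpp
// Pixel counts and a bare 32-bit color target at 1080p versus 4K.
#include <cstdio>

int main() {
    const double MiB = 1024.0 * 1024.0;
    long long p1080 = 1920LL * 1080;   // 2,073,600 pixels
    long long p2160 = 3840LL * 2160;   // 8,294,400 pixels
    printf("4K / 1080p pixel ratio: %.1fx\n", double(p2160) / p1080);   // 4.0x
    printf("1080p RGBA8 color target: %.1f MiB\n", p1080 * 4 / MiB);    // ~7.9
    printf("4K    RGBA8 color target: %.1f MiB\n", p2160 * 4 / MiB);    // ~31.6
    return 0;
}
```

The render targets themselves are a small slice of VRAM either way; it's the higher-resolution textures and extra buffers people pair with 4K that do most of the damage.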

rm082e

In case you all haven't seen it yet, I recommend you check out the Digital Foundry report on the Ultra Texture Pack for SoM:

Eyes-on with PC Shadow of Mordor's 6GB ultra-HD textures

I probably need to turn in my MasterRace™ card here, but I can't see any tangible difference between the Ultra and High quality textures there. There are a few specific spots on the character where they look slightly different, but neither looks better. Honestly, to me that looks like it could be the difference between slightly different captures during idle animations. I don't see the differences they are talking about on the floor and wall textures.

MikkaQ

Man I thought I'd be cool getting 2GB six months ago and now I feel like I made a bad call. But I like me some sharp textures.