

    8GB VRAM GPUs from AMD and Nvidia incoming.

hassun

    http://www.loadthegame.com/2014/11/03/nvidia-readying-geforce-gtx-980-8gb-imminent-release/

As predicted, with the rise of >1080p PC gaming and recent PC releases listing extraordinarily high recommended VRAM amounts, it seems the inevitable GPU VRAM increase is upon us.

Both AMD and Nvidia will release 8GB VRAM editions of existing GPUs soon. No word on pricing yet, I'm afraid, but I hope that some people who are still on the fence about getting a new graphics card will find this news interesting.

onarum

oh wow, I can only imagine the price will see a very considerable bump, I mean GDDR5 is pretty expensive, no?

Either way this kinda makes me happy, since I ALMOST got a GTX 980 last week but decided to wait a bit instead; now I'm definitely waiting to see what's up with this 8GB craziness.

8 GB OF FREAKING VRAM, fuck me... I can only guess that in the future devs won't even bother to use system RAM for textures and stuff anymore, just dump everything directly into VRAM and call it good...


    #3  Edited By GreggD

    Ugh. I'll probably need to build a whole new rig, at this rate. My CPU is middle of the road, so the bottlenecking will surely be drastic.

Justin258

My 7870 has, thus far, held out pretty well. I'll see how things go and buy something toward the end of next year, or maybe in 2016 instead.

Processor requirements aren't getting ridiculous, so that's good news for my wallet.

Brendan

    Seems appropriate. New generation means more spending for PC gamers for a few years before the cycle returns to 2010-2013 (if they want consistently high performance that is).


    #6  Edited By SchrodngrsFalco

    Inject that VRAM right into my veins! Glad I'm taking so long to pick my parts. I expected the 8GB to be coming out in the near future, but within the next two months?! Or... I guess I'm thinking of the rehash of the 980 which is rumored to be coming out Q2 2015... either way, 8GB, hell yeah! 1440p ultra, yes please!

Kidavenger

I wonder how they will even be able to fit that much RAM on a single GPU board.

Seems like a ridiculous jump in any event.

tuxfool

@kidavenger: They just need to use higher-density RAM. The reason for the jump from 4GB to 8GB is the 512-bit and 256-bit bus widths of the AMD and Nvidia cards. If they had a 384-bit bus width they would use 3GB and 6GB configurations instead (as witnessed by the 780 and Titan respectively).
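For anyone wondering how bus width ties into capacity: each GDDR5 chip sits on a 32-bit channel, so the bus width fixes the chip count, and the chip density then fixes the total. A rough back-of-the-envelope sketch in Python (the densities are assumed, typical 2014-era values; "clamshell" mode, two chips per channel, is how vendors double capacity):

    # Sketch: VRAM sizes implied by bus width and GDDR5 chip density.
    # Each GDDR5 chip sits on a 32-bit channel; clamshell mode pairs
    # two chips per channel to double capacity.
    CHANNEL_BITS = 32
    DENSITIES_GBIT = (2, 4)  # assumed, common 2014-era chip densities

    def vram_options_gb(bus_bits, clamshell=False):
        chips = bus_bits // CHANNEL_BITS * (2 if clamshell else 1)
        return [chips * d / 8 for d in DENSITIES_GBIT]  # gigabits to GB

    for bus in (256, 384, 512):
        print(f"{bus}-bit bus: {vram_options_gb(bus)} GB,",
              f"clamshell: {vram_options_gb(bus, clamshell=True)} GB")

The 384-bit cards land on 3GB/6GB and the 256-bit and 512-bit cards on 4GB/8GB, which is exactly the pattern described above.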


    #9  Edited By colourful_hippie

Alright, so any of you thinking that getting one of those 8 gig cards will future-proof your PC gaming, think again. Sure, you'll be able to handle more memory as games get more VRAM-hungry over the course of the generation, but there's one big drawback: don't expect your frames to keep up. By the time games start demanding that kind of RAM, there will be far stronger cards out there that actually have the horsepower to drive those games at higher resolutions AND high framerates.

When there are 4 gig 970s and 980s out there, I really don't see much reason to jump on an 8 gig card this early. There isn't a demand for it, and I would much rather wait for stronger cards that already ship with large amounts of RAM without the premium upcharge.

@schrodngrsfalco said:

Inject that VRAM right into my veins! Glad I'm taking so long to pick my parts. I expected the 8GB to be coming out in the near future, but within the next two months?! Or... I guess I'm thinking of the rehash of the 980 which is rumored to be coming out Q2 2015... either way, 8GB, hell yeah! 1440p ultra, yes please!

As someone who has both a 980 (4 gig) and a 1440p monitor, I'm already doing that with pretty nice framerates.

Karkarov

Still don't see the point. Every game thus far that has claimed you need 8 gigs of VRAM worked fine with 4 gigs if your card was good and/or you ran at 1080p. I would sooner just use the money to go SLI with two 970s.

mike

    Can't wait until that GTX 980 Ti 8gb comes out so I can do an upgrade I definitely don't need! Anyone interested in a slightly used 780 Ti or 780?

tuxfool

@colourful_hippie: Yeah, 8GB is overkill if you're thinking of future-proofing with a single GPU.

The 8 gig cards could be useful if you plan on gaming at 4K and/or 144fps. Of course, you would be going with a Crossfire or SLI setup, as there are no single-GPU cards that really perform adequately at 4K unless you accept lowered IQ (sort of defeating the purpose).

ajamafalous

    $750 at least? That's my guess.

rm082e

1. GDDR5 production has skyrocketed since the PS4 went into production, so it should be reasonably cheap to put 8GB on a card. Certainly much cheaper than a couple of years ago.

2. While 8GB may be overkill, we've got games that are already taking up all 4GB on the 9xx cards. The hows and whys don't matter: we can use more than 4GB, so they should put out a card with more.

3. With the low power draw of the Maxwell architecture, SLI is a viable option for those looking to future-proof.

4. While running games at 1080p probably won't eat up much more than 4GB, downsampling from 4K very well might. Those looking to push their pixel count as high as possible need all the VRAM they can get (see the rough arithmetic below).
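Rough arithmetic for point 4 (a sketch with assumed numbers, not figures from the post): render targets scale with pixel count, so rendering at 4K to downsample quadruples that slice of VRAM relative to 1080p, although textures still dominate the totals.

    # Sketch: render-target memory at 1080p vs 4K (downsampling source).
    # Assumes 4 bytes per pixel and six full-resolution buffers
    # (color, depth, G-buffer layers); both numbers are assumptions.
    BYTES_PER_PIXEL = 4
    NUM_BUFFERS = 6

    def render_targets_mb(w, h):
        return w * h * BYTES_PER_PIXEL * NUM_BUFFERS / 2**20

    print(f"1080p: {render_targets_mb(1920, 1080):.0f} MB")  # ~47 MB
    print(f"4K:    {render_targets_mb(3840, 2160):.0f} MB")  # ~190 MB

Four times the pixels, four times the buffer memory; the rest of the VRAM appetite comes from assets.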

mikey87144

    Wow. I'm glad I decided to wait.


    #16  Edited By Tennmuerti

Huh, this topic kinda weirded me out a bit. Reason being that I have finally decided to get a new gaming laptop to replace my current one (yes, gaming laptops are trash, blah blah), and I was specifically getting one with a GTX 980M, which has very recently come out. Now obviously the mobile version of the 980 is nowhere near as good as the regular GPU, but the seller I regularly use and have found reliable has it listed as an 8GB card. So upon reading this topic I did a double take and did some online "research", and sure enough, the mobile version of the 980 apparently already comes with 8GB of VRAM.

Seems kinda strange that they have introduced this first in the mobile card rather than the much more powerful and much more common regular GPU cards.

chumley_marchbanks

@tennmuerti said:

Huh, this topic kinda weirded me out a bit. Reason being that I have finally decided to get a new gaming laptop to replace my current one (yes, gaming laptops are trash, blah blah), and I was specifically getting one with a GTX 980M, which has very recently come out. Now obviously the mobile version of the 980 is nowhere near as good as the regular GPU, but the seller I regularly use and have found reliable has it listed as an 8GB card. So upon reading this topic I did a double take and did some online "research", and sure enough, the mobile version of the 980 apparently already comes with 8GB of VRAM.

Seems kinda strange that they have introduced this first in the mobile card rather than the much more powerful and much more common regular GPU cards.

    Are you sure it's 8GB of dedicated VRAM and not 8GB of shared memory?

colourful_hippie

There are reports that the 8 gig Nvidia cards will still have the same memory bus as the original 900 series. Gross. No thank you.


    #19  Edited By Tennmuerti

@chumley_marchbanks: I am currently trying to find out whether 4 gigs of that is shared. The reports seem to differ on whether it supports up to 4 or up to 8 gigs of GDDR5, even within the goddamn same articles...

Edit: yep, there are 980M models that come with 8 gigs dedicated.

Which seems kind of silly. They are "future proofing" a card that has 3/4 at best (to half at worst) of the current 980's performance anyway. (Tho I'm not going to complain.)

Jesus_Phish

@colourful_hippie: I can see that happening with some of these, for sure. While I think 8GB of VRAM will eventually be a good idea, right now it sounds like they're just hoping people not in the know end up buying them because 8 is more than 4. It's like people who buy a 30-speed mountain bike because they assume it must be better since it has more gears, not paying any attention to the gear ratios and which ones are actually useful.

And as another poster mentioned, by the time 8GB of VRAM becomes the standard, the cards of that time will be better suited than the cards of today.

ripelivejam

I still don't see how the modest bump in console power equates to all this expensive craziness on the PC end. I'm of a highly paranoid mind that this is more about game devs being lazy and not optimizing ports well, plus GPU manufacturers wanting theirs. I'm pretty sure something's wrong when a graphically impressive game like BF4 can manage close to 120fps on decent (not great) hardware while AC Unity has this 6GB recommended VRAM crap.

AlexW00d

    There are already a couple of 8gb cards floating about btw...


    #23  Edited By spazmaster666

I would much rather see a full refresh (i.e. a die shrink to 20nm or 16nm) of the current 900 series Maxwell cards with some added VRAM (be it 6GB or 8GB) that actually gives a significant performance boost over the 780 Ti, rather than just a regular GTX 970/980 with 8GB of RAM, since VRAM is not what bottlenecks these cards at 4K resolution. As someone who currently games on a 4K G-SYNC monitor with a pair of GTX 980s, VRAM is definitely not the bottleneck.


    #24  Edited By monetarydread

    @colourful_hippie: These cards are designed for 4K gaming; they went with 8gig chips because 4k, and uncompressed textures, can already saturate the 6gigs of a Titan Black.


    #25  Edited By spazmaster666

    @monetarydread said:

    @colourful_hippie: These cards are designed for 4K gaming; they went with 8gig chips because 4k, and uncompressed textures, can already saturate the 6gigs of a Titan Black.

Using uncompressed textures seems to me like a lazy way to have "Ultra" quality textures in your game without actually making them more detailed or higher resolution. Take Shadow of Mordor: its "Ultra HD" textures are simply uncompressed versions of the "High" setting textures, meaning you're not really getting more detail per se, just fewer compression artifacts from the DXT texture compression. Games that use high-res, compressed textures won't really be pushing the 4GB limit even at 4K, at least for a little while longer. And games that do will need much beefier hardware than, say, a GTX 980 to play smoothly at 4K resolution. Personally I still think 8GB is overkill on this current generation of cards, since they can't perform well enough at 4K anyway (unless you SLI/Crossfire). 6GB would have been the sweet spot IMO for the current-gen cards (GTX 970/980).
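To put rough numbers on the compression point (the bytes-per-pixel rates are the standard DXT/BC figures; the texture inventory is invented for illustration): uncompressed RGBA8 costs 4 bytes per pixel versus 1 for DXT5 and 0.5 for DXT1, so an uncompressed "Ultra" set needs 4-8x the VRAM of the same textures compressed.

    # Sketch: VRAM for one texture set, compressed vs uncompressed.
    # Standard rates: RGBA8 = 4 B/px, BC3/DXT5 = 1 B/px, BC1/DXT1 = 0.5 B/px.
    # The texture counts below are purely illustrative assumptions.
    RATES = {"RGBA8 uncompressed": 4.0, "BC3/DXT5": 1.0, "BC1/DXT1": 0.5}
    MIP_OVERHEAD = 4 / 3  # a full mip chain adds roughly one third

    textures = [(2048, 2048)] * 200 + [(4096, 4096)] * 50  # assumed set

    for fmt, bpp in RATES.items():
        gb = sum(w * h * bpp * MIP_OVERHEAD for w, h in textures) / 2**30
        print(f"{fmt}: {gb:.1f} GB")

The same assets clear 8GB only in the uncompressed case (roughly 8.3 GB here, versus about 2.1 GB as DXT5), which matches the Shadow of Mordor observation above.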


    #26  Edited By PrivodOtmenit
    @karkarov said:

Still don't see the point. Every game thus far that has claimed you need 8 gigs of VRAM worked fine with 4 gigs if your card was good and/or you ran at 1080p. I would sooner just use the money to go SLI with two 970s.

Even 2GB has been fine when games are saying 3 or more. Shadow of Mordor wasn't even worth installing the texture pack for, because wow, that difference is absolutely tiny.

Giant_Gamer

It was only a matter of time, since consoles have 8 GB and devs are getting lazier with PC ports every year.

I'll wait for a new release that has been built with the 8 GB in mind.


    #28  Edited By metal_mills

@giant_gamer said:

It was only a matter of time, since consoles have 8 GB and devs are getting lazier with PC ports every year.

I'll wait for a new release that has been built with the 8 GB in mind.

Neither has anything close to 8 GB of VRAM. PCs blow consoles out of the water in terms of power.

colourful_hippie

@jesus_phish said:

@colourful_hippie: I can see that happening with some of these, for sure. While I think 8GB of VRAM will eventually be a good idea, right now it sounds like they're just hoping people not in the know end up buying them because 8 is more than 4. It's like people who buy a 30-speed mountain bike because they assume it must be better since it has more gears, not paying any attention to the gear ratios and which ones are actually useful.

And as another poster mentioned, by the time 8GB of VRAM becomes the standard, the cards of that time will be better suited than the cards of today.

    And the other plus side of future cards is that the extra RAM won't be a premium.

@monetarydread said:

@colourful_hippie: These cards are designed for 4K gaming; they went with 8gig chips because 4k, and uncompressed textures, can already saturate the 6gigs of a Titan Black.

You're missing my point about the performance not being there yet. Just because the bigger cards can handle, at the very least, rendering at 4K doesn't mean the performance will be there too, unless you're crazy and want to burn barrels of money on SLI setups. I always prefer single-card setups for more reliability.

There's a decent amount of poor optimization in these games coming out now demanding stupid amounts of VRAM, so as optimization gets better, the real need for large amounts of VRAM can be pushed off until later. As I said earlier, I would much rather wait for future cards that no longer charge a premium for more VRAM and actually have the performance chops to render high resolutions at high framerates.

colourful_hippie

    @giant_gamer said:

It was only a matter of time, since consoles have 8 GB and devs are getting lazier with PC ports every year.

I'll wait for a new release that has been built with the 8 GB in mind.

@metal_mills said:

Neither has anything close to 8 GB of VRAM. PCs blow consoles out of the water in terms of power.

The PS4 does have 8 gigs of the same GDDR5 RAM that you see used in desktop video cards, but the difference between the PS4 and a PC is that its 8 gigs are used by the whole system, versus the PC's split between VRAM on the video card and standard system RAM.


    #31  Edited By monetarydread

    @spazmaster666 said:

    @monetarydread said:

    @colourful_hippie: These cards are designed for 4K gaming; they went with 8gig chips because 4k, and uncompressed textures, can already saturate the 6gigs of a Titan Black.

Using uncompressed textures seems to me like a lazy way to have "Ultra" quality textures in your game without actually making them more detailed or higher resolution. Take Shadow of Mordor: its "Ultra HD" textures are simply uncompressed versions of the "High" setting textures, meaning you're not really getting more detail per se, just fewer compression artifacts from the DXT texture compression. Games that use high-res, compressed textures won't really be pushing the 4GB limit even at 4K, at least for a little while longer. And games that do will need much beefier hardware than, say, a GTX 980 to play smoothly at 4K resolution. Personally I still think 8GB is overkill on this current generation of cards, since they can't perform well enough at 4K anyway (unless you SLI/Crossfire). 6GB would have been the sweet spot IMO for the current-gen cards (GTX 970/980).

I was getting 200fps while using "High" textures at 1080p; why wouldn't I enable uncompressed textures if they were available? Even if it is a minor improvement, it is still better than the alternative. The great thing about PC gaming is that you have choice, and even though most people do not see a difference, I would still rather have the choice. Also, if you have the hardware to run a game with "Ultra" textures, it feels kinda awesome compared to just getting 200+fps. I have certainly purchased games for the simple fact that they could stress my graphics cards (Crysis 3), or because they support my steering wheel (Assetto Corsa), or because they look amazing with 3D Vision 2 enabled and running at 1440p (Assassin's Creed 4).

Remember, I am almost 40 years old and I have disposable income, and when you compare the price of PC gaming to buying a motorcycle, a sled, or even mountain biking (my friend just spent $6000 on a downhill pedal bike), even buying quad Titans is an inexpensive hobby by comparison. Just because you are not in a financial situation that enables you to purchase more expensive hardware does not mean that the industry should avoid trying to earn my extra dollar, or that people like me are the problem with PC gaming.

Also, you mention that 6 gigs is good enough for compressed textures, but that would require re-engineering the memory bus (256-bit to 384-bit, like the Titan Black), and then they would have to source 3-gig chips instead of the 4-gig chips they are currently using for the 980 (which increases the initial cost of creating this line of cards). So basically it would cost Nvidia more money to create a less powerful card; why would they do that?

Giant_Gamer

    @metal_mills said:

    @giant_gamer said:

It was only a matter of time, since consoles have 8 GB and devs are getting lazier with PC ports every year.

I'll wait for a new release that has been built with the 8 GB in mind.

Neither has anything close to 8 GB of VRAM. PCs blow consoles out of the water in terms of power.

@colourful_hippie said:

The PS4 does have 8 gigs of the same GDDR5 RAM that you see used in desktop video cards, but the difference between the PS4 and a PC is that its 8 gigs are used by the whole system, versus the PC's split between VRAM on the video card and standard system RAM.

Exactly, and to make any game that uses most of the PS4's allocated RAM work at a glorious 1080p with the same graphical settings, you need no less than 8 GB.

edgaras1103

Am I dreaming, or did Nvidia recently say that they are not going to release models with 8GB of VRAM any time soon? Also, I wonder if next year 4GB of VRAM will be the minimum because of the consoles; it always happens with a new generation that PC specs rise even if the games don't justify it, right? With my 780 Ti, I am a bit worried about bad PC ports, or even good ones. I mean, look at the AC Unity specs. We still don't know the Far Cry 4 specs, right? And The Witcher 3... gosh, I don't even know what kind of PC specs that game will have.

Jesus_Phish

@edgaras1103: To be fair to The Witcher 3, if its specs are mental, it'll be because the team is pushing the hardware to the very edge, not because they're lazy.

TrafalgarLaw

I think people are forgetting that now-ancient DDR3 system RAM is bottlenecking PCs. There simply aren't fast enough buses from the CPU's memory to the GPU's VRAM. The choice then is simple: do a lot more in VRAM on the GPU by increasing it to 8 GB.
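Rough peak-bandwidth figures for typical 2014 hardware illustrate the point (assumed, illustrative numbers, not from the post): local VRAM is an order of magnitude faster than anything reachable over the PCIe bus, which is the argument for keeping more data resident on the card.

    # Sketch: peak-bandwidth comparison (illustrative 2014-era figures).
    GB = 1e9
    links = {
        "dual-channel DDR3-1600 (system RAM)": 2 * 8 * 1600e6 / GB,
        "PCIe 3.0 x16 (CPU <-> GPU)": 16 * 985e6 / GB,
        "GDDR5 256-bit @ 7 Gbps (GTX 980 VRAM)": 256 / 8 * 7e9 / GB,
    }
    for name, bw in links.items():
        print(f"{name}: ~{bw:.0f} GB/s")  # ~26, ~16, and ~224 GB/s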


    #36  Edited By Corvak

8GB seems like it's more about future-proofing for someone wanting true 4K (2160p).

PC specs tend to be written very conservatively, as developers can't assume every PC with those specs will have the same level of efficiency as their test environment.

tuxfool

@giant_gamer: The consoles will never be able to allocate the full 8GB to graphics. They currently have 3GB set aside for the operating system; of the 5GB remaining, you're going to be using maybe something like 1GB for non-graphical data. That leaves 4GB for graphics at most, which seems reasonable, as the consoles also have nowhere near the memory bandwidth of mid-to-high-end graphics cards.

As time goes by they will reduce the OS RAM requirements, but it will never be zero (I suspect they'll free up 1GB to 1.5GB at most).
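Writing that budget out (the 3GB OS reservation was widely reported for the PS4 at the time; the 1GB figure for non-graphical data is the post's own estimate):

    # Sketch of the console memory budget described above.
    total_gb = 8.0
    os_reserved_gb = 3.0  # widely reported PS4 OS reservation
    game_data_gb = 1.0    # the post's estimate for non-graphical data

    graphics_gb = total_gb - os_reserved_gb - game_data_gb
    print(f"graphics budget today: ~{graphics_gb:.0f} GB")  # ~4 GB

    # If the OS later hands back 1 to 1.5 GB, as guessed above:
    for freed in (1.0, 1.5):
        print(f"OS frees {freed} GB -> ~{graphics_gb + freed:.1f} GB")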


    #39  Edited By monetarydread
    @colourful_hippie said:


    @monetarydread said:

    @colourful_hippie: These cards are designed for 4K gaming; they went with 8gig chips because 4k, and uncompressed textures, can already saturate the 6gigs of a Titan Black.

You're missing my point about the performance not being there yet. Just because the bigger cards can handle, at the very least, rendering at 4K doesn't mean the performance will be there too, unless you're crazy and want to burn barrels of money on SLI setups. I always prefer single-card setups for more reliability.

There's a decent amount of poor optimization in these games coming out now demanding stupid amounts of VRAM, so as optimization gets better, the real need for large amounts of VRAM can be pushed off until later. As I said earlier, I would much rather wait for future cards that no longer charge a premium for more VRAM and actually have the performance chops to render high resolutions at high framerates.

I agree that single-card setups are better than SLI; that's why it makes sense for companies to increase VRAM for people with high-res monitors. I have a 980, and even though my monitor only goes up to 1440p, I can enable 4K rendering at the driver level and downsample. Rendering Shadow of Mordor at 4K with "ultra" settings (except for uncompressed textures), medium ambient occlusion, and no anti-aliasing nets me a constant 30-40fps, and the VRAM of my 980 is saturated. Now, my monitor supports G-Sync, so 30-40fps is as smooth as 60fps without G-Sync, but I would love to be able to enable AA or uncompressed textures, and I would be able to if I had waited for an 8 gig version of the card. I agree that this is a specialized product, but if you look at previous generations of GeForce cards with double the VRAM, the cost is only an extra $50-$100, and that is chump change for people looking to build a powerful single-GPU machine.

I also disagree that there is a lot of poor optimization of PC games lately. Go to the Internet Wayback Machine and look at posts from 2006, 2001, or '97 (basically pre- and post-new-console releases); you will see nerds on the internet giving PC building advice that says "this should be good enough for a gaming PC," then a year later those same nerds complaining about a massive boost in system requirements that excludes them from playing games on their new PCs. After seeing this for so many generations, I am starting to believe that it's the nerds giving bad PC advice and hardware manufacturers overstating the performance of their hardware, not "poor optimization" from the developers.

Before the new consoles were even announced, Epic Games was saying that Unreal Engine 4 was designed around a 680 as a minimum requirement. The consoles have an 8-core CPU, so engines are now designed around being able to push 8 threads at a time (even though for gaming a faster clock speed has traditionally been more important than multi-threaded performance), so that is becoming a minimum requirement; consoles have 8 gigs of RAM, so that is now a minimum requirement; and all consoles support Blu-ray, so game sizes will be the size of a Blu-ray (55 gigs). You have to remember that game engines are started years before a new console has finalized specs, and developers have to predict where the performance levels of hardware will be.

I understand the frustrations of people who built a PC last year because they were told that "now is the time to build," but if you go back and read pretty much ANY PC thread from a year ago, I was predicting this kind of spec jump and warning people of the impending requirement bump. I hate to do this, but people always argued with me by saying I was wrong, and yet a year later PC gamers are in the predicted situation, doing everything I said they would.

Edit: I also completely agree with you that it would be better to wait for the next generation of video cards instead, but again, this is a specialized product, not the new norm, so I believe that there is a genuine place for it.

I remember spending the money to buy an 8800GT when it was released in October 2008. That card was amazing for the price, but a month later they released the 8800GTX, which had basically double the VRAM and extra CUDA cores (remember when CUDA cores were counted in the dozens, not the thousands). For an extra $50 you received about a 5fps boost in performance, and most people said the extra money was not worth it. Well, I owned that card for almost four years, and for the last year and a half I was consistently about 5fps away from being able to run games on medium settings at 1080p/30fps. By the end I wished I had spent the extra $50, because it was exactly what I needed, even though it was not ideal at the time of purchase.

colourful_hippie

@monetarydread said:

@colourful_hippie said:

@monetarydread said:

@colourful_hippie: These cards are designed for 4K gaming; they went with 8gig chips because 4k, and uncompressed textures, can already saturate the 6gigs of a Titan Black.

You're missing my point about the performance not being there yet. Just because the bigger cards can handle, at the very least, rendering at 4K doesn't mean the performance will be there too, unless you're crazy and want to burn barrels of money on SLI setups. I always prefer single-card setups for more reliability.

There's a decent amount of poor optimization in these games coming out now demanding stupid amounts of VRAM, so as optimization gets better, the real need for large amounts of VRAM can be pushed off until later. As I said earlier, I would much rather wait for future cards that no longer charge a premium for more VRAM and actually have the performance chops to render high resolutions at high framerates.

Now, my monitor supports G-Sync, so 30-40fps is as smooth as 60fps without G-Sync, but I would love to be able to enable AA or uncompressed textures.

Game performance preferences definitely play a big role in the kind of setup you're looking for. I definitely would like to have a G-Sync monitor, but even then I would still prefer being as close to 60fps as possible with almost all settings maxed. Yeah, G-Sync will make the frames smoother, but the game is still running slower than I would normally like. Because of that, I would much rather wait for cards that have both the higher memory capacity and the horsepower to hit the standards I'm looking for without having to resort to SLI setups. I will admit, though, that SLI will most likely continue to get better, at least with Nvidia, because it is in their business interest to keep SLI profiles up to date to encourage buyers to get more than one card.

I just want to say, though, that these 8 gig cards are for a very specific person: you won't get your money's worth unless you have a 4K setup and are considering an SLI config (or plan on getting one in the very near future).

tuxfool

I'm fairly certain that if the game is running at 30fps, even with G-Sync it won't look like a locked 60fps. It should also be mentioned that G-Sync works best when the frame rate is not sudden or highly variable, and when dips fall within 10 to 15fps of the desired baseline refresh rate.

Basically, the minimum frame rate a card puts out is still important even with G-Sync.


    #42  Edited By Giant_Gamer

    @tuxfool said:

@giant_gamer: The consoles will never be able to allocate the full 8GB to graphics. They currently have 3GB set aside for the operating system; of the 5GB remaining, you're going to be using maybe something like 1GB for non-graphical data. That leaves 4GB for graphics at most, which seems reasonable, as the consoles also have nowhere near the memory bandwidth of mid-to-high-end graphics cards.

As time goes by they will reduce the OS RAM requirements, but it will never be zero (I suspect they'll free up 1GB to 1.5GB at most).

So that leaves us nearly 6GB for graphical data. Taking into account that games on consoles will most likely be sub-1080p and won't be optimized for PCs, I think 8GB will be a minimum requirement three years from now, since high-end PCs will go for higher resolutions.


    #43  Edited By tuxfool
    @giant_gamer said:
I think 8GB will be a minimum requirement three years from now, since high-end PCs will go for higher resolutions.

That assumes that in three years everybody on PC will be at 1440p or 4K, which is highly unlikely. I should also note that my estimate for general-purpose data memory usage was extremely conservative. Judging by multiplatform games, 2GB+ will become more prevalent. I doubt data structures will differ wildly between PC and console versions, and this probably only applies to more complex open-world games which need to track a lot of data.

hockeychris10

    @mb: I just bought a 780ti and am thinking of getting a second for SLI, how much? :P


    #45  Edited By monetarydread

    @colourful_hippie: I agree with you completely. It's a specialized card for a specialized purpose.

    @tuxfool said:

I'm fairly certain that if the game is running at 30fps, even with G-Sync it won't look like a locked 60fps. It should also be mentioned that G-Sync works best when the frame rate is not sudden or highly variable, and when dips fall within 10 to 15fps of the desired baseline refresh rate.

Basically, the minimum frame rate a card puts out is still important even with G-Sync.

Yup, you are correct for the most part. Unfortunately, G-Sync does not work under 30fps; that will always be choppy. But when people talk about how great a locked 60fps is, they usually miss the most important aspect of that feature. The important part of that statement is the word locked, as in the frame rate is a consistent 60fps, and if it dips it will still be above 30. The reality is that if you are not able to lock your game at 60fps, you usually end up with a variable frame rate, and this variance makes the game seem choppy because it is changing from 45fps to 30fps and back again once a second, or creates screen tearing because the video card is out of sync with the monitor's refresh rate.

Edit: Oops, I misread your original statement @tuxfool; yup, you are correct. It works best under the scenarios you mentioned and is not a cure-all, but in practice the problems I find in 30fps games are gone by the time you reach 45fps. Consistency is still important, but G-Sync will remove the micro-stutter and frame tearing while reducing input lag. I can wholeheartedly say that this is as big for the experience as HD was, and is certainly a more important jump than 120Hz monitors were. It is also a great way to make gaming at 1440p or larger resolutions more reasonable for people with lesser hardware, because the problem with multi-GPU gaming is that your minimum frame rates drop and it introduces micro-stutter. That becomes very annoying once you know what those artifacts and glitches look like.

Here is an article that explains this better than I can.

TL;DR: G-Sync is a variable refresh rate, so images look as smooth as a locked 60fps even if you do not have the hardware for one, as long as you stay above 30fps.
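The locked-versus-variable arithmetic is easy to sketch (a simplified model assuming double-buffered vsync with a blocking swap; triple buffering and real drivers behave differently): on a fixed 60Hz display a frame must be held for a whole number of ~16.7ms refresh intervals, so any render rate between 30 and 60fps collapses to a displayed 30fps, while a variable-refresh display simply shows each frame the moment it is ready.

    import math

    # Sketch: double-buffered vsync on a fixed 60 Hz display vs G-Sync.
    REFRESH_MS = 1000 / 60  # one refresh interval, ~16.7 ms

    def vsync_fps(render_ms):
        # A frame is held for a whole number of refresh intervals.
        held = math.ceil(render_ms / REFRESH_MS - 1e-9) * REFRESH_MS
        return 1000 / held

    for capable in (60, 45, 35):
        print(f"{capable} fps card: {vsync_fps(1000 / capable):.0f} fps",
              "on fixed 60 Hz vsync vs", capable, "fps with variable refresh")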

SchrodngrsFalco

@spazmaster666: @mb: Should be coming in Q2 '15, IIRC. Very tempted to wait till then to get my card.

I think the main dispute with these cards is how well they'll truly perform, so the only thing we can do right now, besides debating speculation, is wait for them to release and get some benchmarks!

Giant_Gamer

    @tuxfool said:
    @giant_gamer said:
I think 8GB will be a minimum requirement three years from now, since high-end PCs will go for higher resolutions.

That assumes that in three years everybody on PC will be at 1440p or 4K, which is highly unlikely. I should also note that my estimate for general-purpose data memory usage was extremely conservative. Judging by multiplatform games, 2GB+ will become more prevalent. I doubt data structures will differ wildly between PC and console versions, and this probably only applies to more complex open-world games which need to track a lot of data.

I'm not saying that 1440p+ will be the norm, but it will be what enthusiasts expect from high-end PCs.

I don't think devs will care about VRAM usage once we have 8GB cards on the market, so you'll see a lot of games using a lot of memory even when it could easily be avoided by, for example, compressing the textures.

VACkillers

To be honest, I think the regular 980s should have come with 8GB of VRAM in the first place. The problem is that by the time you find a card that can actually make proper use of the 8GB of VRAM on it, the next line of GPUs will be out anyway, and those will come with 8GB as standard. I just got my 980 recently, and I'm not willing to go over $600 just to get an 8GB version; the price point of the 980 was one of its biggest attractions for me, because it's a blazingly fast GPU in the $500 range, not the $600-700 range. I'm still happy with my GPU and have no regrets about not waiting for the 8GB version, given the price point it's most likely going to be targeted at.
