@colourful_hippie said:
@monetarydread said:
@colourful_hippie: These cards are designed for 4K gaming; they went with 8-gig chips because 4K with uncompressed textures can already saturate the 6 gigs of a Titan Black.
You're missing my point about the performance not being there yet. Just because the bigger cards can at least render at 4K doesn't mean the performance will be there too, unless you're crazy enough to burn barrels of money on SLI setups. I always prefer single-card setups for reliability.
A decent number of the games coming out now are poorly optimized and demand stupid amounts of VRAM, so as optimization improves, the real need for large amounts of VRAM can be pushed off until later. As I said earlier, I'd much rather wait for future cards that no longer charge a premium for more VRAM and actually have the performance chops to render high resolutions at high framerates.
I agree that single-card setups are better than SLI; that's exactly why it makes sense for companies to increase VRAM for people with high-res monitors. I have a 980, and even though my monitor only goes up to 1440p, I can enable 4K rendering at the driver level and downsample. Rendering Shadow of Mordor at 4K on "ultra" settings, except for uncompressed textures, medium ambient occlusion, and no anti-aliasing, nets me a constant 30-40fps with the VRAM of my 980 saturated. My monitor supports G-Sync, so 30-40fps feels as smooth as 60fps does without it, but I would love to enable AA or uncompressed textures, and I would be able to if I had waited for an 8-gig version of the card. I agree that this is a specialized product, but if you look at previous generations of GeForce cards, the version with double the VRAM only costs an extra $50-$100, and that is chump change for people looking to build a powerful single-GPU machine.
I also disagree that there's been a lot of poor optimization of PC games lately. Go to the Internet Wayback Machine and look at posts from 2006, 2001, or '97 (basically pre- and post-next-gen console releases). You will see nerds on the internet giving PC-building advice like, "this should be good enough for a gaming PC," and then a year later those same nerds complaining about a massive jump in system requirements that excludes them from playing games on their new PCs. After seeing this for so many generations, I am starting to believe it's the nerds giving bad PC advice and hardware manufacturers overstating the performance of their hardware, not "poor optimization" from the developers.
Before the new consoles were even announced, Epic Games was saying that Unreal Engine 4 was designed around a 680 as a minimum requirement. The consoles have an 8-core CPU, so engines are now designed to push 8 threads at a time (even though for gaming a faster clock speed has traditionally mattered more than multi-threaded performance), and that is becoming a minimum requirement. Consoles have 8 gigs of RAM, so that is now a minimum requirement too, and since all the consoles support Blu-ray, game sizes will grow toward the size of a Blu-ray disc (50 gigs dual-layer). You have to remember that game engines are started years before a new console has finalized specs, and developers have to predict where hardware performance will be.
I understand the frustration of people who built a PC last year because they were told, "now is the time to build," but go back and read pretty much ANY PC thread from a year ago: I was predicting this kind of spec jump and warning people about the impending requirement bump. I hate to say it, but people always argued that I was wrong, and a year later PC gamers are in exactly the situation I predicted.
Edit: I also completely agree with you that it would be better to wait for the next generation of video cards, but again, this is a specialized product, not the new norm, so I believe there is a genuine place for it.
I remember spending the money to buy an 8800GT when it was released in October 2007. That card was amazing for the price, but a month later they released the 8800GTX, which had basically double the VRAM and an extra CUDA core (remember when CUDA cores numbered in the dozens, not the thousands). For an extra $50 you got about a 5fps boost, and most people said the extra money wasn't worth it. Well, I owned that card for almost four years, and for the last year and a half I was consistently about 5fps short of being able to run games at medium settings, 1080p, 30fps. By the end of that card's life I wished I had spent the extra $50, because it was exactly what I needed in the end, even though it wasn't ideal at the time of purchase.