
    GTX 980TI vs GTX 970 (and potential Pascal cards in 2016)

echo13791

    So here's my situation.

I just bought a computer with a GTX 980 Ti, but after reading some stuff about Pascal (next year's generation of GPUs), I'm reconsidering my decision. I know the 980 Ti isn't quite ready for 4K, and monitors aren't ready yet either, with 4K monitors only getting 60 Hz. Anyway, my question is: do you think I should have gotten a 970 (way cheaper), stuck with 1080p, and waited until next year for Pascal and the potential for 4K, or bought the Ti and gotten a 1440p monitor now?

Ry_Ry

    Be happy with what you have. I wouldn't worry about 4K for a few years.

OurSin_360

60 Hz seems fine to me; no card, even next gen, is probably going to get over 60 fps at 4K anyway.

Nime

    All depends on your budget, timeframe, and preferences imo.

    I think 1440p is probably good for now and wouldn't worry too much about 4k for the time being. But it really comes down to what you're looking for.

hmoney001

It is a never-ending battle trying to keep up with graphics cards.

Like SkullPanda1 said, "be happy with what you have."

As for your 980 Ti, it's perfect for 1440p on ultra settings in most games.

ExK4

You can probably push 1440p at 144 Hz with a 980 Ti. Check out these monitors; they'll probably make great use of your horsepower.

mike

@oursin_360 said:

60 Hz seems fine to me; no card, even next gen, is probably going to get over 60 fps at 4K anyway.

    60 hz is "fine" until you sit down in front of a nice 144 hz G-Sync monitor, then you realize what everyone else was raving about. It's that good.

pyrodactyl

    Who gives a shit about 4K? Seriously, 1440 is already more than enough if you're not using a 32 inch 4K monitor 2 feet from your face.

    4K is as bullshit as 3D

ch3burashka

@ry_ry said:

Be happy with what you have. I wouldn't worry about 4K for a few years.

    Amen. There will always be some new hotness within reach (6 months-1 year).

stonyman65

@pyrodactyl: Now it is, but things are moving fast. Another year or two and 4K will be pretty common. The problem now is that the hardware isn't quite there yet, and up until last year the only decent 4K monitor was like $800.

Obviously in a year or two the hardware will catch up and we'll start seeing good 4K monitors around the $400 mark, where 1440p monitors are now.

So until then, stick to a good 1080p or 1440p monitor with an IPS panel and a 120 Hz or 144 Hz refresh rate. That'll do ya if you can afford it.

Hunkulese

    If you already bought it, don't worry about it. You have a great card. But it's definitely a waste of money for the majority of people buying it.

echo13791

@stonyman65: Where can you find a good 1440p monitor for $400? Every single one I find in that price range is 60 Hz, which I don't want. Everything I'm seeing above that is 144 Hz and costs upwards of $630, which is wigging me out when 4K monitors aren't much more expensive. Also, for clarification, I don't already have the card, but I customized a computer that has one in it. I'm sure I could call the company and have them swap it out and change the price.

Jayzilla

We all have our different time frames. I am waiting to do an upgrade once the second wave of cards comes out for VR. I think they will be much more adept at handling VR at that point. I really want to just say heck with it and get a card and VR at the same time, but I think patience should win out for the best experience with the format.

echo13791

    Ugh, so stressful (even though it shouldn't be, haha)

GaspoweR

@echo13791: Just stick with it. By the time you're ready to change, the tech will be more reliable and prices will be more affordable.

soimadeanaccount

The issue with the 980 Ti isn't necessarily Pascal coming up next; every year will likely bring a new generation of video cards. The issue is how DX12 will play out. If you have been paying attention, there was (is?) a sizable fiasco over Nvidia cards' trouble with asynchronous compute. TL;DR: the 980 Ti is still the king of the hill, but the difference between it and AMD isn't worth the extra $200-300 price tag. AMD, on the other hand, gets a significant boost.

The 970 also had its own issue with 3.5 GB vs 4 GB of VRAM a while back. TL;DR: the card performs well while it is using 3.5 GB; when it fills up the last 500 MB, shit gets weird.

A top of the line card can usually deliver 2 to 3 generations of good performance, but that is becoming questionable with the issue at hand. Now this doesn't necessarily mean the 980 Ti will be dead in the water after next gen, but there's a loss of confidence in its ability. There are a lot of things to consider.

No consumer grade card can really do 4K well right now, and with VR on the horizon I would say that buying any hardware at this time is a bad idea... that said, I just bought a new video card because my old card is simply being outclassed. I did at one point consider the 980 Ti; if it weren't for the low stock and the DX12 issue showing up at the 11th hour, I would be using it right now.

If you already have the 980 Ti and your older card really isn't up to snuff, then keep it. If not, I would actually consider waiting for next gen hardware altogether.

OurSin_360

    @Mike said:
    @oursin_360 said:

60 Hz seems fine to me; no card, even next gen, is probably going to get over 60 fps at 4K anyway.

    60 hz is "fine" until you sit down in front of a nice 144 hz G-Sync monitor, then you realize what everyone else was raving about. It's that good.

Yeah, I'm sure. I just mean no card can handle it anyway, and next gen will probably not go over 60 at 4K.

That said, do you think a 120-144 Hz 1080p G-Sync monitor is worth it? Or would a 1440p monitor without either be a better upgrade for a single 980? I doubt I could get over 60 fps at 1440p, but I'm not sure if the extra resolution is better than the G-Sync/144 Hz.

pyrodactyl

@stonyman65: my point was that 4K will only ever appeal to crazy people. The horsepower required to drive 4K will always be better spent elsewhere: places where you can see real improvement in the visuals/gameplay without having a huge monitor inches from your face.

Like all this 120-144 Hz nonsense going on in this thread. Sure, if you live on the bleeding edge, by all means buy a 4K/144 Hz monitor and a Titan card or whatever. Just don't ever expect those things to become the norm.

Bollard

@echo13791 said:

with 4K monitors only getting 60 Hz.

    Only?! Are you mad? I'll see you in 3 years when GPUs are able to render 4k at 120fps on medium settings.

Justin258

@pyrodactyl said:

Who gives a shit about 4K? Seriously, 1440 is already more than enough if you're not using a 32 inch 4K monitor 2 feet from your face.

4K is as bullshit as 3D

@stonyman65: my point was that 4K will only ever appeal to crazy people. The horsepower required to drive 4K will always be better spent elsewhere: places where you can see real improvement in the visuals/gameplay without having a huge monitor inches from your face.

Like all this 120-144 Hz nonsense going on in this thread. Sure, if you live on the bleeding edge, by all means buy a 4K/144 Hz monitor and a Titan card or whatever. Just don't ever expect those things to become the norm.

People were saying this exact same thing when 1080p TVs were on the rise.

Also, I just bought a 144 Hz monitor with a 1 ms response time and holy fucking shit that makes PC games run sooooo smooth! It's so great! I don't think I'll ever be able to go back to using a TV as a computer monitor! I mean, it's only a 1080p monitor and it's also only 24", but I wouldn't go higher if it meant I wouldn't be getting the same refresh rate and response time.

Anyway, as technology improves and 8K/240 Hz technologies become the next big thing, you'll find that 4K monitors and TVs become more reasonably priced and more people will buy them. You'll probably find yourself with one at some point in your life because "oh, hey, I need a new TV and here's a good one on sale!"

pyrodactyl

@believer258 said:

1-People were saying this exact same thing when 1080p TVs were on the rise.

2-Also, I just bought a 144 Hz monitor with a 1 ms response time and holy fucking shit that makes PC games run sooooo smooth! It's so great! I don't think I'll ever be able to go back to using a TV as a computer monitor! I mean, it's only a 1080p monitor and it's also only 24", but I wouldn't go higher if it meant I wouldn't be getting the same refresh rate and response time.

3-Anyway, as technology improves and 8K/240 Hz technologies become the next big thing, you'll find that 4K monitors and TVs become more reasonably priced and more people will buy them. You'll probably find yourself with one at some point in your life because "oh, hey, I need a new TV and here's a good one on sale!"

    1-1080 is a marginal improvement over 720p but it's not that expensive to drive comparatively and you can still tell the difference on a normal person's screen. Same for 1440. If I had a 4K TV that was the size of my current HD TV there would literally be no physical way for me to tell the difference. The only way to get the visual improvements is to buy 60 inch TV or a huge ass monitor and sit really close to it.

    2-Game designers still design games at 30fps half the time so they can push the envelope elsewhere. Most people don't even care about 60fps. I'm not disputing that 120fps is smoother. I'm just saying it will remain the realm of hardcore enthusiasts for the foreseeable future.

    3-There are a million problems with 4K. From bandwidth constraints for getting you content (TV or internet) to the fact that you can't really tell the difference unless you buy a huge ass screen. Unless they just phase out HD TVs there is just no reason for the general public to be interested in 4K.

So yeah, you are right that we will probably end up with 4K screens at some point, but that point is just way farther off than you think it is. People will end up with 4K because TV companies will push it down their throats for years, the same way they ended up with 3D TVs. Like 3D, 4K is just an expensive gimmick.

Cameron

The only thing I'll add to this is that Pascal probably will be a big jump, almost certainly more than any recent generational jump. They are finally moving from the 28nm process to the 16nm FinFET process, which is the first process shrink in four or so years. Combined with HBM2 memory, we might finally see new cards that offer a big jump over what's been around since 28nm tech was released. If you're interested in 4K, I'd buy a 970 now and then start saving for a Pascal card. There is no single card solution today that will get you 60+ fps at 4K, so play at 1080p on the 970 until Pascal comes out, then go to 4K from there. If you're happy with 1440p, then stick with the 980 Ti, since it should be good at that resolution for a few years at least.

Barrock

    Trying to decide if the 980 is worth the extra $100 compared to a 970. Upgrading from a 570. But I also have consoles so I could just wait for next year?

NeoZeon

    @barrock: I'm in the same boat, moving up from an older card I mean. I think it really depends on the resolution you plan to run in. Personally I'm using my TV as a monitor so 1080p is as high as I'm going to go. If you're in the same situation, resolution-wise, I doubt you need the 980 and could afford to save the money for something else down the line. Especially if you have consoles: You could wait it out and see how the prices level off later on.

Barrock

    @neozeon said:

    @barrock: I'm in the same boat, moving up from an older card I mean. I think it really depends on the resolution you plan to run in. Personally I'm using my TV as a monitor so 1080p is as high as I'm going to go. If you're in the same situation, resolution-wise, I doubt you need the 980 and could afford to save the money for something else down the line. Especially if you have consoles: You could wait it out and see how the prices level off later on.

    Yeah, I'll be using my Steam Link to play on my TV. An SSD would probably be smarter to get with the 970.

NeoZeon

    @barrock: Sounds like a plan. Seems like SSD prices keep dropping too so you can probably get more storage than you expect. Might even be able to stumble on a Black Friday deal or two as well.

korwin

    @pyrodactyl said:

    1-1080 is a marginal improvement over 720p but it's not that expensive to drive comparatively and you can still tell the difference on a normal person's screen. Same for 1440. If I had a 4K TV that was the size of my current HD TV there would literally be no physical way for me to tell the difference. The only way to get the visual improvements is to buy 60 inch TV or a huge ass monitor and sit really close to it.

1080p is literally over double the resolution of 720p, and 1440p is 4 times 720p. In what world are 125% and 300% increases marginal?

Anyway, as it stands we are now getting close to the tipping point where someone would be better off waiting for the next product rather than buying in now. Pascal will be the first die shrink we've had since 2012, which should make for significant gains in the raw horsepower department, not to mention the ludicrous bandwidth that HBM2 offers (hell, even HBM1 on AMD's Fury X offers a massive leap over GDDR5).
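As a quick back-of-the-envelope check of those ratios, here is a minimal Python sketch (assuming the standard 16:9 resolutions):

```python
# Pixel counts for common 16:9 resolutions and their increase over 720p.
resolutions = {"720p": (1280, 720), "1080p": (1920, 1080), "1440p": (2560, 1440)}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

base = pixels["720p"]
for name, count in pixels.items():
    increase = (count - base) / base * 100
    print(f"{name}: {count:,} pixels ({increase:+.0f}% vs 720p)")
# 1080p -> 2,073,600 (+125%), 1440p -> 3,686,400 (+300%)
```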

pyrodactyl

    @korwin said:

1080p is literally over double the resolution of 720p, and 1440p is 4 times 720p. In what world are 125% and 300% increases marginal?

Anyway, as it stands we are now getting close to the tipping point where someone would be better off waiting for the next product rather than buying in now. Pascal will be the first die shrink we've had since 2012, which should make for significant gains in the raw horsepower department, not to mention the ludicrous bandwidth that HBM2 offers (hell, even HBM1 on AMD's Fury X offers a massive leap over GDDR5).

    What are you even talking about? Congratulations on strawmaning my entire argument and going off the deep end talking about high end PC specs that aren't even out yet. The kinda thing that has no relation to a wide adoption of 4K as a technology.

    Did you even read the full quote you posted? ''If I had a 4K TV that was the size of my current HD TV there would literally be no physical way for me to tell the difference.''

    That's the real problem with 4K.

korwin

@pyrodactyl said:

What are you even talking about? Congratulations on strawmaning my entire argument and going off the deep end talking about high end PC specs that aren't even out yet. The kinda thing that has no relation to a wide adoption of 4K as a technology.

I simply disagree with calling such large gains in pixel count marginal. If someone put out a car tomorrow that was 125% faster than a Veyron, I doubt people would call that a marginal increase in top speed. The second half of my post has nothing to do with yours.

pyrodactyl

    @korwin: For TVs it was definitely marginal.

Justin258

@pyrodactyl said:

@korwin: For TVs it was definitely marginal.

    Math says it's not a marginal increase and you can't argue with math! Well, you can, but 1280 x 720 (that'd be 720p) = 921,600 pixels on screen and 1920 x 1080 (that'd be 1080p) = 2,073,600 pixels on screen. I mean, if you want to start throwing around the word "definitely", there's a "definitely" that's actually definite and not just an argument on whether or not 1080p is that much clearer than 720p. On that note, just because 1080p looks like a "marginal improvement" to you doesn't mean that everyone else only sees a "marginal improvement".

korwin

@pyrodactyl said:

@korwin: For TVs it was definitely marginal.

@believer258 said:

Math says it's not a marginal increase and you can't argue with math! Well, you can, but 1280 x 720 (that'd be 720p) = 921,600 pixels on screen and 1920 x 1080 (that'd be 1080p) = 2,073,600 pixels on screen. I mean, if you want to start throwing around the word "definitely", there's a "definitely" that's actually definite and not just an argument on whether or not 1080p is that much clearer than 720p. On that note, just because 1080p looks like a "marginal improvement" to you doesn't mean that everyone else only sees a "marginal improvement".

    I think the word we're searching for is perceptible.

deactivated-64162a4f80e83

@pyrodactyl: was it? I can certainly see the difference on my TV. Granted, it's 50 inch, but when streaming at 720 it looks noticeably less sharp than 1080, and I can tell you that 4K looks pretty great compared to 1080.

Shivoa

    @barrock said:

    Trying to decide if the 980 is worth the extra $100 compared to a 970. Upgrading from a 570. But I also have consoles so I could just wait for next year?

I think it's pretty hard to justify the extra price (it was almost impossible to when the gap was $200 at launch - where I am, the price gap is still way higher than $100). The 970 is a good card, with reasonable overclocks if you want them (often factory applied), that can get you to a decent place at 1080p-1440p. The 980 Ti costs a fortune but can just about get you to 4K in any game (maybe with a few options moved from Ultra to High to keep it smooth). The 980 definitely can't reliably get you to 4K and is nothing much more than a slightly shinier 970 (ok, the memory is a bit less fraught, but really we're talking about 10 percent faster results in any game you throw at the two of them). The 980 isn't a bad card, but I have to think keeping $100 in your pocket means you'll have saved up enough for your next upgrade a lot sooner (without spending a penny more in total).

As to waiting 3-9 months for the new generation of cards, I think Pascal cards are going to be pretty exciting. I'm sitting on a GTX 760 that really should get replaced at some point (going for an x60 rather than my normal choice of an x70 model was a mistake), and I hope Pascal will show off what shrunk chips can offer: moving beyond Maxwell 2 efficiency while offering much bigger chips (in terms of transistors, and so processing units) for the same price to jump performance forward. But I'm also not sure exactly when they're going to land, and even nVidia sound like they're twiddling their thumbs until the RAM yields are ready, at least for the higher end products that will move to HBM2 and so leapfrog AMD's tech in their Fury cards. I'd hope the replacement for the 970/980 tier uses HBM2 to avoid memory bottlenecks and that it won't just be reserved for the $600+ 980 Ti tier models.

pyrodactyl

@yesiamaduck said:

@pyrodactyl: was it? I can certainly see the difference on my TV. Granted, it's 50 inch, but when streaming at 720 it looks noticeably less sharp than 1080, and I can tell you that 4K looks pretty great compared to 1080.

Have you heard of the placebo effect? If I placed you at a normal distance from a normal person's TV screen (like 37 inch) with 10 screens in front of you, 5 of them 1080 and 5 of them 4K, you would be hard pressed to tell the difference.

That's why all 4K screens are gigantic.

pyrodactyl

@yesiamaduck: @korwin: @believer258: I sit about 68 inches from my 37 inch screen. The angular resolution of the human eye is 0.02 degrees (https://en.wikipedia.org/wiki/Naked_eye).

So I can distinguish between 2 points that are 0.023 inches apart at a distance of 68 inches ( 68*tan(0.02) ).

Pixels on my TV are 0.01666 inches apart because it's 32 inches horizontal for 1920 pixels (32/1920).

So yeah, impossible to tell the difference between 1080 and any higher res screen of that size at that distance. You can start to tell with a 50 inch screen at a distance of 75 inches or closer, but the improvement will be extremely minor, and 75 inches is really close to sit from your 50 inch screen.

MATH!
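That arithmetic can be reproduced in a few lines; a minimal Python sketch using the same assumed figures (0.02 degree acuity, a 68 inch viewing distance, a 32-inch-wide 1080p panel):

```python
import math

def resolvable_spacing(distance_in, acuity_deg=0.02):
    """Smallest separation (inches) the eye can resolve at this distance,
    given an angular resolution of ~0.02 degrees."""
    return distance_in * math.tan(math.radians(acuity_deg))

def pixel_pitch(panel_width_in, horizontal_pixels):
    """Spacing (inches) between adjacent pixels across the panel width."""
    return panel_width_in / horizontal_pixels

eye_limit = resolvable_spacing(68)      # ~0.024 in at 68 inches
pitch_1080p = pixel_pitch(32, 1920)     # ~0.0167 in on a 32-inch-wide panel

print(f"eye limit: {eye_limit:.3f} in, 1080p pitch: {pitch_1080p:.4f} in")
# The pitch is already well below the resolvable limit, which is the post's point:
# under this model, finer pixels at this size and distance would not be distinguishable.
```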

Shivoa

    @pyrodactyl: You might want to consider wearing glasses.

That way you'd get the benefits of being a cyborg and not living with unaugmented vision. A healthy young person without major optical issues should be able to get to 20:12 vision (if you're lucky, 20:10 is very much attainable), and so be able to see things from 20 ft away with the clarity that 20:20 vision only resolves from 12 ft away. So halve the angular resolution of any calculation you're making for people who care about getting the most out of their optical system (the future is cool, and so many people live with augmented vision).

    And then read this and realise that all of that doesn't matter because your optical system actually cares about hyperacuity. The interesting follow up article to that makes a good case for requiring about 4x linear resolution vs visual acuity for the screen (the other shortfall in getting to hyperacuity can be made up with super-sampling the source image so great anti-aliasing gets us there, probably). It's all interesting research that's going to get really nailed down as we get better VR tech and more research about screens that can get impressive pixel densities to us.

    So we're talking about someone with good corrective glasses to maximise their optical system (20:10, half an arcminute, 0.00833(recurring) degrees to be precise) and a screen needs to fit four pixels into that, so about 0.0021 degrees. 68*tan(0.0021 degrees) ~= 0.0025 inches.

Looks like your TV doesn't have nearly enough dots if you're sitting 68 inches away, at least according to reasonable science and maths. Maybe not (this is conjecture), but it's not an unreasonable requirement. It obviously won't be apparent for every possible image what the limits of your vision even are. Looking at a pure white screen, the pixel count is functionally 1, and we have brains that are good at adding detail that doesn't exist, as many optical illusions show (e.g. things you're not looking directly at can have their saturation changed without you noticing, because your brain makes up the colour of things your cones are not actually pointing at towards the edge of your vision). But as the first link makes clear, you can see an aliased line from far further away than you can resolve a single pixel of black in a sea of white.
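The same calculation with these stricter assumptions (20:10 vision, i.e. half an arcminute, plus the roughly 4x hyperacuity factor argued above) gives the much smaller pitch quoted; a short sketch:

```python
import math

def required_pitch(distance_in, acuity_deg=0.5 / 60, hyperacuity_factor=4):
    """Pixel pitch (inches) needed so ~4 pixels fit within one resolvable angle."""
    return distance_in * math.tan(math.radians(acuity_deg / hyperacuity_factor))

print(f"required pitch at 68 in: {required_pitch(68):.4f} in")  # ~0.0025 in
print(f"32-inch-wide 1080p panel pitch: {32 / 1920:.4f} in")     # ~0.0167 in, far coarser
```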

pyrodactyl

@shivoa: Look, people who care about aliasing are not going to be the ones deciding when there is widespread adoption of 4K tech. I would love a serious study with a random group sample showcasing the real distance from which people actually see the difference between 1080 and 4K.

Not side by side either; that's too easy and doesn't actually measure any improvement in the quality of the experience. What we need to compare is: when are people able to tell their experience with 4K is better than their experience with 1080?

I think we can both agree this experiment would be pretty clear cut with SD vs HD. On modern size screens, SD looks like blurry garbage while HD looks way crisper and higher quality.

What would happen with 4K though? Is the jump noticeable enough to make a difference? I'm pretty sure it's not outside of giant displays. What you and a bunch of people in this thread are doing in arguing for 4K is buying into the giant indiscriminate tech marketing machine that tells you you need the new stuff because it's better than the old stuff. While it might be technically true (4K is twice the resolution, so it's mathematically better), there's a huge "diminishing returns" factor at play here.

Shivoa

@pyrodactyl: for someone who was so quick to say "impossible" and "MATH!", you seem mighty eager to ignore science and maths and just look for people rating their experiential feelings, as soon as it turns out the numbers you leant on are wrong.

People can (scientifically speaking) see 4K in lots of settings; even phones aren't maxed out by 1080p (but clearly Apple's "retina" PR lies got to your understanding of what pixel counts should matter, despite their 20:20 assumption and lack of concern with acuity). If people want to invest in that, then telling them it's impossible for them to see a difference, with bunk science, seems highly dubious. If someone wants 4K, how does that hurt you?

betterley

    @echo13791:
I wouldn't be disappointed with your 980 Ti at all; that's a beastly card. I can run Fallout 4 at 4K with one 980 Ti... pretty impressive.
Also, the advantage of a G-Sync monitor isn't a super high refresh rate. It's the fact that the refresh rate is directly linked to your video card's FPS. With the refresh rate tied to the FPS, it eliminates screen tearing and makes lower frame rates more playable and responsive, which makes 4K more achievable. Since I've spoiled myself and gotten a 4K G-Sync monitor, I can honestly say I can't go back to 1080p or a monitor without G-Sync; there's a huge difference.
I should also note that I have an SLI setup. I gave Fallout as an example with a single card because there isn't an official SLI update for it yet.
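As a toy illustration of that G-Sync point (only a sketch; the frame times and the panel's refresh range are made-up assumptions, not measurements):

```python
import random

random.seed(1)
frame_times_ms = [random.uniform(18, 26) for _ in range(200)]  # a ~40-55 fps renderer

# Fixed 60 Hz + vsync: any frame that misses the ~16.7 ms refresh deadline waits
# for the next refresh, so the previous frame is displayed again (visible judder).
refresh_ms = 1000 / 60
repeated = sum(1 for t in frame_times_ms if t > refresh_ms)

# Adaptive sync (G-Sync/FreeSync style): the panel refreshes when the frame is
# ready, so each frame is shown once, provided it falls inside the panel's range.
vrr_min_ms, vrr_max_ms = 1000 / 144, 1000 / 30
out_of_range = sum(1 for t in frame_times_ms if not (vrr_min_ms <= t <= vrr_max_ms))

print(f"fixed 60 Hz + vsync: {repeated}/{len(frame_times_ms)} frames arrive late (repeats)")
print(f"adaptive sync:       {out_of_range}/{len(frame_times_ms)} frames outside the panel's range")
```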

pyrodactyl

@shivoa: I might've been mistaken on the whole "impossible" thing, but again, no one has shown whether 4K is actually a meaningful improvement.

I don't know why I bother anymore. It's not like anyone who's bought into the idea of bleeding edge tech is ever going to change their mind.

deactivated-64162a4f80e83

@pyrodactyl: look, it's not placebo. League of Legends is a game I can run at 4K, and it looks a good degree sharper than it does at 1080p. Partly upscaling is at fault here, I am aware, but it looks a magnitude sharper than it does on my 1080 display as well. The difference is night and day in this example.

FacelessVixen

    I have a 750 Ti.

    When you buy a Pascal card, I'll give you $300 and ritualistically kill two chickens and a goat for your 980 Ti.

monkeyking1969

I would just hold with what you have. You have a nice card now, and I do wonder if Pascal will actually arrive on time. That switch you are waiting for could get delayed. I think Intel just delayed Cannonlake 10nm CPUs for a year, so things like that do happen.

The hardest thing about the speed at which PCs evolve is that something really good is always just a year away. I'm going to jump into everything new in 2017; I figure by that point PCI-E 4.0 will be out, with video cards that can use the lanes.

Shivoa

@monkeyking1969: I wouldn't be too surprised to not get anything PCI-E 4.0 with a 2017 build. There will be a finalised spec by then, but possibly no hardware (that will be arriving within the next few years, but it wouldn't be unreasonable if mainstream consumer adoption didn't arrive until 2019).

Luckily, that probably won't really matter for gaming. The CPU-GPU bus is rarely the bottleneck in rendering tasks, to the extent that running an older PCI-E bus will only really cut 2-3% off your framerates. It takes two GPUs in SLI on a PCI-E 1.0 16x bus to actually bring out a visible difference (20%+). It's not just that we aren't really getting to the limits of PCI-E 3.0; it's that we don't really see a high need for anyone to be using anything faster than PCI-E 2.0 in 2015. As soon as your GPU has exhausted the dedicated RAM it has on the card and starts using the PCI-E bus to access system RAM to juggle assets, you're already seeing the framerate sink, so a faster bus really only makes a bad situation slightly less bad. The bus is a long pipe compared to GPU RAM - comparatively very laggy between requesting data and it arriving, and that doesn't automatically get any better with a fatter pipe: more water arrives down the hose, but it still takes exactly as long for the first drop to arrive from when you turn the tap on.
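To put rough numbers on that (a sketch using theoretical x16 bandwidth figures; real-world throughput is lower, and the 2 GB asset size is purely an assumption for illustration):

```python
pcie_x16_gb_per_s = {"PCI-E 1.0": 4.0, "PCI-E 2.0": 8.0, "PCI-E 3.0": 15.75, "PCI-E 4.0": 31.5}
on_card_gddr5_gb_per_s = 336.0   # e.g. a 980 Ti's on-card memory bandwidth
asset_gb = 2.0                   # hypothetical set of assets spilling out of VRAM

for gen, bw in pcie_x16_gb_per_s.items():
    print(f"{gen} x16: {asset_gb / bw * 1000:6.1f} ms to move {asset_gb} GB")
print(f"on-card GDDR5: {asset_gb / on_card_gddr5_gb_per_s * 1000:6.1f} ms for the same data")
# Once assets are shuffling over the bus at all, performance already suffers; a wider
# bus shrinks the transfer time but does nothing about the request latency.
```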

    I think there'll be exciting cards to buy in 2017, but it won't be because they're PCI-E 4.0.

Pascal (GP1xx) is getting ready for volume manufacture. It's the HBM2 RAM that some models need that will likely cause any delays to an early 2016 launch - who knows how long nVidia will sit waiting for RAM yields to bring the price down, considering their current market dominance, despite AMD already launching HBM1 based products (Fury). The GP2xx chips will presumably be coming online in 2017 (for people waiting until then).

deactivated-63b0572095437

    There's always something better around the corner. Enjoy what you have. Don't worry about what you "should have" bought.

Huntin 4 Games

I tend to do the same thing you're doing every time I make a GPU upgrade, haha. Having gone from a 970, to 970 SLI, to the GTX 980 Ti, I can tell you it really is an amazing card. Pascal does sound amazing, but I'm sure that 980 Ti will last for a long time to come. Ultimately, new technology is always around the corner, and whatever you buy, whenever you buy it, is most likely going to be out of date in 6 months to a year. So if you're ready for an upgrade, go for it. It's the first highly priced GPU that I've felt offered good value.
