#1 Posted by MarkM (288 posts) -

http://blogs.nvidia.com/blog/2013/10/18/g-sync/

This is it, folks. This coming generation is going to be the giant leap none of us expected.

In related news, sell all your AMD stock.

#2 Posted by alanm26v5 (444 posts) -

I don't see why this is a big deal, but I'm not the target market for this anyway, since I always go for mid-tier cards like a 660 Ti and never SLI. I couldn't run higher than 60 FPS in the games that mattered, and if I could, I'd rather keep the temps and fan noise down anyway. Maybe I'd have to see it in person to understand.

#3 Posted by mikey87144 (1719 posts) -

Wow, this is amazing stuff. I guess I'll hold off on buying a new monitor until this comes out.

#4 Posted by ZeForgotten (10397 posts) -

Someone hasn't been following what AMD's doing.
Not that I'm all that surprised, given how you worded the OP, but still.

#5 Edited by MarkM (288 posts) -

AMD, lol

#6 Posted by MarkM (288 posts) -

They ruined ATI.

#7 Posted by Ostratego (35 posts) -

Wait, isn't screen tearing just the fault of the way Windows handles frames of video? Nvidia's just fixing Microsoft's problems for them.

#8 Posted by MarkM (288 posts) -

No, it's because your GPU takes longer to finish rendering a frame than the monitor takes to display one, and every single frame takes a slightly different amount of time to render. G-SYNC basically makes the refresh rate variable and lets the GPU tell the monitor what its refresh rate should be for each frame. Or something?
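
Here's a rough toy sketch of what I mean, as I understand it (totally my own illustration with made-up numbers and function names, not how the actual hardware or driver works):

```python
import random

FIXED_REFRESH_MS = 1000 / 60  # a normal 60 Hz monitor scans out every ~16.7 ms

def frame_times(n):
    """Pretend GPU: every frame takes a slightly different amount of time to render."""
    return [random.uniform(12.0, 25.0) for _ in range(n)]

def fixed_refresh_tears(times):
    """Fixed-rate monitor with v-sync off: the buffer swap almost never lines up
    with a refresh boundary, so a scan shows part old frame, part new frame."""
    tears = 0
    clock = 0.0
    for t in times:
        clock += t
        # Crude check: did the frame finish away from a refresh boundary?
        offset = clock % FIXED_REFRESH_MS
        if 0.5 < offset < FIXED_REFRESH_MS - 0.5:
            tears += 1
    return tears

def variable_refresh_rates(times):
    """G-Sync-style idea: the monitor only starts a scan when the GPU says the
    frame is done, so the 'refresh rate' just follows the frame time and no
    frame is ever swapped mid-scan."""
    return [1000 / t for t in times]

if __name__ == "__main__":
    times = frame_times(100)
    print("fixed 60 Hz, v-sync off:", fixed_refresh_tears(times), "torn frames out of 100")
    rates = variable_refresh_rates(times)
    print(f"GPU-driven refresh     : 0 torn frames, refresh varies {min(rates):.0f}-{max(rates):.0f} Hz")
```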

#9 Edited by Damodar (1331 posts) -

As much as I hate tearing, I'm not going to go out and buy a new monitor just to try and get rid of it. Oh well, might benefit me down the road, I guess.

#10 Edited by Beforet (2916 posts) -

If I'm ever in the market for a new monitor, I'll keep this in mind. Honestly, I'm more interested in the whole "no stuttering" promise; 24 frames per second is a fine number, and that shouldn't cause my game to jump around like some sort of gamer rabbit demon. Thing.

#11 Posted by Barrock (3525 posts) -

I do need a new monitor... this old Gateway is pretty bad when it comes to screen tearing. And it's only 1680!

#12 Edited by EXTomar (4641 posts) -

Tearing happens when there is a mismatch: the graphics card's "display buffer" is not synced to when the display is ready to display it. If you have a graphics card rendering scenes faster than 200 Hz while your monitor is refreshing at 60, you probably won't notice any screen tearing. However, if the scene changes and your graphics card is only rendering at 15 Hz, the chances are high there is going to be screen tearing, because the monitor displays whatever the graphics card has in its buffer, with part of it from the current render and part of it from the last one.

What NVidia has been doing with their Adaptive VSync has helped, but the real solution is to make the monitor refresh in sync with the graphics card, which is what G-Sync appears to be trying to do. Telling the monitor to speed up or slow down should be a more elegant solution than making the graphics card "throttle down" when it can't match the refresh rate. The "pie in the sky" solution is to allow all of them: slave the monitor to the graphics card (G-Sync), slave the graphics card to the monitor (Adaptive/VSync), or unrestricted (free run, i.e. what most people do now).
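
A little pseudo-code version of those three options, just to make the idea concrete (this is only a sketch of the concepts with invented names, nothing like real driver code):

```python
from enum import Enum, auto

REFRESH_MS = 1000 / 60  # a fixed 60 Hz monitor ticks every ~16.7 ms

class SyncMode(Enum):
    FREE_RUN = auto()     # neither side waits: lowest latency, but tears
    VSYNC = auto()        # graphics card slaved to the monitor: hold the frame for the next tick
    GSYNC_STYLE = auto()  # monitor slaved to the graphics card: scan starts when the frame is done

def time_shown(frame_ready_ms, mode):
    """Toy model of when a finished frame actually reaches the screen."""
    if mode is SyncMode.FREE_RUN:
        return frame_ready_ms      # swapped immediately, possibly mid-scan (tear)
    if mode is SyncMode.VSYNC:
        ticks = int(frame_ready_ms // REFRESH_MS) + 1
        return ticks * REFRESH_MS  # the "throttle down": wait for the next fixed tick
    if mode is SyncMode.GSYNC_STYLE:
        return frame_ready_ms      # monitor refreshes on demand, still a clean whole-frame scan

for mode in SyncMode:
    print(f"{mode.name:12s}: frame ready at 20.0 ms -> shown at {time_shown(20.0, mode):.1f} ms")
```

Point being, in the v-sync case a frame that was ready at 20 ms sits around until the 33 ms tick, which is where the added lag and stutter come from; the other two show it immediately, but only the G-Sync-style one does so without a torn scan.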

#13 Edited by Andorski (5249 posts) -

So for this to work, monitor manufacturers have to install hardware designed by nVidia? If that's the case, my guess is that OEMs who market towards gamers (BenQ, Asus) will use this device in their gaming lines, which means I would never buy it. Gaming LED monitors are great for those looking to shave off any perceivable lag, but their picture quality is always terrible. I'll take a 60Hz IPS or VA panel with relatively low lag and ghosting over 120/144Hz fast-as-shit monitors whose color reproduction is complete ass.

And to get some GPU flamebait going, how does it feel to have the $1000 Titan be beaten by AMD's ~$700 290x? =P

#14 Edited by MikeJFlick (441 posts) -

Awesome. I have a decent setup of three 22" monitors that I picked up for (relatively) cheap, and I'd have no problem passing them down to friends or family to get some of these bad boys!

#15 Edited by EXTomar (4641 posts) -

Yeah, just like the whole 3D thing, the big downside is that you will need to buy a new, more costly monitor to use it. So while it is nice to have this technology, I'm not entirely sure it justifies replacing your old monitor, let alone getting the standard into HDTVs in general, which will be important for adoption outside of the PC gamer niche.

#16 Posted by subyman (595 posts) -

No more tearing YAY!

Have to buy a new monitor BOO!

#17 Posted by MarkM (288 posts) -

How can anything be better than the Titan?

#18 Posted by MonetaryDread (2007 posts) -

From what I have read on PCPer.com, it is just a chip that needs to be installed in a monitor. The G-Sync module is a mod that can be soldered into a monitor, like if you were to install a mod chip into an old PSX / PS2. It's not the most elegant solution, but you can convert any monitor in your house to use the new technology, as long as it supports DisplayPort. They are apparently working on a module that does not involve soldering, but the most common version of the tech is going to come pre-built into monitors starting in 2014.

The thing that I like about this is that it is not just about screen-tearing. I read on Anandtech that it completely eliminates ghosting, screen-tearing, stuttering, and any noticeable display-lag.

#19 Posted by Andorski (5249 posts) -

@markm said:

How can anything be better than the Titan?

It's over half a year old - something is going to beat it eventually. It's a borderline scam, though, that nVidia and its OEM partners keep it at the $1000 MSRP. The cost to performance ratio is disgusting.

#20 Edited by MarkM (288 posts) -

Oh, in terms of value it's horrible. But there must always be a crazy video card. Always.

One card to measure others against and to stand in awe of.

Also, God damn, that Titan is a beast.

#21 Posted by MideonNViscera (2257 posts) -

This thread is like listening to Reed Richards talk to himself.

#22 Posted by MarkM (288 posts) -

But also, it's not like having one company make all the 3D cards is a good thing. I don't want Nvidia to have a monopoly on this, but they have zero competition right now.

#23 Edited by MonetaryDread (2007 posts) -

@andorski said:

And to get some GPU flamebait going, how does it feel to have the $1000 Titan be beaten by AMD's ~$700 290x? =P

It's over half a year old - something is going to beat it eventually. It's a borderline scam, though, that nVidia and its OEM partners keep it at the $1000 MSRP. The cost to performance ratio is disgusting.

I agree that the card is long in the tooth already, but there is no evidence that it is being beaten by the 290X. There are two benchmark scores, released by AMD, that show the 290X is 6 fps faster than the 780 for a similar price. There are no benchmarks that put the 290X against the Titan.

Secondly, I agree that the Titan costs too much money for the average person, but you have to realize a few things about the Titan:

A - It isn't really a gaming card; the Titan is just a re-branded Tesla card with a focus on CUDA / GPU compute performance instead of gaming performance. $1000 for a workstation card is an incredible value, because those GPUs tend to start at $3000.

B - Though it is slightly slower than a 690 (the same price as a Titan), you are getting similar performance out of a single GPU. That is huge. Anyone who knows anything about PC gaming knows that multi-GPU configurations are best avoided whenever possible. If I had the money, I would rather buy a Titan than run dual 680s or a 690.

C - Because the 690 is a dual-GPU card, you are limited to two 690s in a system. Since the Titan is a single-GPU card, you can install four of them in a system for crazy performance. I would never do something that silly, but then again, if you have the money to spare...

Though I will admit again, the Titan is mostly worthless to gamers in a world where the 780 and 290X exist.

#24 Edited by MarkM (288 posts) -

The whole multi-GPU thing was a fad.

#25 Posted by Vigorousjammer (2491 posts) -

Seems cool. Sounds like something I might jump on board with a while down the road.
However, echoing some of the sentiments of @andorski, I don't just use my compy for gaming. I'd need a monitor that would also be good for art & junk, so I'll hold off and see how things pan out.

#26 Edited by Andorski (5249 posts) -

Was watching the latest episode of The WAN Show and they said that Gsync is currently only for monitors with TN panels. Huge pass for me if they are correct. Hope they find a way to make that tech work for IPS, PLS, and VA panels sometime down the line.

#27 Posted by Colourful_Hippie (4337 posts) -

Finally being able to say fuck off to v sync? I'm extremely tempted.

#28 Posted by Colourful_Hippie (4337 posts) -

@andorski said:

Was watching the latest episode of The WAN Show and they said that Gsync is currently only for monitors with TN panels. Huge pass for me if they are correct. Hope they find a way to make that tech work for IPS, PLS, and VA panels sometime down the line.

Boooooo, if true

#29 Posted by Levio (1784 posts) -

Finally I can get away with a 30 Hz monitor, because I can install one of these to reduce my card's output without losing frames to tearing. Plus, if I get one monitor with it, I can buy my 2nd and 3rd monitors without it and save some cash. Hopefully the device comes with plenty of RAM so I don't have to install more.

#30 Posted by tourgen (4461 posts) -

@andorski said:

Was watching the latest episode of The WAN Show and they said that Gsync is currently only for monitors with TN panels. Huge pass for me if they are correct. Hope they find a way to make that tech work for IPS, PLS, and VA panels sometime down the line.

Wow, that's unfortunate.

#31 Posted by MarkM (288 posts) -

I might buy one of the monitors it's compatible with and try to install it for great success.

#32 Edited by BeachThunder (11809 posts) -

I'm surprised this didn't happen earlier.

#33 Posted by LiquidPrince (15902 posts) -

@markm said:

How can anything be better than the Titan?

AMD 290X?

#34 Edited by SniperXan (223 posts) -

Considering AMD is in... all the next-gen consoles (I think it's in the Wii U as well, if we wanna count that... lol), as well as the Wii and 360 from this gen, if I had stock or cared, I wouldn't be selling. High-end graphics cards are niche. They are certainly not the big money makers for either company, but an AMD GPU in every console? Yup... sounds like AMD will be doing JUUUST fine.

#35 Posted by notdavid (836 posts) -

Get back to me when Nvidia figures out what type of sync the people REALLY want.
