Graphical Glitches. Red, Yellow Squares. Weirdness

#1 Edited by AlisterCat (5470 posts) -

After problems with my first GTX 660 Ti I sent it back and got a new one from EVGA. It hasn't been causing the same problems as the last one, which is great. Unfortunately, there are new problems. Well, maybe.

First I noticed a flickering red outline effect over everything in Don't Starve. It's not the game; it appears to be a hardware problem. In Orcs Must Die 2 I get yellow squares flickering randomly around the screen. Now, in Far Cry 3, I'm getting red squares flickering around the screen.

Is this a GPU problem, and should I send it back again?

#2 Posted by Garfield518 (403 posts) -

Your graphics card is overheating.

#3 Posted by AlisterCat (5470 posts) -

@Garfield518: OK. Well, it isn't doing anything it doesn't normally do. Why would it be overheating during Orcs Must Die 2? That game doesn't require much.

#4 Posted by ajamafalous (11813 posts) -

Yeah, sounds like it's overheating.
 
@AlisterCat said:

@Garfield518: Why would it be overheating during Orcs Must Die 2? That game doesn't require much.

Doesn't really matter how graphically-intensive the game is. Download a GPU temp monitor (MSI Afterburner, RealTemp, GPU-Z, etc.) and let us know what temperature your graphics card is at when it starts artifacting.
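If you'd rather script the monitoring, recent NVIDIA drivers also ship with the `nvidia-smi` command-line tool, which can be polled from a few lines of Python. This is a rough sketch, not a substitute for Afterburner's graphs: the temperature buckets are my own guesses based on the numbers in this thread, and the query flags are worth double-checking against `nvidia-smi --help-query-gpu` on your system.

```python
import subprocess

TOO_HOT_C = 85  # rough red-flag point for GeForce cards of this era, per the thread

def classify(temp_c):
    """Rough bucket for a GPU core temperature, in Celsius."""
    if temp_c < 50:
        return "idle/cool"
    if temp_c < TOO_HOT_C:
        return "normal load"
    return "too hot"

def read_temp():
    """Read the current GPU core temp via nvidia-smi (ships with the driver)."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=temperature.gpu",
         "--format=csv,noheader,nounits"]
    )
    return int(out.decode().strip().splitlines()[0])

# Example usage while the game is running in a window:
#   print(classify(read_temp()))
```

Logging a reading every couple of seconds while the artifacting happens would settle the overheating question quickly.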

#5 Edited by AlisterCat (5470 posts) -

@ajamafalous: My logic is that graphically intensive games require more power, and generate more heat as well.

Idle: 33 degrees

Orcs Must Die 2: Averaged 70 degrees during fighting. No graphical glitches in the first match as it climbed from 30 to 60. Started the second match at 60 degrees and got lots of black and yellow squares.

Far Cry 3: 75 degrees during intro scene, couldn't get any artifacting to happen this time.

Fan speed was below 50% at all times.

#6 Posted by FritzDude (2251 posts) -

33 Celsius idle and 60 to 75 Celsius under load is perfectly normal for a GPU, so it's not overheating, unless those numbers are lying. Raise your eyebrow when you see it at over 85 Celsius. As for the logic about graphically intensive games: if a game runs with unlimited frames, usually by turning off Vsync or playing in windowed mode, it has the same effect, since the GPU will render as many frames as it can, even if the monitor's refresh rate can't display them all. Obviously it also depends on how intense those frames are, but a game rendered at 120 frames per second will draw more power, and generate more heat, than the same game rendered at 60. But I digress. I'm not quite sure what the problem here is, but make sure you've done a clean install of updated drivers and that everything is plugged in correctly.

#7 Posted by AlisterCat (5470 posts) -

@FritzDude: I did a clean install of the most recent drivers before recording those temperatures. I almost always enable Vsync as well. You can force it in the control panel now, which is useful.

Only a few games seem to suffer from this. I'm not sure why.

#8 Posted by AlisterCat (5470 posts) -

@FritzDude: @ajamafalous: Bit of an update. Had another crack at Far Cry 3. With no Vsync, and with Vsync set to 1 frame, I had the artifacting. With Vsync set to 2 frames I didn't, but I had a much lower framerate. Then I switched back to no Vsync and watched a bit more of the intro; the image froze and dropped me to the desktop with the exe still running. It did this before with Orcs Must Die 2 today, except then it reported a driver crash. This is on a fresh driver install, but this time I didn't get a reported crash. I've also had some kind of driver crash while playing Flash videos, and since those are GPU accelerated I'm going to assume it's my GPU messing that up too.

#9 Posted by august (3825 posts) -

Try reseating it?

#10 Posted by ajamafalous (11813 posts) -

@AlisterCat: Well, as @FritzDude: explained on both points:
 
1) If those are your temps, your card isn't overheating. nVidia cards run pretty hot, so you're okay until you get around 85-90C.
2) As a basic explanation, Fritz hit it pretty well; if you're rendering more frames, even if they aren't really graphically-intensive, it's still going to generate more heat, because your card is rendering those frames, even though they aren't being displayed. This is why a number of games have a "max frames rendered" option. This also, if I'm remembering correctly, is what contributed to the situation that was causing StarCraft II to burn up everyone's cards. Because the game has relatively low requirements, it was rendering way more than 60 frames per second, and a bad driver released by nVidia messed up the automatic fan profile, so the fans weren't increasing to help cool the card.
 
Pretty big tangent I went on there. Unfortunately, I'm not really sure what's happening with your card. Colored artifacting is usually a sign of overheating, but if your temps are 33 idle and 60-75 load then your card is actually cooling really well. I'm at a loss for what's going on.
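For what it's worth, the "max frames rendered" option mentioned above is conceptually just a frame limiter. A toy sketch (my own names, not any game's actual code): if a frame finishes under its time budget, the loop sleeps off the remainder instead of immediately starting the next frame, so the GPU spends part of each frame idle and runs cooler.

```python
import time

def limited_loop(render_frame, frames, target_fps=60):
    """Render `frames` frames, sleeping off any unused per-frame budget."""
    budget = 1.0 / target_fps
    rendered = 0
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()            # stand-in for the game's actual draw work
        rendered += 1
        leftover = budget - (time.perf_counter() - start)
        if leftover > 0:
            time.sleep(leftover)  # GPU idles here instead of drawing extra frames
    return rendered
```

With the cap off (no sleep), a light game like Orcs Must Die 2 just renders frames back to back as fast as the card allows, which is why a "low requirements" game can still run the GPU flat out.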

#11 Posted by Jams (2959 posts) -

@AlisterCat said:

After problems with my first GTX 660 Ti I sent it back and got a new one from EVGA. It hasn't been causing the same problems as the last one, which is great. Unfortunately, there are new problems. Well, maybe.

First I noticed a flickering red outline effect over everything in Don't Starve. It's not the game; it appears to be a hardware problem. In Orcs Must Die 2 I get yellow squares flickering randomly around the screen. Now, in Far Cry 3, I'm getting red squares flickering around the screen.

Is this a GPU problem, and should I send it back again?

I had artifacting like that a long time ago on a graphics card. I'm pretty sure it's done for.

#12 Posted by Zomgfruitbunnies (727 posts) -

Does the problem present itself in a variety of other games as well? I'm asking because I have a similar problem with OMD2 where the game starts to perform poorly after the first match (dips in framerate, stuttering, etc.). Restarting the game fixes this, but as soon as you finish a map the problem occurs again. I started having this problem after the game patch released a couple of weeks ago, and I also use an nVidia card.

#13 Posted by warxsnake (2634 posts) -

Is it a factory overclocked version? If so lower the clock speeds to standard (research what the standard 660ti clock speeds are). Usually companies will overclock cards without touching the voltage, causing instability. 

#14 Posted by Subjugation (4716 posts) -

@AlisterCat: I'll second the question of whether your clock speeds are stock or not. If they have been modified and it wasn't done carefully, that could very well explain the artifacting. I don't think messing with GPU clock speeds is worth it, personally.

#15 Posted by Chrjz (317 posts) -

I had a card that had similar artifacting issues. I had to lower the clock speed to get it to work; I would recommend returning the card if it's not functioning to spec.

#16 Posted by AlisterCat (5470 posts) -

@warxsnake: @Subjugation: @Chrjz: It is the 'Super Clocked' edition by EVGA. I haven't altered any of the clock speeds manually. It is clocked at 980 MHz, as advertised, and running at 0.987 V, but I don't know what the regular version runs at. I'm a bit dubious about increasing the voltage myself for a factory overclock of 65 MHz (the regular runs at 915 MHz).

#17 Posted by AlisterCat (5470 posts) -

@Zomgfruitbunnies: Only in those that I have mentioned. Most games are fine. Played a few hours of Hitman. Played all of Dishonored. Plenty of Borderlands 2. Saints Row 3. Dungeon Defenders. That's just in the past week or so.

#18 Posted by warxsnake (2634 posts) -

@AlisterCat said:

@warxsnake: @Subjugation: @Chrjz: It is the 'Super Clocked' edition by EVGA. I haven't altered any of the clock speeds manually. It is clocked at 980 MHz, as advertised, and running at 0.987 V, but I don't know what the regular version runs at. I'm a bit dubious about increasing the voltage myself for a factory overclock of 65 MHz (the regular runs at 915 MHz).

My EVGA "FTW" edition was factory overclocked. All games worked correctly. My computer would hard lock/freeze only with Sleeping Dogs and some other game in DX11 mode. After a week of looking around forums with no solution, I just downclocked my card to standard spec and also increased the voltage slightly.  
I haven't had a problem since in any game. 
#19 Posted by AlisterCat (5470 posts) -

@warxsnake: What would constitute a little, voltage-wise? From 0.987 to 1.000, or 0.988? Might try that.

#20 Edited by warxsnake (2634 posts) -

@AlisterCat said:

@warxsnake: What would constitute a little, voltage-wise? From 0.987 to 1.000, or 0.988? Might try that.

First, just try downclocking to standard, since that eases the voltage demand, and downclocking is the safest thing you can do.
If that still doesn't work, bump the voltage by a tiny amount, to the numbers you listed and nothing more. (I bumped my GTX 570 from 1000 mV to 1026 mV.)
I'm assuming you're using EVGA Precision X. It's a good tool for all of this.  
Here is the reference spec for the 660 Ti: http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-660ti/specifications
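And to spell out the downclock arithmetic using the clocks quoted earlier in the thread (if I'm remembering Precision X right, it takes a signed MHz offset relative to the card's factory default, so getting back to reference spec means entering a negative offset):

```python
# Clocks from the thread: EVGA Super Clocked core vs. NVIDIA reference 660 Ti
FACTORY_CORE_MHZ = 980
REFERENCE_CORE_MHZ = 915

# Precision X / Afterburner apply a signed offset to the factory clock,
# so returning to reference spec means entering:
core_offset = REFERENCE_CORE_MHZ - FACTORY_CORE_MHZ
print(core_offset)  # -65
```

That -65 MHz is the whole factory overclock, which matches AlisterCat's numbers above.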
