Warning for PC users

#1 Posted by Cataphract1014 (1316 posts) -

I was reading some people saying that their video cards were overheating while playing the game and causing their systems to shut down. I pulled up my hardware monitor and saw that my GPU was topping out at nearly 90C. Every other game takes it to around 60, maybe 65. The cause is that Vsync isn't on, and the Vsync toggle in the game doesn't actually work. I would recommend forcing Vsync through your drivers just to be sure you don't cause any damage.
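
If you want to keep an eye on your own temps while you check this, here's a rough sketch of the kind of polling a hardware monitor does - nothing official, just Python asking nvidia-smi once a second (NVIDIA cards only, and the 85C warning level is an arbitrary number I picked, not a spec for any particular card):

    import subprocess
    import time

    def gpu_temp_c():
        # nvidia-smi must be on your PATH; returns the first GPU's core temperature
        out = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=temperature.gpu", "--format=csv,noheader,nounits"]
        )
        return int(out.decode().strip().splitlines()[0])

    while True:  # stop with Ctrl+C
        temp = gpu_temp_c()
        print(f"GPU temperature: {temp}C")
        if temp >= 85:
            print("WARNING: getting close to throttle/shutdown territory")
        time.sleep(1)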

#2 Posted by kidman (473 posts) -

If your videocard overheats it means it's faulty, nothing to do with the game - simple as that.

#3 Posted by Aleryn (705 posts) -

Good to know that VSYNC isn't working correctly in that way, thank you.

#4 Posted by Cataphract1014 (1316 posts) -

@kidman said:

If your videocard overheats it means it's faulty, nothing to do with the game - simple as that.

Every other game it stays around 60c.

Darksiders 2 without forced Vsync: 90c

Darksiders 2 with forced Vsync: 64c

Clearly my card is faulty and there is nothing else going on that could cause it.

#5 Posted by Subjugation (4734 posts) -

If this is true they better hotfix that business asap. Damaging people's hardware is uuuuunnnnnnacceptable.

#6 Edited by MB (12943 posts) -

@Cataphract1014 said:

@kidman said:

If your videocard overheats it means it's faulty, nothing to do with the game - simple as that.

Every other game it stays around 60c.

Darksiders 2 without forced Vsync: 90c

Darksiders 2 with forced Vsync: 64c

Clearly my card is faulty and there is nothing else going on that could cause it.

Try updating your video card driver, or if you're on the most current one, try using an older version or perhaps an alternative driver. It's probably not a hardware problem, at least not a physical one.

Moderator
#7 Posted by Cataphract1014 (1316 posts) -

@MB said:

@Cataphract1014 said:

@kidman said:

If your videocard overheats it means it's faulty, nothing to do with the game - simple as that.

Every other game it stays around 60c.

Darksiders 2 without forced Vsync: 90c

Darksiders 2 with forced Vsync: 64c

Clearly my card is faulty and there is nothing else going on that could cause it.

Try a different video card driver - it's probably not a hardware issue.

Drivers are up to date. Installed on August 4th.

#8 Posted by kidman (473 posts) -

Only thing I wanted to put out there is that the game is not gonna damage your card, so you might as well play at that 90C. If it does break, well, it was going to anyway.

#9 Posted by Curval (63 posts) -

I just played this for 3 hours straight on PC and had zero issues.

#10 Posted by SamStrife (1286 posts) -

People who are saying this is a faulty video card issue are straight up wrong. There have been other games where a glitch in the code (often revolving around the Vsync) causes the game to render far too many frames needlessly, causing issues. I experienced the same thing here, where forcing vsync through the Nvidia control panel (because the in-game one didn't work) caused my GPU temperature to drop from around 70 down to the mid 50s.

#11 Posted by Deusx (1910 posts) -

@kidman said:

If your videocard overheats it means it's faulty, nothing to do with the game - simple as that.

Software also has to do with this. Badly optimized applications can overheat a PC really easily, that's a fact.

@MB said:

@Cataphract1014 said:

@kidman said:

If your videocard overheats it means it's faulty, nothing to do with the game - simple as that.

Every other game it stays around 60c.

Darksiders 2 without forced Vsync: 90c

Darksiders 2 with forced Vsync: 64c

Clearly my card is faulty and there is nothing else going on that could cause it.

Try updating your video card driver, or if you're on the most current one, try using an older version or perhaps an alternative driver. It's probably not a hardware problem, at least not a physical one.

This has happened to numerous people including me. I play BF3 1920x1200 on ultra and get 65 from GPU and 60 from CPU. Darksiders 2 gets my GPU to 80 and my CPU to 75. The game is badly (horribly) optimized. I'll test with the Vsync on and see if that helps. Thanks OP for posting this.

#12 Posted by Bio2hazard (144 posts) -

Fact is, software should not be able to damage hardware. Overclocking is a different case, and is often handled through the drivers, so when I say software, I am not talking about overclocking.

Yes, without VSync, a game or software can push your hardware to its limit, but if it restarts your computer, it means your cooling is insufficient. Clean those fans.

If your hardware gets damaged, then there was something wrong with either your hardware or your drivers. The emergency shutdown is there to prevent damage, and usually kicks in a good 10 degrees before damage would occur.

If we lived in a world where software could just smoke hardware willy-nilly as you guys are making it out to be, then we'd have a lot of viruses and malware that would smoke people's computers for shits and giggles - thankfully, we don't.

#13 Posted by Mcfart (1718 posts) -

If your GPU is able to properly control fan speed to manage temps, then your GPU shouldn't overheat. This is the same as running any modern PC game with Vsync off... DS2 won't overheat your GPU any more than Witcher 2 with Vsync off.

#14 Posted by Deusx (1910 posts) -

@Bio2hazard said:

Fact is, software should not be able to damage hardware. Overclocking is a different case, and is often handled through the drivers, so when I say software, I am not talking about overclocking.

Yes, without VSync, a game or software can push your hardware to its limit, but if it restarts your computer, it means your cooling is insufficient. Clean those fans.

If your hardware gets damaged, then there was something wrong with either your hardware or your drivers. The emergency shutdown is there to prevent damage, and usually kicks in a good 10 degrees before damage would occur.

If we lived in a world where software could just smoke hardware willy-nilly as you guys are making it out to be, then we'd have a lot of viruses and malware that would smoke people's computers for shits and giggles - thankfully, we don't.

Uhm there are viruses and malware that can do that bro. Software CAN overheat your system.

#15 Posted by SamStrife (1286 posts) -

If we lived in a world where software could just smoke hardware willy-nilly as you guys are making it out to be, then we'd have a lot of viruses and malware that would smoke people's computers for shits and giggles - thankfully, we don't.

No one's saying it's happening here there and everywhere. We're just saying it's possible and that Darksiders 2 can cause this if you don't force vsync through your graphics card control panel...

#16 Edited by ichthy (577 posts) -

@Bio2hazard said:

Fact is, software should not be able to damage hardware. Overclocking is a different case, and is often handled through the drivers, so when I say software, I am not talking about overclocking.

Yes, without VSync, a game or software can push your hardware to its limit, but if it restarts your computer, it means your cooling is insufficient. Clean those fans.

If your hardware gets damaged, then there was something wrong with either your hardware or your drivers. The emergency shutdown is there to prevent damage, and usually kicks in a good 10 degrees before damage would occur.

If we lived in a world where software could just smoke hardware willy-nilly as you guys are making it out to be, then we'd have a lot of viruses and malware that would smoke people's computers for shits and giggles - thankfully, we don't.

http://www.gameinformer.com/b/news/archive/2010/07/28/blizzard-confirms-starcraft-ii-overheating-bug.aspx

I beg to differ.

#17 Posted by Bio2hazard (144 posts) -

@ichthy: That's the same thing. The framerate is not limited, hence the system runs at max performance and generates more heat. A proper system with proper cooling will not incur any damage or forced shutdown. Trust me, I played SC2 on launch day (German Amazon shipped the game to me a week early - I even posted on the GB forums about it) and my computer did not burst into flames. Computers that did burst into flames definitely had problems that go beyond the game.

I'm not saying that stressful software can't raise the temperature, I'm merely saying that a maintained system will neither crash nor melt.

As for @Deusx: I've been on the internet for about 15 years; I've helped people remove Dialers, Sasser, MS Blaster - all the big-name, widespread infections. I have not had to deal with anything that destroyed hardware. And honestly - knowing the internet - smoking someone's hardware would cause plenty of "lulz", so I rest my case that if it were possible to do this on a large scale, we probably would have heard about it.

Again, I'm not saying that poorly optimized software can't raise a system's temperature. I am saying that under normal circumstances (this means the fans and case are somewhat dust-free, you make sure your fans actually spin and maintain them every once in a while, you make sure the room the PC is in isn't too hot - common sense stuff) a computer shouldn't overheat or melt. If you put two notoriously hot GeForce GTX 295s in a microATX case and run them in quad SLI, you don't have to be surprised if your computer crashes due to heat, but that's not a proper configuration - the cards need room to breathe, and a case which supports more airflow.

#18 Posted by clstirens (847 posts) -

Everyone here seems to forget this happened with Starcraft 2 on the menus.

#19 Edited by emem (1972 posts) -

@Bio2hazard said:

@ichthy: That's the same thing. The framerate is not limited, hence the system runs at max performance and generates more heat. A proper system with proper cooling will not incur any damage or forced shutdown. Trust me, I played SC2 on launch day (German Amazon shipped the game to me a week early - I even posted on the GB forums about it) and my computer did not burst into flames. Computers that did burst into flames definitely had problems that go beyond the game.

I'm not saying that stressful software can't raise the temperature, I'm merely saying that a maintained system will neither crash nor melt.

Yup, I've never had an issue with my cards running at max performance level either, but for those who would actually like to play without (forcing) vsync for whatever reason... just use an FPS limiter like Dxtory. I think Afterburner has one included, and I'm sure there are other ways to do it as well; just google it. So unless the game is incredibly demanding (like The Witcher 2, for example), your GPU won't get that hot.
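
The concept behind all of those limiters is the same: don't start drawing the next frame until the current frame's slice of time is used up. A rough sketch of the idea in Python (draw_frame() is just a stand-in I made up - real limiters like Dxtory or Afterburner/RTSS hook the game's present call rather than working like this):

    import time

    TARGET_FPS = 60
    FRAME_BUDGET = 1.0 / TARGET_FPS  # seconds allowed per frame

    def draw_frame():
        pass  # placeholder for the game's actual rendering work

    while True:
        start = time.perf_counter()
        draw_frame()
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_BUDGET:
            # sleep off the leftover budget; that idle time is what keeps the GPU cooler
            time.sleep(FRAME_BUDGET - elapsed)
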
#20 Posted by benspyda (2043 posts) -

I did notice the vsync option did nothing. That's why the game looks kinda bad.

#21 Posted by Shivoa (643 posts) -

Yep, lots of reports that DS2 is pushing GPUs as far as they can go on PC (thermally). People who are getting failures are people who don't run Furmark/SuperPi to confirm they have adequate cooling installed in their machine for the silicon they purchased. Please always check you have the cooling you require for a machine when it is new rather than letting a game find it for you.

nVidia's and AMD's latest cards do a good job of dynamically overclocking to maintain their thermal envelopes, as well as the age-old linking of fan speed with temp. You should be safe, as these cards can operate at temps in the 90s C. nVidia's newest drivers (the 300 series) also offer a stepped v-sync, so you get 60fps v-sync locked at the frame cap, and tearing rather than a drop to 30fps if you fall below it. This seems to indicate that in the future, even 'v-sync off' will likely still cap performance at the screen's refresh rate, to avoid issues like this where the GPU can render several frames per tick and runs as fast as possible to do it (with a minor decrease in lag/increase in responsiveness from doing so).

#22 Posted by JoeyRavn (5007 posts) -

I have VSync and Triple Buffering enabled by default on my NVIDIA Control Panel, so I guess I'll be safe. Anyone with this problem should use FRAPS to see what FPS they are getting without VSync. I assume that if the GPU is overheating it's because it's drawing a shitload more FPS than it should.
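
For anyone curious, the number FRAPS shows is basically just "frames presented in the last second". A toy sketch of that idea (draw_frame() is a made-up placeholder; FRAPS itself hooks into the game rather than running a loop like this):

    import time

    def draw_frame():
        pass  # placeholder for the game's actual rendering work

    frames = 0
    window_start = time.perf_counter()
    while True:
        draw_frame()
        frames += 1
        now = time.perf_counter()
        if now - window_start >= 1.0:
            print(f"{frames} FPS")  # with no vsync or frame cap this can be in the hundreds
            frames = 0
            window_start = now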

#23 Posted by phrosnite (3518 posts) -

Good to know that the in-game vsync is busted. I hate screen tearing.

#24 Posted by insane_shadowblade85 (1499 posts) -

I have an AMD card. How do I enable VSync? I'm staring at my Catalyst Control Center and can't find the option @_@

#25 Posted by Humanity (9871 posts) -

Thanks for the warning. Also, as a side note: as a longtime PC gamer who has since converted to console just to escape these sorts of situations, hearing "try updating your graphics drivers" literally sends shivers of rage down my spine.

#26 Posted by Shivoa (643 posts) -

@Humanity: Don't worry, just connect to PSN and get the latest patch and your console game will stop crashing :)

PCs may have tried to become easier to use (Windows Update now also grabs GPU driver updates) but the last gen really saw consoles become PCs with a razor-blade business model and closed ecosystem.

#27 Edited by SharkEthic (1062 posts) -

@Bio2hazard said:

Fact is, software should not be able to damage hardware.

Writing "Fact is" doesn't make the subsequent statement true...though fact is you're wrong.

#28 Edited by Humanity (9871 posts) -

@Shivoa said:

@Humanity: Don't worry, just connect to PSN and get the latest patch and your console game will stop crashing :)

PCs may have tried to become easier to use (Windows Update now also grabs GPU driver updates) but the last gen really saw consoles become PCs with a razor-blade business model and closed ecosystem.

It's for the best I think. I honestly don't have the patience to mess around with mods anymore, so I don't care too much about that getting cut. I know people play Fallout games, or specifically Bethesda games, with literally hundreds of mods running at the same time. I don't care for that and would rather just play the game vanilla, as it was meant to be played, for better and worse. The single best feature of consoles is the strict adherence to hardware specs, which enables developers to squeeze every inch of performance out of their products. In addition, if you load your game up and there's some crazy crashing bug, you're no longer Joe Schmoe on the internet who has this weird problem - millions of people around the world are having this problem, and it leads to it getting fixed a lot quicker.

I am slightly bitter because there were a few really good gaming experiences that just got ruined for me by PC specific bugs. I loved Morrowind but the game would just NOT stop crashing for me and it had something to do with my inherent hardware setup so there was nothing I could do. I went back to the game a while later when I put together a new PC and it ran like a dream but by then the magic was gone and I finished the main storyline somewhat begrudgingly.

#29 Posted by Dagbiker (6978 posts) -

@kidman said:

If your videocard overheats it means it's faulty, nothing to do with the game - simple as that.

That's not true at all.

#30 Posted by llamaegg (230 posts) -

@insane_shadowblade85 said:

I have an AMD card. How do I enable VSync? I'm staring at my Catalyst Control Center and can't find the option @_@

Under Gaming>3D Application Settings.

Also, maybe it's because I have a gazillion fans in my case, but I haven't had this issue.

#31 Posted by alternate (2719 posts) -

Clearly a software bug, but I kinda agree that even if it is forcing unnecessary rendering, the card should still be able to run under 100% load indefinitely without shutting down. Unless you have OC'ed it too far or something.

#32 Posted by kidman (473 posts) -

@Dagbiker said:

@kidman said:

If your videocard overheats it means it's faulty, nothing to do with the game - simple as that.

That's not true at all.

Ok then, how would that work? If the GFX card is not able to handle the game then it will simply run at fewer frames per second - that's it. How is it supposed to overheat the card? Modify the fan speed? If the card overheats it's a mechanical failure - how can it be otherwise?

#33 Edited by Marz (5668 posts) -

It's because the game runs at like 150 FPS with Vsync off, so it's using all of your GPU power. There's nothing wrong with the way the game was designed (v-sync not working is sort of a bug, I guess); all games can technically do this if there is no CPU bottleneck and your GPU is allowed to go full power.

#34 Posted by Dourin (234 posts) -

@Humanity said:

@Shivoa said:

@Humanity: Don't worry, just connect to PSN and get the latest patch and your console game will stop crashing :)

PCs may have tried to become easier to use (Windows Update now also grabs GPU driver updates) but the last gen really saw consoles become PCs with a razor-blade business model and closed ecosystem.

It's for the best I think. I honestly don't have the patience to mess around with mods anymore, so I don't care too much about that getting cut. I know people play Fallout games, or specifically Bethesda games, with literally hundreds of mods running at the same time. I don't care for that and would rather just play the game vanilla, as it was meant to be played, for better and worse. The single best feature of consoles is the strict adherence to hardware specs, which enables developers to squeeze every inch of performance out of their products. In addition, if you load your game up and there's some crazy crashing bug, you're no longer Joe Schmoe on the internet who has this weird problem - millions of people around the world are having this problem, and it leads to it getting fixed a lot quicker.

I am slightly bitter because there were a few really good gaming experiences that just got ruined for me by PC specific bugs. I loved Morrowind but the game would just NOT stop crashing for me and it had something to do with my inherent hardware setup so there was nothing I could do. I went back to the game a while later when I put together a new PC and it ran like a dream but by then the magic was gone and I finished the main storyline somewhat begrudgingly.

While I'll agree that more people are likely to have the same hardware config as you on your 360 compared to your PC config, let's not forget that not all 360s are made equal. There have been multiple hardware revisions of the 360, resulting in PC-like testing and troubleshooting being required by developers when hardware-related issues are discovered. And mind, I don't just mean old white 360s compared to the slim 360s. Even within the original Xbox 360 Elites there was more than one hardware configuration going on, and I believe the slims are already up to 2 different configs.

#35 Edited by Shivoa (643 posts) -

@kidman said:

@Dagbiker said:

@kidman said:

If your videocard overheats it means it's faulty, nothing to do with the game - simple as that.

That's not true at all.

Ok then, how would that work? If the GFX card is not able to handle the game then it will simply run at fewer frames per second - that's it. How is it supposed to overheat the card? Modify the fan speed? If the card overheats it's a mechanical failure - how can it be otherwise?

Furmark was called a hardware virus by AMD (or ATi - how long ago did this all blow up? Edit: a Google indicates it might actually have been nVidia who called it a power virus first; they now detect stuff like this at a driver level, so all hardware should be safe from overheating too badly AS LONG AS YOU INSTALLED THE CARD INTO AN ADEQUATELY VENTED CASE FOR THE WATTAGE OF YOUR GPU). Basically a GPU is made of many similar processing units, but they are bottlenecked to some degree by the other operations (rasterization ROPs, texture lookups TEXs, the top and tail to that incredibly wide processing block) and by the lack of a perfectly parallelizable task, which means some of the processor is going to be idle for a fraction of the total time. This means you can make a GPU that can't deal with everything being used at the same time and still stay within the thermal envelope with normal fan speeds. The card is designed to work for typical to highly demanding game code, not to survive well against 'attacks' that dodge any of these bottlenecks and truly show you how much power the card can suck up. This is why you see some benchmarks of a new card where the total computer was eating 320W at the wall running a high-end DX11 game but Furmark causes it to use 400W. That thing is not normal and makes the card go beyond what you could expect of any actual game.

More recent cards have had the same tech as CPUs use and so go beyond changing the fan speed to match the thermal conditions but actually change the clock speed and so power and heat generated to manage the situation and avoid being susceptible to these kinds of things. Cards should crash out or power down when they hit a max temperature so these are not a serious fire risk, but manufacturers do consider some stuff exploiting their designs to overwork their cards as not good practice.

Edit: In this case then, a game without a v-sync is expected to render frame after frame for all time, and the cards should be expected to deal with it (if not, you need more fans in your case, as that card should be able to survive this game). You may only see 60 frames per second, but your card can render thousands of them if given a low enough workload in any one frame and a game that doesn't lock its world update cycles (which means the render you get every 1/60th of a second is closer in time to one that started 16ms earlier based on the world/input state at that time). But more modern drivers like nVidia's (and AMD's latest?) can give you an option where the card does lock itself to v-sync at the max refresh rate of the screen you're using, to give itself some resting time before being asked to render the next frame (while rendering with tearing below the refresh rate), and this helps these dynamically clocked cards, as that inactive time can give them some time to cool down a fraction so they can push even harder when given work - if you are only limited by thermal conditions and you generate a fixed unit of heat per work done (not true for processors, but say theoretically), then the ideal way of operating a GPU would be to instantly do all the work needed for the next frame the instant before the monitor needed it and sit idle the rest of the time.

#36 Posted by Bwast (1342 posts) -

I have like a billion case fans so my cooling is good enough. I am still upset with this port, however. The game looks like shit on my TV. What's the point of running it in 1080p when all it does is show how bad the textures and models are? And about 40-50% of the time the game will just crap out on me when I back out of a menu. It doesn't lock up, it just sits on this grey screen forever. I can't do anything but close it and restart. If they patch it, great. Until then, it's $50 down the shitter. I thought we were past piss-poor PC ports. Grumble.

#37 Posted by Cataphract1014 (1316 posts) -

@JoeyRavn said:

I have VSync and Triple Buffering enabled by default on my NVIDIA Control Panel, so I guess I'll be safe. Anyone with this problem should use FRAPS to see what FPS they are getting without VSync. I assume that if the GPU is overheating it's because it's drawing a shitload more FPS than it should.

I just turned it on, without vsync, to see what my FPS was. I was getting 600 FPS on loading screens, 300 FPS in the menu, and ~100 while running around.

#38 Posted by JoeyRavn (5007 posts) -

@kidman said:

@Dagbiker said:

@kidman said:

If your videocard overheats it means it's faulty, nothing to do with the game - simple as that.

That's not true at all.

Ok then, how would that work? If the GFX card is not able to handle the game then it will simply run at fewer frames per second - that's it. How is it supposed to overheat the card? Modify the fan speed? If the card overheats it's a mechanical failure - how can it be otherwise?

Well, Shivoa gave you the long explanation, so I'll give you the short one. Take any game and any GPU capable of maxing it out with all the bells and whistles. My GTX 570 and Darksiders II, for example. I'm using a pretty basic 1920x1080x60 ASUS monitor. I put on FRAPS to measure my FPS and start up Darksiders II. Without VSync, my GPU will draw every frame that it is able to draw, all the time, even if my monitor can't show it. In other words, FRAPS will be telling me that my game is running at, say, 120 FPS, but the refresh rate of my monitor is still 60Hz, so it can only display 60 frames per second. But the fact that I can't see those 60 frames that are "lost" doesn't mean they don't exist: my GPU is "creating" them. And, as you may imagine, the power needed to draw 120 frames per second is a lot higher than the power needed to draw 60 frames per second. For the GPU, power is electricity, and electricity produces heat. More and more heat, and the card overheats.

That's just a simple scenario, but there are more. Something that almost anyone can try: get a laptop with a mobile GPU and try playing a graphics-intensive game. You'll see how quickly the temperatures rise, even if the game is barely running at a decent speed.

#39 Posted by Shivoa (643 posts) -

@Cataphract1014 said:

@JoeyRavn said:

I have VSync and Triple Buffering enabled by default on my NVIDIA Control Panel, so I guess I'll be safe. Anyone with this problem should use FRAPS to see what FPS they are getting without VSync. I assume that if the GPU is overheating it's because it's drawing a shitload more FPS than it should.

I just turned it on, without vsync, to see what my FPS was. I was getting 600 FPS on loading screens, 300 FPS in the menu, and ~100 while running around.

Ok, they might need to patch the game. A power virus can be as simple as a menu/loading system that doesn't have a sleep timer. It sounds like it could be running the renderer at full whack on that simple loading screen (because you want to spend most of your useful work - in other threads - on doing the actual loading, you normally set the thing rendering the load screen to sleep, freeing up the CPU to manage throwing GBs of texture data about and churning the mutable data set), and so it's generating way more heat than is normal when not doing anything complex (so no bottlenecking) and also slowing down all the load times on PC. We've seen games released before with similar common high-temp issues at launch that then get patched to put a v-sync or sleep timer on the menu/loading areas to rein it in; that said, the in-game rendering shouldn't actually be pushing your card too far (if it is, you need more cooling, or to clean all the dust from between the slot fins/fan), so this should only really be an issue if you're cooking your card by sitting in the menus all the time.
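
To make that concrete, here's a toy sketch of the kind of loading-screen loop I mean - definitely not Darksiders 2's actual code, just the pattern. The asset loading runs on its own thread, and without the sleep at the bottom the spinner loop redraws as fast as the GPU will allow (hence 600fps readings on load screens):

    import threading
    import time

    def load_assets(done_flag):
        time.sleep(5)  # stand-in for streaming GBs of textures off disk
        done_flag.set()

    def draw_loading_screen():
        pass  # stand-in for rendering the spinner/progress bar

    loaded = threading.Event()
    threading.Thread(target=load_assets, args=(loaded,), daemon=True).start()

    while not loaded.is_set():
        draw_loading_screen()
        time.sleep(1.0 / 60)  # the missing throttle: remove this and the loop spins flat out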

#40 Posted by Humanity (9871 posts) -

@Dourin: There are 7 revisions up until now, and some of those were changes like bigger heatsinks or smaller parts, which would limit it to roughly 4 major configurations to test on. Having to QA 4 different models is infinitely easier and less time-consuming than the seemingly endless number of configurations that any given PC user can come up with. Console gaming is a much more closed circuit that's beneficial to everyone.

#41 Posted by Rolyatkcinmai (2699 posts) -

@kidman said:

If your videocard overheats it means it's faulty, nothing to do with the game - simple as that.

This is not true. Deus Ex last year was notorious for burning up perfectly fine video cards. Don't be an idiot.

#42 Posted by Nethlem (436 posts) -

@JoeyRavn: That's not at all what Shivoa explained; if the card overheats because of this, then it hasn't been properly cooled.

The other issue that could pop up is the game just randomly restarting the computer; in that case the PSU is not up to the task of delivering enough juice for the GPU.

But a well-set-up system won't overheat through this. That's one of the reasons I usually give all the rigs I build a worst-case scenario test with Prime/FurMark, to see if cooling and performance are up to the task. It's still not a great bug, because it drives up your electricity bill needlessly, but it won't break anything that wasn't faulty to begin with.
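
If anyone wants to do the same kind of burn-in check on their own rig: start FurMark/Prime by hand, then let something like this run alongside and log GPU temps over time so you can eyeball the curve afterwards (rough sketch, NVIDIA-only via nvidia-smi; the file name and the 5-second interval are arbitrary choices of mine):

    import csv
    import subprocess
    import time

    def gpu_temp_c():
        # nvidia-smi must be on your PATH; returns the first GPU's core temperature
        out = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=temperature.gpu", "--format=csv,noheader,nounits"]
        )
        return int(out.decode().strip().splitlines()[0])

    with open("burn_in_temps.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["elapsed_s", "gpu_temp_c"])
        start = time.time()
        while True:  # stop with Ctrl+C once the stress test is done
            writer.writerow([round(time.time() - start), gpu_temp_c()])
            f.flush()
            time.sleep(5)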

#43 Edited by kidman (473 posts) -

@Shivoa said:

@kidman said:

@Dagbiker said:

@kidman said:

If your videocard overheats it means it's faulty, nothing to do with the game - simple as that.

That's not true at all.

Ok then, how would that work? If the GFX card is not able to handle the game then it will simply run at fewer frames per second - that's it. How is it supposed to overheat the card? Modify the fan speed? If the card overheats it's a mechanical failure - how can it be otherwise?

Furmark was called a hardware virus by AMD (or ATi - how long ago did this all blow up? Edit: a Google indicates it might actually have been nVidia who called it a power virus first; they now detect stuff like this at a driver level, so all hardware should be safe from overheating too badly AS LONG AS YOU INSTALLED THE CARD INTO AN ADEQUATELY VENTED CASE FOR THE WATTAGE OF YOUR GPU). Basically a GPU is made of many similar processing units, but they are bottlenecked to some degree by the other operations (rasterization ROPs, texture lookups TEXs, the top and tail to that incredibly wide processing block) and by the lack of a perfectly parallelizable task, which means some of the processor is going to be idle for a fraction of the total time. This means you can make a GPU that can't deal with everything being used at the same time and still stay within the thermal envelope with normal fan speeds. The card is designed to work for typical to highly demanding game code, not to survive well against 'attacks' that dodge any of these bottlenecks and truly show you how much power the card can suck up. This is why you see some benchmarks of a new card where the total computer was eating 320W at the wall running a high-end DX11 game but Furmark causes it to use 400W. That thing is not normal and makes the card go beyond what you could expect of any actual game.

More recent cards have had the same tech as CPUs use and so go beyond changing the fan speed to match the thermal conditions but actually change the clock speed and so power and heat generated to manage the situation and avoid being susceptible to these kinds of things. Cards should crash out or power down when they hit a max temperature so these are not a serious fire risk, but manufacturers do consider some stuff exploiting their designs to overwork their cards as not good practice.

Edit: In this case then, a game without a v-sync is expected to render frame after frame for all time, and the cards should be expected to deal with it (if not, you need more fans in your case, as that card should be able to survive this game). You may only see 60 frames per second, but your card can render thousands of them if given a low enough workload in any one frame and a game that doesn't lock its world update cycles (which means the render you get every 1/60th of a second is closer in time to one that started 16ms earlier based on the world/input state at that time). But more modern drivers like nVidia's (and AMD's latest?) can give you an option where the card does lock itself to v-sync at the max refresh rate of the screen you're using, to give itself some resting time before being asked to render the next frame (while rendering with tearing below the refresh rate), and this helps these dynamically clocked cards, as that inactive time can give them some time to cool down a fraction so they can push even harder when given work - if you are only limited by thermal conditions and you generate a fixed unit of heat per work done (not true for processors, but say theoretically), then the ideal way of operating a GPU would be to instantly do all the work needed for the next frame the instant before the monitor needed it and sit idle the rest of the time.

Thanks a lot for that, very informative.

@Rolyatkcinmai said:

@kidman said:

If your videocard overheats it means it's faulty, nothing to do with the game - simple as that.

This is not true. Deus Ex last year was notorious for burning up perfectly fine video cards. Don't be an idiot.

As you can see I was already given a perfectly reasonable explanation, without name calling too. You should try that some time.

#44 Edited by JoeyRavn (5007 posts) -

@Nethlem: I never said I was summing up his words. As I said, I just gave one case where a GPU could overheat because of the game, like in the edited part of his post... which seems to be what is happening here, given those FPS readings.

#45 Posted by Dourin (234 posts) -

@Humanity: Beneficial, sure, but many of us are willing to deal with the occasional hiccup for the freedom that PC gaming allows compared to consoles. I think things like the $40,000 title update costs on the Xbox are a good reason to be a (mainly) PC gamer. That, and not having to wait for a patch to go through cert before getting to me, or having to pay for free dlc. Sure, PC gaming has its share of headaches as well, but very, very rarely do they keep you from being able to play the game the way it was intended (hardware limitations of lower-end PC's aside).

Honestly, when it comes down to it, mediocre to bad PC ports are becoming less and less frequent, as are unstable releases to the PC. It's kind of like plane crashes: it happens rarely enough that when it does occur, it's all over the media (and forums).

#46 Posted by Krakn3Dfx (2502 posts) -

Between this and Sleeping Dogs, Sleeping Dogs seems to be the PC port to pick up this week. Glad I made the right choice.

#47 Edited by Humanity (9871 posts) -

@Dourin: While I agree that if you want to mod your games then you are probably willing to put up with a lot more hassle than the average gamer - I hardly see how console patch costs are a reason to remain a PC gamer. Indie developers aside, these companies are big boys - they can and should strive to release games without the need to patch them. I think it's a good thing and ensures games will go through more rigorous testing before they are released. If you are an indie developer and don't have the time/staff/money to test your game for horrible game-breaking bugs, then maybe don't release on consoles (on either platform, since the PlayStation 3 has similar patch costs to the Xbox 360). If you are a big-name developer and you shipped your 50 million dollar title and it requires a $40,000 patch, then no biggie, since you just spent 50 million dollars and, guess what, if the game does well they will make almost 10 times as much. Those costs are a deterrent so that you don't just release a buggy game in order to meet deadlines with the "oh, we'll just patch that later" mentality. Sorry if I seem a little heated on the matter, but I just don't think people riffing on Microsoft as this big greedy company trying to extort money out of poor little Activision or Epic Games is such a grand issue.

Also this discussion is getting horribly off topic so I'm happy to continue discussing it in PM's so as to not bog up the thread with my views of the Microsoft certification process when the thread is about Darksiders 2 bogging down your pc.

#48 Posted by Doctorchimp (4078 posts) -

@Humanity said:

@Dourin: There are 7 revisions up until now, and some of those were changes like bigger heatsinks or smaller parts, which would limit it to roughly 4 major configurations to test on. Having to QA 4 different models is infinitely easier and less time-consuming than the seemingly endless number of configurations that any given PC user can come up with. Console gaming is a much more closed circuit that's beneficial to everyone.

Then how does Skyrim on PS3 or that Silent Hill collection thing happen?

#49 Edited by Dourin (234 posts) -

@Humanity: Don't bother with continuing this in pm's. If you're going to defend something as ridiculous as charging tens of thousands of dollars for a patch then I think this conversation is over.

#50 Posted by Cataphract1014 (1316 posts) -

@Krakn3Dfx said:

Between this and Sleeping Dogs, Sleeping Dogs seems to be the PC port to pick up this week. Glad I made the right choice.

Nothing wrong with the game itself. I personally think it is fantastic.
