@strikealight: Reports (including mine) are that v-sync does not help.
Shivoa's forum posts
I suspect the game is rendering as many frames as possible (which can theoretically reduce input lag, but only when you can render 2+ complete frames per screen refresh and aren't being clever about passing in input data at the latest point of the render process) and so is running GPUs at their maximum. The driver may not have any sort of profile for it, so it's not even applying the normal Unity driver tweaks (if nVidia have any; I manually changed the AA compatibility flags for the game to match Unity3D).
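To illustrate what an uncapped render loop is missing, here's a minimal sketch of a frame limiter, assuming a 60 Hz target; the names (`render_frame`, `run_capped`) are illustrative, not from the game:

```python
import time

TARGET_FPS = 60  # assumed refresh rate of the connected monitor
FRAME_TIME = 1.0 / TARGET_FPS

def render_frame():
    # stand-in for the game's actual per-frame work
    pass

def run_capped(num_frames):
    """Render num_frames, sleeping off any time left in each frame slot."""
    for _ in range(num_frames):
        start = time.perf_counter()
        render_frame()
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_TIME:
            # idle instead of spinning; this is what lets the GPU clock down
            time.sleep(FRAME_TIME - elapsed)

run_capped(3)
```

Without the sleep, the loop just spins as fast as the hardware allows, which is exactly the "render as many frames as possible" behaviour that pins the GPU at maximum.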
These issues are being reported across nVidia and AMD cards on the official tech support forums (while Intel GPUs are apparently kinda not working with the game at all right now). But it doesn't appear to affect everyone (or maybe most people aren't running their cards that close to maximum performance and don't notice the fans running a bit louder than normal).
The GTX470 is possibly one of the hottest-running GPUs you can buy (got my old one here), but the new cards aren't exactly magical, despite being somewhat lower power (they dynamically overclock until they hit a maximum power draw or temperature to wring the most performance out of the silicon). A GTX760, or even a 770 if you want to get another three and a half years out of an upgrade, would be the nVidia upgrade path. I hope this game gets a patch at some point that fixes this issue.
This game is so hot it's... UNRATED!
Someone get FOX News on the line (I'm going to be really depressed if FOX have actually covered this as a "scandalous anti-discrimination* content in our kids' games" item).
* Pronounced "pro-gay".
@hunter5024: GAY SEXUAL CONTENT? Call the ESRB ASAP!
This is certainly exciting news, have to see how it develops with regard to the leadership role at id.
@nictel: Thanks, I found that too with my old G5 (which was SetPoint, their old software, I think), but the G400 doesn't seem to remember even DPI/USB-rate settings when the software isn't loaded. I've had left Shift/Ctrl on the thumb buttons and other keys on the DPI-changing buttons forever, so I can use them in games without the game needing to be aware of mouse 4/5 or custom DPI buttons being bindable (and I think X-Mouse-type rebinders can only see mouse 4/5, not the DPI buttons around the wheel).
@somejerk: Note that I'm talking about a card that changes boost bins (dynamic overclocking with overvolting) based on both maximum power draw and thermal limits; that's how the card exposes overclocking, via targets for power and thermals, plus an optional extra voltage level above that which unlocks a top bin. So it isn't a driver issue that a card asked to render 1000 fps actually does what it is asked rather than having a limiter in place. More importantly, these modern designs, with very rapid monitoring of the power and thermal situation, are not capable of being broken by what you describe (both nVidia and AMD called things like FurMark 'power viruses' when they came to prominence and pushed parts of GPUs in ways they had not expected, forcing this more advanced regulating mechanism onto GPUs).
The current nVidia drivers expose an adaptive v-sync option for people who prefer the responsiveness of a tearing scene (where they get partial frames): a v-sync-off option that still caps out at the refresh rate of the connected monitor. In this mode the thermal regulation can cut in and idle the card once it has saturated the monitor connection, rather than redrawing extra frames that can't be used once you hit the refresh rate of the screen. Obviously having v-sync on (as I prefer) means this protection is on all the time. None of this changes the issue of requesting a surface (not the final buffer exported to the screen) and running an endless loop requesting it be refreshed; it is totally normal to update such a buffer beyond the rate at which frames are being exported to the screen.
Edit: as most GPUs come with very nice warranties, and people can easily switch brand if their card caught fire when they weren't the ones overclocking and messing about to get round the power/thermal regulation, it wouldn't be in nVidia's interest to try and sabotage their customers' cards. 'Making more money on customers that get hit by this' is pure chemtrails talk.
It seems to be limited to the software (so any Logitech peripheral that asks you to install the Logitech Gaming Software), as I've tried it without the peripheral connected and it was the same: hardware acceleration kills the low-power mode on the desktop just as surely as running a game does. Of course, with component reuse, other Logitech software may also use this method of accelerating the UI and have exactly the same issue, but I only have the software that provides drivers/options for my peripherals.
This seems to be a not-uncommon strategy for smaller/indie multiplayer games. Also see titles like Frozen Synapse, Achron, or The Ship (two free copies with that one) for this '2 for 1' way to push numbers up.
'cos I also was like, "Wow, someone else called in, like, the time we've been sitting here. That's awesome."
Dude, we're idiots.
I cut this edit from the Top 40 as edited by This Year, originally from The Hotspot on GameSpot.com.
I couldn't find an easy way to link into the middle of the mp3 file on This Year, so I thought I'd do an embeddable to share here. This has had me in tears on more than one occasion; I have many great memories of Ryan, but this is the one I'd like to share.