Giant Bomb News

178 Comments

Rage's PC Launch Problems Attributed to Driver Issues [UPDATED]

id Software, Bethesda working with hardware creators to smooth things out.

Expect a full review of Rage from Mr. Shoemaker pretty soon.

UPDATED: ATI/AMD have issued updated drivers for Rage, which you can download here.

We've installed the updated drivers on our in-office machine, and the frame rate and texture pop-in seem much improved. There's still texture pop-in, but that's just part of the engine at this point. There does appear to be some significant tearing, however, which proved pretty distracting.

--

If there was one thing people expected from Rage, it's that the PC version would be impeccable on day one. If PC owners could count on any developer to knock it out of the park, it would be the father of the first-person shooter. Right? Right?

Unfortunately, for reasons partially out of id Software's control, Rage has had a bumpy launch since its midnight unlocking around the Internet.

The response from players has been...loud. Just take a look at the gaming section of Reddit today.

The main issue is performance, which id Software and Bethesda Softworks are currently squarely blaming on driver issues. Players are experiencing an incredible amount of texture pop-in, especially so on ATI/AMD cards. We can confirm this, having loaded Rage on our in-office PC. It's inexcusably bad right now.

The texture pop-in is not exclusive to PC; it's also very much present on Xbox 360 and PlayStation 3.

There are problems with both ATI/AMD and nVidia cards, but it's reportedly most egregious on ATI/AMD.

"While many folks are playing RAGE on PC and not running into issues," said the company on its blog, "we’re aware that some of you are experiencing issues with screen tearing and texture issues. These problems can be attributed to driver issues, and we’re currently working with Nvidia and AMD to resolve them as quickly as possible."

For the moment, it's recommended that you not update the ATI/AMD drivers and instead wait for the updated drivers to come "in a few hours," according to AMD's Catalyst developer on Twitter. Also, if you're using the Battlefield 3-specific drivers, the CTD (crash to desktop) rate is apparently 100%; rolling back from those drivers is necessary there.

nVidia owners should be using the current beta drivers, which will automatically include more Rage-specific improvements as they happen, so long as you've flipped updates on.

Expect more updates on this throughout the day.

Patrick Klepek on Google+
Edited by Declarius

Just nabbed the game now, did nothing but add a .cfg file in the Base folder, added the lines

vt_pageimagesizeuniquediffuseonly2 8192

vt_pageimagesizeuniquediffuseonly 8192

vt_pageimagesizeunique 8192

vt_pageimagesizevmtr 8192

vt_restart

to it, started up the game, and it runs flawlessly. Even when I spin around full speed I can barely catch texture pop-in; it's absolutely a non-issue, and even if I'm looking for it, it's hard to notice. Also upped the FoV to 95. Only issue so far was that the game kept opening on my second monitor instead of my primary; had to disable the secondary monitor to get it right.

To add some detail, I'm running an i7 860, 5870, and 6GB RAM, with the newest Rage performance drivers (the real ones, not the mistaken ones from earlier). Didn't force anything through CCC.
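The tweak above amounts to dropping a small text file of cvars into the game's Base folder. A minimal sketch of doing that programmatically -- the file name "rageconfig.cfg" and the helper itself are illustrative assumptions, not anything from the comment; only the cvar lines come from the post above:

```python
# Hypothetical helper: writes the cvar tweaks quoted above into a config
# file. The file name "rageconfig.cfg" is an assumption; adjust the base
# directory to point at your own install's Base folder.
from pathlib import Path

CVARS = [
    "vt_pageimagesizeuniquediffuseonly2 8192",
    "vt_pageimagesizeuniquediffuseonly 8192",
    "vt_pageimagesizeunique 8192",
    "vt_pageimagesizevmtr 8192",
    "vt_restart",
]

def write_config(base_dir: str, name: str = "rageconfig.cfg") -> Path:
    """Write one cvar per line and return the path to the new file."""
    path = Path(base_dir) / name
    path.write_text("\n".join(CVARS) + "\n")
    return path

if __name__ == "__main__":
    print(write_config("."))
```

Hand-editing the file does the same thing; the script just makes the five lines reproducible.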

Posted by mr_otas

@k4el: WRONG! Developers are often given a "dev" version of the next generation of hardware and drivers so that when they release the game, they can guarantee it will run smoothly on a future system. But when the hardware vendors fuck up their release version, the results may be a disaster, and it's hardly the game developer's fault. The game might run like a dream under the development drivers.

Posted by mekon

Why do UK users have to wait until Friday to play this game? I've already preloaded it and this just seems retarded.

Edited by k4el

@mr_otas: You are partially correct here. However, dev drivers or not, it is terrible testing methodology not to try the game from the end user's perspective.

All the info out there actually points to ID having tested correctly. I'm not posting here saying ID did a bad job, just pointing out how much of the blame lies with AMD on this. Every indication is that ID knew, and that AMD promised a fix but failed to deliver updated drivers, as its track record with such work suggested it would.

I'm sure you feel quite clever with your "WRONG!" but... I know what I'm talking about. I do it every day.

Posted by mr_otas

@SomeJerk: Carmack knows his way around DirectX, but there are a number of reasons to choose OpenGL over DX:

1. OpenGL is -- and has always been -- more powerful than DX.

2. OpenGL supports the latest and greatest technology. Some of the features in DX11 have been in OpenGL for more than three years.

3. OpenGL is supported by Windows XP, Vista, 7, Linux, PlayStation 3 and Wii. DX10/11 is supported only by Windows Vista and 7.

4. DX9 has wider support but is dated and sports a bloated API.*

5. OpenGL is an open standard, and some of us prefer an open system over a closed one.

*From what I have read, DX10 and 11 have APIs that rival OpenGL's, which is good from a developer perspective, meaning less chance of rendering artifacts.

Edited by mr_otas

@k4el: I screamed wrong because I thought you had laid all the blame on Id -- that they should have caught the error and fixed it themselves. That is why I pointed out that they usually develop with development drivers. But it seems I was a jerk and raged before I fully understood your post, and for that I am sorry.

Posted by k4el

@mr_otas: Well... honestly, there probably is at least some blame on ID's side of things. It's likely the QA department brought the issues up and ID took a risk and launched the game hoping that AMD wouldn't let them down.

It's just a guess, but in my experience it seems like a realistic scenario. ID had to have known about the issue; their response may or may not have been comprehensive or thorough.

Edited by Saltank

I can't use anything but the latest WHQL nVIDIA drivers because my ASUS GTX570 Direct CU II goes batshit crazy and everything crashes, including lots of artifacting in RAGE.

The latest nvidia beta or WHQL drivers don't fix the game for me.

Posted by bybeach

I'm pretty ignorant on how to alter driver files, etc. Wish I knew more of that sort of thing; not sure how to approach it. But I have run the game up to the beginning of the first mission. Us vault dwellers (deja vu) must be super duper killers... But except for a bit of screen tearing before the mission, this game is beautiful and responsive. I do not see the artifact/texture pop-in issue yet. If anything, I am very impressed!

Posted by MrKlorox

Everybody should have D3DOverrider (which comes with RivaTuner). Screen tearing has never once been a problem since I started using it.

Posted by Mumrik

@Lucidlife said:

It doesn't surprise me that Giantbomb is having PC gaming issues. Stick to consoles. It's what you know.

What a useless post. If it is a joke, then it's a bad one. If you're serious, then you didn't read the article.

They're extremely console-centric, but that has nothing to do with this story.

Posted by bybeach

Well, I found force V-sync and all that; I should look harder. But I do not see the need for it. And yes, I should download one of the overclocking tools one of these days, though I haven't felt the need for that either with what I've got. Hard Reset got a touch slow in one spot.

Edited by buckybit
  • There are still people around here who think it's "ok" to have to massively tinker with their own hardware and graphics card settings to make a game work? While I am glad you are so tech savvy, this is nothing you should have to do when you purchase a "Games For Windows" game (or ANY retail game). You are justifying the flaws of the publisher and game developer.
  • As was said earlier, the 'problem' lies in the software architecture approach -- it's not just an AMD/ATI driver issue. It's on all platforms. Again, look at the slides from the Siggraph 2009 talk. There is a shift in game development going on, from OOP (Object Oriented Programming) towards the data itself (Data Oriented Design -- see the DICE slides by Daniel Collin). The idea behind it is to more efficiently utilize the multi-core and many-core future of hardware -- a world we are already living in -- by running software in parallel and concurrently. id Software and its id Tech 5 engine (with this Megatexture approach) is one way towards this. The software architecture behind it allows the game to run on diverse hardware (cross-platform) but comes with a 'cost'. This has nothing to do with 'drivers'. And yet, yes, of course it has to do with drivers -- with being able to use the device drivers to "work" with this (different) approach. AMD/ATI and NVIDIA (along with Intel and others) are preaching the new parallel and data-streaming approach. Making this "work" -- being able to get the performance and framerate you desire -- is the brain-melting part. Shipping a product before making sure it works well enough to justify taking gamers' (or their parents') money is a decision the publisher makes.

I bought the game to look at how good (or bad) the tech is. I don't care about "playing the game", aside from checking the game mechanics. But seeing 'consumers' out there forcing their PC hardware to work with the game (many not really knowing what they are doing) makes me sad. This is NOT the solution.
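The OOP-to-data-oriented shift described above is often illustrated as array-of-structs versus struct-of-arrays. A minimal sketch of the contrast -- the particle example and every name in it are invented for illustration and have nothing to do with id Tech 5 itself:

```python
# Illustrative only: contrasts an object-per-entity layout (OOP-style,
# array-of-structs) with the struct-of-arrays layout favored by
# data-oriented design. The "particle" example is invented.

class Particle:  # array-of-structs: one object per entity
    def __init__(self, x: float, vx: float):
        self.x, self.vx = x, vx

def step_aos(particles: list, dt: float) -> None:
    for p in particles:  # touches many scattered objects
        p.x += p.vx * dt

class ParticlesSoA:  # struct-of-arrays: one array per field
    def __init__(self, xs, vxs):
        self.xs, self.vxs = list(xs), list(vxs)

    def step(self, dt: float) -> None:
        # one contiguous per-field loop; in C with SIMD this layout is
        # cache- and vector-friendly, which is the point of the approach
        self.xs = [x + v * dt for x, v in zip(self.xs, self.vxs)]
```

Both layouts compute the same positions; the difference is purely in memory layout and how well each maps onto parallel hardware.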

Posted by Nethlem

Blaming this on drivers that had been available for weeks/months prior to the game's launch is borderline absurd...

They can try whatever they want; the simple matter of fact is that they had plenty of time to test the game with the current non-beta drivers for Nvidia and AMD cards. The game performing as badly as it does on those clearly shows where the blame should be put.

I'm just getting tired of this bullshit. Every time some developer screws up his PC version by delivering sub-par support for one of the two manufacturers (ATI or Nvidia), because he chose to stick with one of the two's developer programs, all the "PC hardware fanboys" jump out of the bushes to tell everybody how "ATI always had bad drivers" and that's why they "always buy XYZ hardware". (Hint: both of them have managed to deliver crappy drivers and hardware at times; neither of the two is especially good/bad at it.)

That kind of behavior belongs to console platform wars, but any PC gamer worth his hardware should know better than to let a choice of brand go above more important issues like price/performance ratio and unique features.

Posted by EXCellR8

when i first started this game up it was awful, but after some tweaking and updating drivers it's pretty damn good. a few minor graphical glitches remain but for the most part the game is fixed and lookin' good!

Posted by Corvak

Pop-in is an engine problem. I've seen it on the 360 version too. Haven't noticed any of the other issues yet (my drivers were updated before I really started playing).

The real problem, I think, is that they didn't allocate enough hours to PC testing -- at least, not on a real-world compressed and compiled build of the game. A bumpy launch, but not worth going online and losing your shit over.

Also, why are we so surprised that a Bethesda published game has bugs at launch? Skyrim will be buggy too.

Posted by LimpBishop

@Corvak said:

Skyrim will be buggy too.

Posted by MeatSim

Weird times we live in where the PC version of an id game is having the most problems.

Posted by Undeadpool

@Vegsen said:

Remember when games used to work perfectly on day one?

No. But I remember a time when they got released broken and could never be subsequently fixed.

Edited by SeriouslyNow

@Nethlem said:

Blaming this on drivers that had been available for weeks/months prior to the game's launch is borderline absurd...

They can try whatever they want; the simple matter of fact is that they had plenty of time to test the game with the current non-beta drivers for Nvidia and AMD cards. The game performing as badly as it does on those clearly shows where the blame should be put.

I'm just getting tired of this bullshit. Every time some developer screws up his PC version by delivering sub-par support for one of the two manufacturers (ATI or Nvidia), because he chose to stick with one of the two's developer programs, all the "PC hardware fanboys" jump out of the bushes to tell everybody how "ATI always had bad drivers" and that's why they "always buy XYZ hardware". (Hint: both of them have managed to deliver crappy drivers and hardware at times; neither of the two is especially good/bad at it.)

That kind of behavior belongs to console platform wars, but any PC gamer worth his hardware should know better than to let a choice of brand go above more important issues like price/performance ratio and unique features.

ATI has always had buggy OpenGL drivers. Always.

ATI has lied about OpenGL performance issues since the days of the 8500:

Several hardware review sites discovered that the performance of the Radeon 8500 in some actual game tests was lower than benchmarks reflected. For example, ATI was detecting the executable "Quake3.exe" and forcing the texture filtering quality to much lower than normally produced by the card. HardOCP was the first hardware review web site to bring the issue to the community, and proved its existence by renaming all instances of "Quake" in the executable to "Quack." The result was improved image quality, but lower performance.

I want it to be clear that this is not a brand vs. brand thing. This is about a company that has repeatedly failed to deliver working, stable, high-performance OpenGL drivers and has even been caught lying with regard to those drivers too.

Posted by elpesado

I'm playing the game without any problem. Also, there is a way to force the game to use high-res textures all the time instead of using the auto-balancer.

Posted by prestonhedges

Yup. It's like a racing game when you stop and look around and realize it looks pretty bad, except all the time.

Posted by Nethlem

@SeriouslyNow: You are using 5-year-old examples to show how "broken" OpenGL support is? You do know that those cards worked on a totally different architecture and DX APIs back then?

Examples like that happen on both sides of the fence. Nobody remembers Batman: AA having issues with Radeon cards because the game had been in the "Nvidia meant to be played" program, which led to the developers supporting ATI cards worse at Nvidia's instruction. Nvidia also tried to sweep that one under the rug.

Every single year both of these companies make headlines with driver issues. Nvidia ain't holy either; have people already forgotten the driver update for BC2 that killed Nvidia cards because it broke the fan control? Nah, let's just ignore that... or let's just ignore the whole line of 9XXX cards that were basically rebranded G80 chips with smaller memory bandwidth. That's ripping off customers big style, and something Nvidia is famous for: rebranding old hardware and selling it as new. Both companies do this, but Nvidia basically invented it and took it to whole new levels in the OEM market.

Sorry, but anybody who goes "ATI always had bad drivers/Nvidia always had loud, power-hungry cards" without taking a look at the individual products at this point in time is just a blind fanboy who's too stuck with his choice of brand. Both of these companies offer competitive cards at fair prices; both have their issues with different engines and filters. This isn't exclusive to one of them; they are both similarly "evil".

My basic point still stands: ID had access to all the current signed drivers. In this article they say themselves that they knew the game would have issues with these drivers on release. Yet they chose to leave it like that and shift the work to Nvidia/ATI. (Remember: issues exist on both companies' cards with this game!)

Edited by SeriouslyNow

@Nethlem said:

@SeriouslyNow: You are using 5-year-old examples to show how "broken" OpenGL support is? You do know that those cards worked on a totally different architecture and DX APIs back then?

Examples like that happen on both sides of the fence. Nobody remembers Batman: AA having issues with Radeon cards because the game had been in the "Nvidia meant to be played" program, which led to the developers supporting ATI cards worse at Nvidia's instruction. Nvidia also tried to sweep that one under the rug.

Every single year both of these companies make headlines with driver issues. Nvidia ain't holy either; have people already forgotten the driver update for BC2 that killed Nvidia cards because it broke the fan control? Nah, let's just ignore that... or let's just ignore the whole line of 9XXX cards that were basically rebranded G80 chips with smaller memory bandwidth. That's ripping off customers big style, and something Nvidia is famous for: rebranding old hardware and selling it as new. Both companies do this, but Nvidia basically invented it and took it to whole new levels in the OEM market.

Sorry, but anybody who goes "ATI always had bad drivers/Nvidia always had loud, power-hungry cards" without taking a look at the individual products at this point in time is just a blind fanboy who's too stuck with his choice of brand. Both of these companies offer competitive cards at fair prices; both have their issues with different engines and filters. This isn't exclusive to one of them; they are both similarly "evil".

My basic point still stands: ID had access to all the current signed drivers. In this article they say themselves that they knew the game would have issues with these drivers on release. Yet they chose to leave it like that and shift the work to Nvidia/ATI. (Remember: issues exist on both companies' cards with this game!)

I'm illustrating a point: that ATI has consistently failed to deliver stable OpenGL drivers. Your arguments are just deflective and don't address that fact, but it is a fact. This has been an ongoing issue for ATI, and it crops up with every single id game since Doom III. Issues existed for both vendors' cards in this game due to the engine's constraints on some configs, because of the way the engine scales its own settings, but it was ATI who posted and then removed their supposed Rage 'hotfix' driver (because it used an older OpenGL driver). This is ATI through and through when it comes to OpenGL. This is an issue that plagues Linux users (Linux is OpenGL only) who use ATI hardware right now. Still an ATI OpenGL issue, even on a different platform altogether.

Posted by Naeberius

It's definitely a good thing that they fixed it ASAP; shows they do actually care.

Posted by pete1666

Just deleted atioglxx.dll from the Rage directory and it works flawlessly.