R9 290x $549. More than Titan. Half the price.

#1 Edited by AlexGlass (688 posts) -

Anandtech review: http://www.anandtech.com/show/7457/the-radeon-r9-290x-review

Wrapping things up, it’s looking like neither NVIDIA nor AMD are going to let today’s launch set a new status quo. NVIDIA for their part has already announced a GTX 780 Ti for next month, and while we can only speculate on performance we certainly don’t expect NVIDIA to let the 290X go unchallenged. The bigger question is whether they’re willing to compete with AMD on price.

GTX Titan and its prosumer status aside, even with NVIDIA’s upcoming game bundle it’s very hard right now to justify GTX 780 over the cheaper 290X, except on acoustic grounds. For some buyers that will be enough, but for 9% more performance and $100 less there are certainly buyers who are going to shift their gaze over to the 290X. For those buyers NVIDIA can’t afford to be both slower and more expensive than 290X. Unless NVIDIA does something totally off the wall like discontinuing GTX 780 entirely, then they have to bring prices down in response to the launch of 290X. 290X is simply too disruptive to GTX 780, and even GTX 770 is going to feel the pinch between that and 280X. Bundles will help, but what NVIDIA really needs to compete with the Radeon 200 series is a simple price cut.
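That "9% more performance and $100 less" line is easy to sanity-check. Here's a quick back-of-the-envelope sketch (the prices are the launch MSRPs for the 780 and 290X; the performance index is an arbitrary baseline, not a real benchmark number):

```python
# Quick value check: GTX 780 vs R9 290X, using the review's ~9% figure.
# "perf" is a made-up index with the 780 as the 100-point baseline.
gtx_780 = {"price": 649, "perf": 100.0}
r9_290x = {"price": 549, "perf": 109.0}   # ~9% faster, $100 cheaper

def perf_per_dollar(card):
    """Performance index points per dollar spent."""
    return card["perf"] / card["price"]

ratio = perf_per_dollar(r9_290x) / perf_per_dollar(gtx_780)
print(f"290X delivers {ratio:.0%} of the 780's perf-per-dollar")  # ~129%
```

So the 290X comes out roughly 29% ahead on perf-per-dollar, which is why the review calls it disruptive.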

$579 with Battlefield 4 at Newegg:

http://www.newegg.com/Product/Product.aspx?Item=N82E16814202058

Thoughts?

#2 Edited by SamStrife (1282 posts) -

I think it runs super hot, and whilst it undercuts all of Nvidia's prices, it doesn't hold its value against other AMD cards (it's a lot more expensive than other AMD ones for diminishing returns in performance). Whilst I don't have a problem with AMD drivers, Nvidia is doing a ton of work with their suites, such as GeForce Experience and that new recording tool they're launching next week.

If you want a top-end GPU, I'd honestly wait a year for the 800 series Nvidia cards to come out with their Maxwell architecture, which, to me at least, looks pretty darn exciting. If you absolutely cannot wait, the R9 290X is pretty good, but it certainly has its caveats. The GTX 760/770 would do you just as well, or (if you really want an AMD card) the 7870 XT with boost that Sapphire does is a pretty excellent card for the price that would tide you over nicely till the later models of these new batches drop.

#3 Posted by PandaBear (1344 posts) -
#4 Posted by AlexGlass (688 posts) -
#5 Edited by SlashDance (1812 posts) -

Is this a good time to upgrade, though?

I feel like waiting until the first "real" next gen games start coming out. I could buy something that runs BF4 and Watch Dogs at 1080p/60hz with everything turned up, but then what happens when the equivalent of Gears of War comes out? (edit: I guess Crysis 1 is a better example, but you get the point)

#6 Edited by Slaegar (697 posts) -

This is good news for everyone. Nvidia is going to have to stop gouging their customers if they want to compete with AMD.

But people still have their ancient fears of AMD drivers (even though one of the last Nvidia WHQL drivers was making games unplayable and bricking some cards), so they like to stick with team green even when it can cost $100+ for the same performance. Heck, the last two video cards that died on me were Nvidia.

Not like it matters much for me, though, I doubt I'll ever be in a position to comfortably afford a flagship video card.

There was some FurMark result that had it at 94°C, which is pretty hot, but FurMark pushes video cards past what any game ever will. I also expect non-reference cards to keep temperatures down. One might suspect AMD pushed the clock speeds of the card a little higher than they initially planned so they could fit the card where they wanted in benchmarks.

@slashdance said:

Is this a good time to upgrade, though?

I feel like waiting until the first "real" next gen games start coming out. I could buy something that runs BF4 and Watch Dogs at 1080p/60hz with everything turned up, but then what happens when the equivalent of Gears of War comes out?

This card is far beyond the power of the next generation of consoles, even after you account for low-level optimizations. They might get close, but I'm not sure if even the next Gears of War-style bump in prettiness will antique this card.

#7 Posted by Huey2k2 (488 posts) -

I bought a GTX 670 about a year ago, so I figure I can afford to wait a year or so to see what the 800 series has to offer. My GTX 670 should be able to handle most of what I need until then. Or at least I hope it does, because I really need a new processor sometime in the near future.

#8 Edited by AlexGlass (688 posts) -

@Slaegar Yeah, I don't see how Nvidia can afford to maintain those prices now for much longer, noise level and all. It's just too big a price gap.

@slashdance said:

Is this a good time to upgrade, though?

I feel like waiting until the first "real" next gen games start coming out. I could buy something that runs BF4 and Watch Dogs at 1080p/60hz with everything turned up, but then what happens when the equivalent of Gears of War comes out? (edit: I guess Crysis 1 is a better example, but you get the point)

For me, Brigade is going to dictate if I make the jump back into PC gaming. After the recent demo, with more noise due to fresnel effects and needing two Titans, I think I'll be waiting to see what Nvidia and Maxwell have to offer. Hopefully they go heavy on MIMD architecture and develop something that can really clean up that noise.

But if you're worried there's going to be some type of tech developed on the console side that won't run well on this card, I highly doubt it. I doubt there's going to be anything developed this generation on the console side that this card wouldn't be able to run in its PC equivalent.

#9 Posted by kolayamit (3 posts) -

Can't wait to buy one of these...

#10 Posted by Xeiphyer (5601 posts) -

Looks shiny.

Not planning on upgrading until next year though. Hopefully the new console releases will cause a nice bump in quality from next year's cards. Also wondering about integration of any new DirectX's and whathaveyous depending on what the consoles are running and what ends up in the PC market.

#11 Edited by Devildoll (877 posts) -

I think the chip is pretty impressive, given the amount of stuff they managed to cram into it while only making it about 100 mm² bigger than Tahiti.

Seems like the cooler is pretty bad though: kind of hot and loud, leaving almost no room to overclock.

It'll be interesting to see how far you can push the chip with adequate cooling.

Here in Sweden, the card landed about 100 bucks cheaper than I thought it would; it costs less than a 7970 did at launch, which is pretty cool.
Not quite sure I want to buy one though.

#12 Edited by Andorski (5239 posts) -

Checking several game benchmarks, the 290X looks to perform between the GTX 780 and Titan at 1080p/1200p. Going up to 1440p/1600p, the 290X starts beating the Titan, and at 4K resolution it destroys anything nVidia is offering. That kind of performance at $579 USD is remarkable. The only gripes I've heard about the card are that it gets damn hot (it hits 90°C+ during benchmarks) and the OC ceiling is a bit low.

I hope this pushes nVidia to drop the price of the 780. With the announcement of the 780 Ti, I think it would be best for nVidia to price the 780 at $550, the 780 Ti at $650 (assuming that it is at or above the 290X in performance), and price the Titan at whatever the hell they want.

#13 Edited by Droop (1845 posts) -

They said 95°C is a normal operating temperature, and the fans can reach like 70 dB (at 100%, which is a rare case).

Will be interesting to see some non-stock coolers on the GPU.

#14 Posted by Sinusoidal (1379 posts) -

... prosumer...

I hate this word with every fiber of my being. What? You're a pro at buying shit? Whoopty-fucking-doo!

#15 Posted by Scampbell (496 posts) -

@slaegar said:

This is good news for everyone. Nvidia is going to have to stop gouging their customers if they want to compete with AMD.

But people still have their ancient fears of AMD drivers (even though one of the last Nvidia WHQL drivers was making games unplayable and bricking some cards), so they like to stick with team green even when it can cost $100+ for the same performance. Heck, the last two video cards that died on me were Nvidia.

Not like it matters much for me, though, I doubt I'll ever be in a position to comfortably afford a flagship video card.

There was some FurMark result that had it at 94°C, which is pretty hot, but FurMark pushes video cards past what any game ever will. I also expect non-reference cards to keep temperatures down. One might suspect AMD pushed the clock speeds of the card a little higher than they initially planned so they could fit the card where they wanted in benchmarks.

@slashdance said:

Is this a good time to upgrade, though?

I feel like waiting until the first "real" next gen games start coming out. I could buy something that runs BF4 and Watch Dogs at 1080p/60hz with everything turned up, but then what happens when the equivalent of Gears of War comes out?

This card is far beyond the power of the next generation of consoles, even after you account for low-level optimizations. They might get close, but I'm not sure if even the next Gears of War-style bump in prettiness will antique this card.

And with the help of Mantle, low-level console optimization might not mean much compared to a GCN-based GPU on PC.

#16 Posted by Fistfulofmetal (680 posts) -

I'll likely wait until NVIDIA releases the specs on the GTX 880 next year and see where we are.

#17 Edited by CaLe (3944 posts) -

PlayStation have 8gegabit DD5 and this only 4 (half) = sux. Also made from Hawaii who have no computer history. SUX. How it expects to compete with PS4 when it launches after it in USA? AMD made a big mistake on this.

AMD after this big mistake.

#18 Edited by chiablo (918 posts) -

I've been an nVidia fan for years now. My last video card was an AMD 6970 and I've had nothing but problems with it.

Short list of my grievances:

  • Locking up my DisplayPort monitor during every reboot.
  • Dual Display crashing the AMD driver if I play DotA on high settings.
  • Triple Display crashing the AMD driver constantly.
  • No DisplayPort output when in the BIOS or while doing the POST.
  • Noise on my DisplayPort at lower resolutions. I haven't a clue how this is possible because it's an all-digital interface. WTF?
  • Sounds like a jet engine.

I'm really excited for this card though, because it will force nVidia to step up their game and make their top-end cards less expensive. :p

Also, this is the most annoying thing ever and is a known bug with AMD cards: http://dev.dota2.com/showthread.php?t=33614

#19 Posted by AlexW00d (6227 posts) -

@chiablo said:

I've been an nVidia fan for years now. My last video card was an AMD 6970 and I've had nothing but problems with it.

Short list of my grievances:

  • Locking up my DisplayPort monitor during every reboot.
  • Dual Display crashing the AMD driver if I play DotA on high settings.
  • Triple Display crashing the AMD driver constantly.
  • No DisplayPort output when in the BIOS or while doing the POST.
  • Noise on my DisplayPort at lower resolutions. I haven't a clue how this is possible because it's an all-digital interface. WTF?
  • Sounds like a jet engine.

I'm really excited for this card though, because it will force nVidia to step up their game and make their top-end cards less expensive. :p

Maybe you should have RMA'd it, cause that's clearly a fucking busted card you have...

I have had a 6870 for 3 years now and had 0 problems.

Anyway, it's a shame the 290X is $550 for Americans, and fucking $700 in Europe. Yay capitalism.

#20 Edited by MonetaryDread (2007 posts) -

The card is a little expensive in Canada =)

I am going to wait until I see Nvidia's 8xx cards next year. I am more than happy with my 680 and I do not feel like I need an upgrade quite yet. I have had nothing but issues with ATI cards in the past (I used to have a 4890, so maybe things have changed). Also, I have a 3D Vision 2 monitor that I use quite regularly, and I would hate to be forced into using ATI's solution instead, because the quality of the experience is a fraction of what Nvidia offers. Plus, there is a DIY kit coming that will allow my monitor to take advantage of G-Sync.

#21 Posted by EXTomar (4629 posts) -

Whether or not it is a good time to upgrade depends on your current card. By coincidence both ATI and Nvidia are rotating their products around so you might be able to find nice bargains on some high performance cards that aren't bleeding edge.

#22 Posted by jsnyder82 (730 posts) -

It's a great price and everything, but I think my 660 Ti will be okay until the next line of nVidia cards comes out. I've only had good experiences with nVidia, and only bad ones with AMD.

#23 Posted by Karmum (11517 posts) -

@cale said:

PlayStation have 8gegabit DD5 and this only 4 (half) = sux. Also made from Hawaii who have no computer history. SUX. How it expects to compete with PS4 when it launches after it in USA? AMD made a big mistake on this.

AMD after this big mistake.

I'm guessing you're not being serious, because the PS4 definitely doesn't have 8GB of...

VRAM...

#24 Posted by geirr (2508 posts) -

It is noisy!

#25 Edited by Devildoll (877 posts) -

@alexw00d: Well, do you use 3 monitors and DisplayPort?

Cause all his problems might have been tied to that.

And yeah, the reference cooler does make a ruckus, that's no question.

#26 Posted by ajamafalous (11938 posts) -

I've had too many problems with multiple ATI cards in the past to consider buying anything but Nvidia at this point.

I'm not going to sit here and tell you not to buy an AMD card (I even put one in my brother's computer because it was way cheaper) but I, personally, am just gonna wait for Nvidia's 800 series and then see where I'm at.

#27 Posted by chiablo (918 posts) -

@ajamafalous said:

I've had too many problems with multiple ATI cards in the past to consider buying anything but Nvidia at this point.

I'm not going to sit here and tell you not to buy an AMD card (I even put one in my brother's computer because it was way cheaper) but I, personally, am just gonna wait for Nvidia's 800 series and then see where I'm at.

Dat 20nm fabrication...
