
    R9 290x $549. More than Titan. Half the price.

    #1  Edited By AlexGlass

    Anandtech review: http://www.anandtech.com/show/7457/the-radeon-r9-290x-review

    Wrapping things up, it’s looking like neither NVIDIA nor AMD are going to let today’s launch set a new status quo. NVIDIA for their part has already announced a GTX 780 Ti for next month, and while we can only speculate on performance we certainly don’t expect NVIDIA to let the 290X go unchallenged. The bigger question is whether they’re willing to compete with AMD on price.

    GTX Titan and its prosumer status aside, even with NVIDIA’s upcoming game bundle it’s very hard right now to justify GTX 780 over the cheaper 290X, except on acoustic grounds. For some buyers that will be enough, but for 9% more performance and $100 less there are certainly buyers who are going to shift their gaze over to the 290X. For those buyers NVIDIA can’t afford to be both slower and more expensive than 290X. Unless NVIDIA does something totally off the wall like discontinuing GTX 780 entirely, then they have to bring prices down in response to the launch of 290X. 290X is simply too disruptive to GTX 780, and even GTX 770 is going to feel the pinch between that and 280X. Bundles will help, but what NVIDIA really needs to compete with the Radeon 200 series is a simple price cut.

$579 with Battlefield 4 at Newegg:

    http://www.newegg.com/Product/Product.aspx?Item=N82E16814202058

    Thoughts?
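For what it's worth, the review quote's "9% more performance and $100 less" can be turned into a rough price/performance comparison. A minimal sketch, assuming the GTX 780's widely reported $649 street price at the time (the 9% figure comes from the quote; everything else is simple arithmetic, not benchmark data):

```python
# Rough price/performance arithmetic implied by the review quote above.
# Performance is normalized to GTX 780 = 1.00. The ~9% gap and the $100
# price difference come from the quote; the $649 GTX 780 price is the
# commonly reported street price at the time, so treat it as approximate.
cards = {
    "GTX 780": {"price": 649, "perf": 1.00},
    "R9 290X": {"price": 549, "perf": 1.09},
}
for name, c in cards.items():
    ratio = c["perf"] / c["price"] * 1000
    print(f"{name}: {ratio:.2f} performance points per $1000")
```

On those assumptions the 290X comes out roughly 29% ahead on performance per dollar, which is the gap the review says Nvidia can't leave standing.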

    #2  Edited By SamStrife

I think it runs super hot, and whilst it undercuts all of Nvidia's prices, it doesn't hold its value against other AMD cards (it's a lot more expensive than other AMD ones for diminishing returns in performance). Whilst I don't have a problem with AMD drivers, Nvidia is doing a ton of work with their suites, such as the GeForce Experience and that new recording tool they're launching next week.

If you want a top-end GPU, I'd honestly wait a year for the 800 series Nvidia cards to come out with their Maxwell architecture which, to me at least, looks pretty darn exciting. If you absolutely cannot wait, the R9 290x is pretty good, but it certainly has its caveats. The GTX 760/770 would do you just as well, or (if you really want an AMD card) the 7870 XT with Boost that Sapphire does is a pretty excellent card for the price that would tide you over nicely til the later models of these new batches drop.



    #5  Edited By SlashDance

    Is this a good time to upgrade, though?

    I feel like waiting until the first "real" next gen games start coming out. I could buy something that runs BF4 and Watch Dogs at 1080p/60hz with everything turned up, but then what happens when the equivalent of Gears of War comes out? (edit: I guess Crysis 1 is a better example, but you get the point)

    #6  Edited By Slaegar

    This is good news for everyone. Nvidia is going to have to stop gouging their customers if they want to compete with AMD.

But people still have their ancient fears of AMD drivers (even though one of the last Nvidia WHQL drivers was making games unplayable and bricking some cards), so they stick with team green even when it can cost $100+ for the same performance. Heck, the last two video cards that died on me were Nvidia.

    Not like it matters much for me, though, I doubt I'll ever be in a position to comfortably afford a flagship video card.

There was some FurMark result that had it at 94 Celsius, which is pretty hot, but FurMark pushes video cards past what any game ever will. I also expect non-reference cards to keep temperatures down. One might suspect AMD pushed the clock speeds of the card a little higher than they initially planned so they could fit the card where they wanted in benchmarks.

@slashdance said:

Is this a good time to upgrade, though?

    I feel like waiting until the first "real" next gen games start coming out. I could buy something that runs BF4 and Watch Dogs at 1080p/60hz with everything turned up, but then what happens when the equivalent of Gears of War comes out?

    This card is far beyond the power of the next generation of consoles even after you account for low level optimizations. They might get close, but I'm not sure if even the next Gears of War bump in prettiness will antique this card.

Huey2k2

I bought a GTX 670 about a year ago, so I figure I can afford to wait a year or so to see what the 800 series has to offer. My GTX 670 should be able to handle most of what I need until then. Or at least I hope it does, because I really need a new processor sometime in the near future.


    #8  Edited By AlexGlass

    @Slaegar Yeah, I don't see how Nvidia can afford to maintain those prices now for much longer, noise level and all. It's just too big a price gap.

    @slashdance said:

    Is this a good time to upgrade, though?

    I feel like waiting until the first "real" next gen games start coming out. I could buy something that runs BF4 and Watch Dogs at 1080p/60hz with everything turned up, but then what happens when the equivalent of Gears of War comes out? (edit: I guess Crysis 1 is a better example, but you get the point)

For me, Brigade is going to dictate if I make the jump back into PC gaming. After the recent demo, with more noise due to Fresnel effects and needing 2 Titans, I think I'll be waiting to see what Nvidia and Maxwell have to offer. Hopefully they go heavy on MIMD architecture and develop something that can really clean up that noise.

But if you're worried there's going to be any type of tech developed on the console side that might not run well on this card, I highly doubt it. I doubt anything developed this generation on the console side will be something this card's PC equivalent would not be able to run.

kolayamit

Can't wait to buy one of these.

Xeiphyer

    Looks shiny.

Not planning on upgrading until next year though. Hopefully the new console releases will cause a nice bump in quality from next year's cards. Also wondering about integration of any new DirectX versions and whathaveyous, depending on what the consoles are running and what ends up in the PC market.


    #11  Edited By Devildoll

I think the chip is pretty impressive, given the amount of stuff they managed to cram into it while only making it about 100 mm2 bigger than Tahiti.

Seems like the cooler is pretty bad though, kind of hot and loud, leaving almost no room to OC.

It'll be interesting to see how far you can push the chip with adequate cooling.

Here in Sweden, the card landed about 100 bucks cheaper than I thought it would be placed; it costs less than a 7970 did at launch, which is pretty cool. Not quite sure I want to buy one though.
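The die-size comparison above can be checked with back-of-the-envelope numbers. A quick sketch, assuming the commonly reported figures of roughly 352 mm² for Tahiti (HD 7970) and 438 mm² for Hawaii (R9 290X); treat both as approximate:

```python
# Back-of-the-envelope die-size comparison. 352 mm^2 (Tahiti / HD 7970)
# and 438 mm^2 (Hawaii / R9 290X) are the commonly reported figures,
# so treat both values as approximate.
tahiti_mm2 = 352
hawaii_mm2 = 438
delta = hawaii_mm2 - tahiti_mm2
growth = (hawaii_mm2 / tahiti_mm2 - 1) * 100
print(f"Hawaii is {delta} mm^2 larger than Tahiti ({growth:.0f}% growth)")
```

That works out to under 25% more area for a substantially beefier chip, which is the point being made.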


    #12  Edited By Andorski

Checking several game benchmarks, the 290x looks to perform between the GTX 780 and Titan at 1080p/1200p. Going up to 1440p/1600p, the 290x starts beating the Titan, and at 4K resolution the 290x destroys anything nVidia is offering. That kind of performance at $579 USD is remarkable. The only gripes I've heard about the card are that it gets damn hot (it hits 90C+ during benchmarks) and the OC ceiling is a bit low.

    I hope this pushes nVidia to drop the price of 780. With the announcement of the 780Ti, I think it would be best for nVidia to price the 780 at $550, the 780Ti at $650 (assuming that it is at or above the 290x in performance), and price whatever the hell they want with the Titan.
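The resolution tiers in those benchmarks differ more than the names suggest; raw pixel counts per frame give a sense of how much extra work each step up demands (the resolutions below are the standard figures; the performance conclusions are the thread's, not computed here):

```python
# Pixel counts per frame for the resolutions compared in the benchmarks
# above; the widening gap at high resolution tracks how much more raster
# and memory-bandwidth work each frame demands.
resolutions = {
    "1080p": (1920, 1080),
    "1200p": (1920, 1200),
    "1440p": (2560, 1440),
    "1600p": (2560, 1600),
    "4K":    (3840, 2160),
}
base = 1920 * 1080
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name}: {px:,} pixels ({px / base:.2f}x 1080p)")
```

4K is exactly four times the pixels of 1080p, which is where the 290x's wider memory bus would plausibly start to tell.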


    #13  Edited By droop

They said 95C is a normal operating temperature, and the fans can reach like 70dB (at 100%, which is a rare case).

    Will be interesting to see some non-stock coolers on the GPU.

Sinusoidal

    ... prosumer...

    I hate this word with every fiber of my being. What? You're a pro at buying shit? Whoopty-fucking-doo!

Scampbell

    @slaegar said:

    This is good news for everyone. Nvidia is going to have to stop gouging their customers if they want to compete with AMD.

But people still have their ancient fears of AMD drivers (even though one of the last Nvidia WHQL drivers was making games unplayable and bricking some cards), so they stick with team green even when it can cost $100+ for the same performance. Heck, the last two video cards that died on me were Nvidia.

    Not like it matters much for me, though, I doubt I'll ever be in a position to comfortably afford a flagship video card.

There was some FurMark result that had it at 94 Celsius, which is pretty hot, but FurMark pushes video cards past what any game ever will. I also expect non-reference cards to keep temperatures down. One might suspect AMD pushed the clock speeds of the card a little higher than they initially planned so they could fit the card where they wanted in benchmarks.

    @slashdance said:

    Is this a good time to upgrade, though?

    I feel like waiting until the first "real" next gen games start coming out. I could buy something that runs BF4 and Watch Dogs at 1080p/60hz with everything turned up, but then what happens when the equivalent of Gears of War comes out?

    This card is far beyond the power of the next generation of consoles even after you account for low level optimizations. They might get close, but I'm not sure if even the next Gears of War bump in prettiness will antique this card.

And with the help of Mantle, low-level console optimization might not mean much compared to a GCN-based GPU on PC.

fistfulofmetal

I'll likely wait until NVIDIA releases the specs on the GTX 880 next year and see where things stand.


    #17  Edited By CaLe

    PlayStation have 8gegabit DD5 and this only 4 (half) = sux. Also made from Hawaii who have no computer history. SUX. How it expects to compete with PS4 when it launches after it in USA? AMD made a big mistake on this.

AMD after this big mistake.


    #18  Edited By chiablo

    I've been an nVidia fan for years now. My last video card was an AMD 6970 and I've had nothing but problems with it.

    Short list of my grievances:

    • Locking up my DisplayPort monitor during every reboot.
    • Dual Display crashing the AMD driver if I play DotA on high settings.
    • Triple Display crashing the AMD driver constantly.
    • No DisplayPort output when in the BIOS or while doing the POST.
    • Noise on my DisplayPort at lower resolutions. I haven't a clue how this is possible because it's an all-digital interface. WTF?
    • Sounds like a jet engine.

    I'm really excited for this card though, because it will force nVidia to step up their game and make their top-end cards less expensive. :p

    Also, this is the most annoying thing ever and is a known bug with AMD cards: http://dev.dota2.com/showthread.php?t=33614

AlexW00d

    @chiablo said:

    I've been an nVidia fan for years now. My last video card was an AMD 6970 and I've had nothing but problems with it.

    Short list of my grievances:

    • Locking up my DisplayPort monitor during every reboot.
    • Dual Display crashing the AMD driver if I play DotA on high settings.
    • Triple Display crashing the AMD driver constantly.
    • No DisplayPort output when in the BIOS or while doing the POST.
    • Noise on my DisplayPort at lower resolutions. I haven't a clue how this is possible because it's an all-digital interface. WTF?
    • Sounds like a jet engine.

    I'm really excited for this card though, because it will force nVidia to step up their game and make their top-end cards less expensive. :p

Maybe you should have RMA'd it, cause that's clearly a fucking busted card you have...

I've had a 6870 for 3 years now and have had 0 problems.

Anyway, it's a shame the 290x is $550 for Americans, and fucking $700 in Europe. Yay capitalism.


    #20  Edited By monetarydread

    The card is a little expensive in Canada =)

I am going to wait until I see Nvidia's 8xx cards next year. I am more than happy with my 680 and I do not feel like I need an upgrade quite yet. I have had nothing but issues with ATI cards in the past (I used to have a 4890, so maybe things have changed). Also, I have a 3D Vision 2 monitor that I use quite regularly, and I would hate to be forced into using ATI's solution instead, because the quality of the experience is a fraction of what Nvidia offers. Plus, there is a DIY kit coming that will allow my monitor to take advantage of G-Sync.

EXTomar

Whether or not it is a good time to upgrade depends on your current card. By coincidence, both ATI and Nvidia are rotating their products around, so you might be able to find nice bargains on some high-performance cards that aren't bleeding edge.

jsnyder82

It's a great price and everything, but I think my 660 Ti will be okay until the next line of nVidia cards comes out. I've only had good experiences with nVidia, and only bad ones with AMD.

Karmum

    @cale said:

    PlayStation have 8gegabit DD5 and this only 4 (half) = sux. Also made from Hawaii who have no computer history. SUX. How it expects to compete with PS4 when it launches after it in USA? AMD made a big mistake on this.

AMD after this big mistake.

    I'm guessing you're not being serious, because the PS4 definitely doesn't have 8GB of...

    VRAM...

geirr

    It is noisy!


    #25  Edited By Devildoll

@alexw00d: Well, do you use 3 monitors and DisplayPort? Cause all his problems might have been tied to that.

And yeah, the reference cooler does make a ruckus, that's no question.

ajamafalous

    I've had too many problems with multiple ATI cards in the past to consider buying anything but Nvidia at this point.

    I'm not going to sit here and tell you not to buy an AMD card (I even put one in my brother's computer because it was way cheaper) but I, personally, am just gonna wait for Nvidia's 800 series and then see where I'm at.

chiablo

@ajamafalous said:

I've had too many problems with multiple ATI cards in the past to consider buying anything but Nvidia at this point.

    I'm not going to sit here and tell you not to buy an AMD card (I even put one in my brother's computer because it was way cheaper) but I, personally, am just gonna wait for Nvidia's 800 series and then see where I'm at.

    Dat 20nm fabrication...
