New NVIDIA cards will cost $900/$1600

rorie

https://arstechnica.com/gadgets/2022/09/nvidias-ada-lovelace-gpu-generation-1599-for-rtx-4090-899-and-up-for-4080/

After weeks of teases, Nvidia's newest computer graphics cards, the "Ada Lovelace" generation of RTX 4000 GPUs, are here. Nvidia CEO Jensen Huang debuted two new models on Tuesday: the RTX 4090, which will start at a whopping $1,599, and the RTX 4080, which will launch in two configurations.

That's the facts, jack. After years of aftermarket sticker shock on 3080s, it's "nice" to know their replacements will only cost a couple hundred dollars more than the MSRP of an original 3080. NVIDIA's slides say the 4080 will be "2-4x faster than a 3080ti," which is... a pretty big range!

Meanwhile, the RTX 4080 will follow in November in two SKUs: a 12GB GDDR6X model (192-bit bus) starting at $899, plus a 16GB GDDR6X model (256-bit bus) starting at $1,199. Based on how different the specs are between these two models, however, Nvidia appears to be launching two entirely different chipsets under the same "4080" banner; traditionally, this much Nvidia hardware differentiation has come with separate model names (i.e. last generation's 3070 and 3080). We're awaiting firmer confirmation from Nvidia on whether the 4080 models do or do not share a chipset.

That's also kind of interesting. Obviously, for most people with a gaming monitor, a 3080 will run pretty much anything at 60+ FPS even at ultrawide resolutions, so the difference between a 4070 and a 4080 isn't going to matter much. But if you want to future-proof your video card purchase, it seems like the 4080 with more memory might be worth the splurge. God only knows what resolutions we'll be playing games at in a few years.
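
To put rough numbers on the gap between those two "4080"s: memory bandwidth is roughly bus width (in bytes) times effective memory data rate. Here's a back-of-the-envelope sketch in Python; the GDDR6X data rates are my assumptions, not confirmed specs.

# Rough theoretical memory bandwidth in GB/s: bus width in bits,
# divided by 8 for bytes, times effective data rate in Gbps.
def bandwidth_gbs(bus_bits: int, gbps: float) -> float:
    return bus_bits / 8 * gbps

cards = {
    "RTX 4080 12GB": (192, 21.0),   # assumed 21 Gbps GDDR6X
    "RTX 4080 16GB": (256, 22.4),   # assumed 22.4 Gbps GDDR6X
}

for name, (bus, gbps) in cards.items():
    print(f"{name}: {bandwidth_gbs(bus, gbps):.0f} GB/s")

If those assumptions are close, that's roughly 504 GB/s vs. 717 GB/s, a ~40% bandwidth gap between two cards sharing one name.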

My fervent hope is that this generation finally makes RTX "free" from a processing perspective, i.e. you can flip it on without losing any framerate. I don't know if that's possible or if I'm just dumb but with my 3080 I always keep RTX off because it's not worth the framerate hit. Hopefully it either looks nicer or doesn't come with a performance hit on these new cards.

#2  Edited By cikame

I feel like, in a world of advanced upscaling, getting the latest card has never been less appealing, especially as prices rise.

I still have a 1080p monitor and my 2080ti hasn't struggled with anything except Flight Sim in busy airports. As you say, it's more about resolution and getting more performance at 4K or maybe even higher, but as upscaling keeps improving, even hitting those resolutions (non-natively) is becoming easier.

Edit: Upon further testing, my performance in Flight Sim is limited by my CPU, not the graphics card, so currently nothing is stressing my card.

#3 chaser324  Moderator

The two "4080" models appearing to be completely different cards is a pretty bad. I don't see any reason why they wouldn't just call the lower spec one a 4070 aside from the fact that they think they can charge a higher premium for something called a 4080.

Regardless, I just recently managed to get a 3080 at a reasonable price, so I won't be in the market for one of these for at least a year or two.

FacelessVixen

I'm perfectly fine with the 2080 Ti being the only flagship GPU I'll most likely ever own. And I don't want to buy a new power supply, so I care more about the 4070.

rorie

@facelessvixen said:

I'm perfectly fine with the 2080 Ti being the only flagship GPU I'll most likely ever own. And I don't want to buy a new power supply, so I care more about the 4070.

Yeah, the power thing might wind up making this a full rebuild. It's been a few years since I got a new processor, I guess.

#6  Edited By tartyron  Online

I was lucky and got a vanilla 3080 in 2020 for the retail price of $800. It was the most I've ever paid for a PC part, and I don't think I can justify throwing down another $900 only two years later. I'll probably wait until the 5000 series.

Besides, I would need to pretty much replace everything else (CPU, motherboard, RAM) to keep it from bottlenecking. I already get a bit of that with my current hardware, so I think I'll get all the other parts first, and even those I want to stretch out over at least one more year.

My GTX 1080 was good for four years. I think, in the end, that's the expectation I have for my GPU hardware.

#7  Edited By stealydan

GPU power for the latest games has seemingly never mattered less. There are almost zero AAA-level games that are PC exclusives these days, and almost everything on that level is using temporal reconstruction to get amazing framerates with great anti-aliasing.

Unless console games go back to squeezing as much eye candy out of the hardware as they can at 30 fps (or averaging in the 20s like the late 360 era, blech) once they finally abandon the last generation, I really don't see why more powerful GPUs are needed; the 30-series is already way more powerful than the consoles.

Maybe these cards will appeal to you if you play on a 4K/240Hz/ultrawide screen, but that's about the only thing that would justify the insane prices, and those people clearly have tons of disposable cash.

Justin258

So I bought a 3080ti earlier this year. Do I regret my purchase? No. I will admit that I spent quite a chunk of change on it, but I also probably won't have to buy another card for a long time.

That said, speaking as the owner of a 3080ti: you don't really need that kind of power these days. Like, if you have the cash, go nuts. I certainly did. But you're not getting all that much for your money if you aren't aiming for 4K 60 FPS ultra settings in every game. Go for something cheaper if you're using a 1080p or 1440p display: get a 30-series card now that they're in stock, or look into whatever AMD's offering now. You're not losing much of anything if you refuse to spend $900+ on a GPU.

Speaking of AMD, Nvidia really needs to offer a reasonable sub-$300/$400 option, or AMD is going to have a wide opening to just jump in and take that market, kind of like they did against Intel several years ago.

Y2Ken

I got a 3080 a couple of months ago as the prices dropped, and I'm feeling very happy with my purchase. I'm excited to see what these new cards can do, but the 3080 will almost certainly cover me for a good long while. I'm currently running at 1080p144 (with three monitors, but only using one to play games) but planning to upgrade my main monitor to 1440p144 (helped by the announcement that the PS5 will now support that resolution, if I get one down the line).

I'd needed a new card for a while and was starting to run into games that flat out didn't work at all, so that forced my hand somewhat, alongside increased availability and reduced prices. But hearing rumours of the increased power draw of the new cards certainly nudged me toward picking up a current-gen card rather than waiting (also, I'm sure demand will peak again with the launch of the 4000 series cards).

Sanity

Gonna wait to see what AMD has this time around. I have a 2080ti and might upgrade, but I'm not real happy with how Nvidia has been throwing its weight around these past two gens, treating reviewers like crap, plus the whole EVGA thing. If AMD puts out something at least semi-competitive with the 4080 in price and performance, I think I may give them a go for the first time since Skyrim came out, haha.

#12  Edited By AV_Gamer

Good luck to the FOMO sufferers out there. Right now, I'm happy with my 6600xt. It can play games at both 1080p and 1440p as smooth as butter, with plenty of FPS on my 144Hz monitor. Right now, these new cards are overkill. Video game development has yet to reach a point where these cards are mandatory. Like a fancy new 8K television that costs a fortune, they're completely pointless at the moment, outside of bragging rights.

yyninja

@rorie said:

My fervent hope is that this generation finally makes RTX "free" from a processing perspective, i.e. you can flip it on without losing any framerate. I don't know if that's possible or if I'm just dumb but with my 3080 I always keep RTX off because it's not worth the framerate hit. Hopefully it either looks nicer or doesn't come with a performance hit on these new cards.

When it comes to pure rasterization, AMD cards like the RX 6900 XT have the RTX 3080 beat on dollars per frame. It's RT and DLSS that really make the Nvidia cards shine. RT will never be "free"; it will always hamper performance, the same as upping the resolution. I wish I could get 1080p frame rates at native 4K, but that's not possible.
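
To put a rough number on the resolution part (a back-of-the-envelope sketch, not a benchmark): native 4K pushes four times the pixels of 1080p, so before RT even enters the picture you're asking the card for roughly 4x the shading work per frame.

# Pixel counts alone hint at why native 4K can't hit 1080p frame
# rates: the card shades roughly four times as many pixels.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K": (3840, 2160),
}

base = 1920 * 1080  # 1080p pixel count as the baseline
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x 1080p)")

Real scaling isn't perfectly linear, since some costs are per-frame rather than per-pixel, but it's the right ballpark.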

I think the newly announced cards won't matter to most gamers unless they plan on gaming on a 4K screen above 60 FPS with RT. Also, the RTX 4080 12 GB is pretty slimy marketing considering it's a cut-down die compared to the 16 GB model. That's different from the RTX 3080 10 GB and RTX 3080 12 GB, which are nearly the same die. The 12 GB model should be branded as the RTX 4070 Ti instead.

I also think the RTX 30 series is not a good buy right now. They're still ridiculously expensive, and I'm predicting there will be deals around Cyber Monday. I think people are forgetting that AMD is also launching their RDNA3 cards this year, and those might again be better cost-per-frame than Nvidia's cards if you don't care about RT. I think DLSS is less of a strong argument because FSR is catching up rapidly and has a faster adoption rate.

#14  Edited By myke_tuna

I hold onto graphics cards for a long time. I've averaged 3.5 years between upgrades, and every two upgrades it's time for a new build. That being said, games seem to be much better optimized now, and I only game at 1080p60 (sometimes an unusual 1080p75, because my monitor is 75Hz), so my 3060 Ti should last me longer than usual.

I'll probably wait for the 5000, maybe 6000 series Nvidia cards, or I'll finally swap over to AMD if they knock it out of the park in the coming years. The power draw I'm seeing for Nvidia's 4000 series does have me a little worried. I overprovisioned my PSU in my current build like I usually do, but it seems like that will become standard practice if Nvidia cards keep climbing in power requirements.

gtxforza

Oh dear, these GPUs are expensive and we'll have to deal with that, but I'm so impressed with their specs.

isomeri

My PC is now 10 years old and my GPU six years old. I've been planning a new build for the last couple of years but haven't really seen much need for it. If I suddenly stumble across a big chunk of money next year, I might consider buying some new parts, but I don't think I'm ready to spend more on a single component than a nice OLED TV plus a console is worth.

I don't have enough time to play all the new stuff on the Series X and PS5 as it is, and if something interesting comes out on PC, I might even consider trying out a streaming solution for the moment.

sombre

But for real, do you expect to be able to buy one?

Brendan

@justin258: I feel similarly. In terms of graphical performance, even looking at PS5 vs. PS4, it seems we're finally at the place where extra power produces virtually no visual difference. Combine that with the limiting factor being the massive art resources needed to make games that push the envelope further, and the fact that the most popular games don't exactly do that anyway... it seems like graphics power this intense has never been less important.

Doesn't stop me from lusting after one of these!

#19 tartyron  Online

@gtxforza: I would honestly take those specs with a grain of salt until independent outlets are able to run proper tests. Twice the CUDA cores doesn't translate into exactly twice the FPS. GamersNexus is one outlet I follow, and they're usually pretty good for making hardware purchasing decisions.
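
A hand-wavy way to see why, assuming frame time splits into a shader-bound part that scales with cores and a remainder that doesn't (real games are messier): doubling shader throughput only speeds up the shader-bound fraction, Amdahl's-law style.

# Back-of-the-envelope, not a benchmark: only the shader-bound
# fraction of frame time benefits from more CUDA cores; CPU work,
# memory bandwidth, and fixed per-frame costs don't.
def fps_multiplier(shader_fraction: float, core_ratio: float) -> float:
    return 1.0 / ((1.0 - shader_fraction) + shader_fraction / core_ratio)

for frac in (0.5, 0.7, 0.9):
    print(f"{frac:.0%} shader-bound: 2x cores -> {fps_multiplier(frac, 2.0):.2f}x fps")

So unless a game is almost entirely shader-bound, doubled cores land well short of doubled frame rates, which is exactly why independent benchmarks matter.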

monkeyking1969

To be honest, I don't need the "fastest" video card. I'm sure most of us are still running the same 1440p or 1080p monitor, playing games on "high" settings and doing fine.

NVIDIA and their investors want the best return... fine. But consumers don't have to buy these cards. Jensen Huang can do what he wants... buy more leather jackets. I think we need a return to xx90s being under $1,000. But the market will do what the market will do; I just hope people around here are more astute with their money.

#22  Edited By spacemanspiff00

We're getting killed by prices in other territories (I'm in Canada). I'm still running a 970, which I got a couple of months after release in 2014 for four-hundred-something dollars. The 3070 is double that or more, up to over a thousand bucks, looking at Memory Express, where I bought my 970. It's hard to even consider when the 80 series isn't really that much more in the grand scheme. Whenever I do decide to build a new computer, I'm thinking I'll just pony up for the best deal I can get on an 80-series card (I've seen some good sales on the Ti models too) and then go HAM on finding good deals for every other component, since I'll have to do a ground-up build again. Which is fine, since my current rig is almost 8 years old and I intend to keep my next one for just as long or longer.

#23  Edited By lapsariangiraff

Oof. The performance bumps look nice, but nothing worth upgrading from last gen for, especially at that price. I think these will be a neat option for people who are a generation or two behind, but even that feels dicey since we're going to be flooded with so many ex-crypto-mining 30-series cards at cheaper prices. And even then -- I can't think of a single game I'm excited to see run faster. Microsoft Flight Simulator? Maybe? One game isn't worth $1,600 to me.

Now the hypothetical 50 series, two years from now, when a few more benchmark games have come out and my 3090 is finally getting sweated? Maybe.

DLSS 3 looks exciting though! Continues to be the coolest thing NVIDIA is doing.

#24  Edited By bybeach

I think the new top Nvidia cards follow the old adage that once prices go up, there is strong resistance to prices coming back down. It can be nuanced, but when manufacturers and retailers find themselves making more money, they, well, don't want to quit doing that. Bitcoin has fallen and supply has improved, but the people making money don't want that to diminish, though there will be some price reduction.

The sad loss is the democracy of it: people with less money are now being forced out of PC home-building, while people of better means can still afford the cards, along with the PSUs and probably other components required to make the upgrade or rebuild worthwhile. And with TVs thrown into the mix, many will be forced into the console approach to gaming.

This may be what happens to me, or I'll be interested in the 5000s or the next series, with one eye solidly glued to price-to-performance. I have a 3080 I paid too much for, even with that Newegg raffle. It performs well, so no hurry.

#25  Edited By Humanity

With these prices, it really feels like PC gaming is slipping away from me. I used to update my gaming PC every six years or so to stay up to date in case something came along that I'd prefer to play on PC... but upgrading my computer today would cost me 4x as much as my last upgrade did. With inflation, and with companies (at least mine) being cagey about yearly raises because of everything that's going on in the world (and because they're greedy corporations), I'm just seeing PC gaming become quite literally too expensive.

Justin258

@brendan said:

@justin258: I feel similarly. In terms of graphical performance, even looking at PS5 vs. PS4, it seems we're finally at the place where extra power produces virtually no visual difference. Combine that with the limiting factor being the massive art resources needed to make games that push the envelope further, and the fact that the most popular games don't exactly do that anyway... it seems like graphics power this intense has never been less important.

Doesn't stop me from lusting after one of these!

I wouldn't say there's "virtually no difference" between PS4 and PS5 graphical performance - if you put a base PS4 up against a PS5, there's quite a difference. However, I'd personally say the biggest difference most of the time is in framerate, resolution, and effects rather than detail and texture quality. That's my personal experience with the Xbox One S versus the Xbox Series X, anyway.

cikame

Came across this amazing comparison video of the size of the 4090.

I feel like we need to take a breath and try to make these smaller; this is... stupid. Intel announcing their cheaper, smaller card recently was very timely.

#28  Edited By MisterSims

I have a 2080 Ti right now, and I'll say this... I'm glad I sat out the 30 series. The new cards seem great so far, what with DLSS 3 and not having to worry about RT. Still pretty pricey, but I remember paying $1,200 (with tax included, I think) for the 2080, so it's not something I wasn't expecting. I have a 1080p 144Hz monitor, but I usually play on my 4K TV, so I am definitely excited. I'll probably get the cheapest option, though, because I'm pretty sure my CPU is going to bottleneck, and I'll have to spend on a new power supply as well. I read the 4080 needs at least 700W. Sheesh... I'm 100W too low, SMH.

mak_wikus

@cikame: Jay (the guy from the video you posted) just said on his podcast stream that they're already testing these cards and that they absolutely did not have to be this big.

monkeyking1969

Thank god Intel is starting to make GPUs again. We need more competition. I think Samsung, Sony, Texas Instruments, and Micron Technology all need to get back into GPUs. Some of those companies, like TI and Micron, might specialize in the low end like Think Silicon does with its RISC-V chips, but I think we need someone like Samsung making consumer-grade GPUs too. If Intel, AMD, NVIDIA, and Samsung were all competing for the consumer GPU market, prices would be down to $500 for the highest of the high-end consumer cards. The prices would be $100, $200, $300, $400, and $500 (RTX 4030, RTX 4050, RTX 4060, RTX 4070, RTX 4080).

I think the USA, UK, and EU all see that they need to have at least some chip manufacturing in their own nations. Certainly, Turkey, Israel, and India all know they do not want to rely on the East or West for chips. Even a small low-end fab would be useful to most of those nations just for national security.