Hey guys. Something I've noticed lately, more and more, is that using DX11 in most games means a huge performance loss for me, for what seems like very little graphical difference. I have a modest rig by today's standards: the CPU is only an i5 (really wish I'd gone with an i7 back then, but hindsight is 20/20 etc.), but I later got a GTX 580, which pretty much picks up the slack in most games, aside from the ones which are CPU intensive. Anyway, I can run pretty much any game maxed as long as I use DX9, but as soon as I switch to DX11, in most (all?) cases I take a huge performance dip, especially framerate-wise.
And here's the kicker: I honestly don't notice any difference graphically, except the worse framerate of course. Maybe it's the years of console gaming that have lowered my standards, or the fact that my monitor only does 1920x1080, I don't know, but the fact is I can't tell a difference in most cases. The latest example is Crysis 2. The game actually defaults to Ultra with DX11, but the framerate drops were really annoying me, and after playing around with the various settings and not getting any results, I tried running it in DX9 and behold, butter-smooth 60 FPS, even with all the settings on Ultra.
I've seen many people complain on several forums about DX11 in general, so what gives? Is it just poorly optimized? Does it add bells and whistles that only the real hardcore crowd notices? And what do you usually go with, if given the choice?
EDIT: Also, would getting more RAM help? It's the cheapest/fastest upgrade I can get. I don't want to get a new CPU because it means getting a new motherboard too, and I'm really happy with my GTX 580. At the moment I'm using 4GB of RAM.