AlexGlass's forum posts


#1  Edited By AlexGlass

@razielcuts said:

@agnosticjesus:

That's the thing though, there is no 'team' it's just all video games. I go to where the good games are, wherever they may be. It's just funny to me to see people align themselves to certain corporate entities when really they're just doing themselves a disservice and blindly denying themselves good content because it came from a particular source.

I may have to agree with @nekroskop here. I mean, look at the original post: no opinion here, just an info dump with a video link and an animated gif. Typically threads like this would be marked as spam on these forums.

Here's an opinion. How about you guys post some pics and videos, that way when I and others visit GB there's actually some UPDATED content for people to reply to! And then I don't have to do it. Because at times, this place was pretty much a graveyard when it came to content before I started posting here.

It would be nice for me too, if when I wanted to see the latest pics or videos I could just come, scroll down, check them out and post an opinion.

I mean damn, talk about ungrateful.


#3  Edited By AlexGlass

@Slaegar Yeah, I don't see how Nvidia can afford to maintain those prices now for much longer, noise level and all. It's just too big a price gap.

@slashdance said:

Is this a good time to upgrade, though?

I feel like waiting until the first "real" next gen games start coming out. I could buy something that runs BF4 and Watch Dogs at 1080p/60hz with everything turned up, but then what happens when the equivalent of Gears of War comes out? (edit: I guess Crysis 1 is a better example, but you get the point)

For me, Brigade is going to dictate whether I make the jump back into PC gaming. After the recent demo, with more noise due to Fresnel effects and needing two Titans, I think I'll be waiting to see what Nvidia and Maxwell have to offer. Hopefully they go heavy on MIMD-style architecture and develop something that can really clean up that noise.

But if you're worried there's going to be any type of tech developed on the console side that might not run well on this card, I highly doubt it. I doubt there's going to be anything developed on the console side this generation that this card would not be able to run in its PC version.
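(An aside on where that Brigade grain comes from, since it keeps coming up in this thread: a path tracer estimates each pixel by averaging the light carried by randomly sampled paths, and the grain is just the statistical spread of that average, which only shrinks as one over the square root of the sample count. A minimal toy sketch of the effect, nothing from Brigade's actual code:)

```python
import math, random

def estimate_pixel(samples):
    """Monte Carlo estimate of one pixel: average the light carried by
    'samples' random paths (random.random() stands in for tracing a path)."""
    return sum(random.random() for _ in range(samples)) / samples

# The grain is the spread of these estimates, and it only shrinks as 1/sqrt(samples):
for n in (1, 4, 16, 64, 256):
    trials = [estimate_pixel(n) for _ in range(2000)]
    mean = sum(trials) / len(trials)
    spread = math.sqrt(sum((t - mean) ** 2 for t in trials) / len(trials))
    print(f"{n:4d} samples/pixel -> pixel-to-pixel noise ~ {spread:.3f}")
```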


#5  Edited By AlexGlass

Anandtech review: http://www.anandtech.com/show/7457/the-radeon-r9-290x-review

Wrapping things up, it’s looking like neither NVIDIA nor AMD are going to let today’s launch set a new status quo. NVIDIA for their part has already announced a GTX 780 Ti for next month, and while we can only speculate on performance we certainly don’t expect NVIDIA to let the 290X go unchallenged. The bigger question is whether they’re willing to compete with AMD on price.

GTX Titan and its prosumer status aside, even with NVIDIA’s upcoming game bundle it’s very hard right now to justify GTX 780 over the cheaper 290X, except on acoustic grounds. For some buyers that will be enough, but for 9% more performance and $100 less there are certainly buyers who are going to shift their gaze over to the 290X. For those buyers NVIDIA can’t afford to be both slower and more expensive than 290X. Unless NVIDIA does something totally off the wall like discontinuing GTX 780 entirely, then they have to bring prices down in response to the launch of 290X. 290X is simply too disruptive to GTX 780, and even GTX 770 is going to feel the pinch between that and 280X. Bundles will help, but what NVIDIA really needs to compete with the Radeon 200 series is a simple price cut.

$579 with Battlefield 4 at Newegg:

http://www.newegg.com/Product/Product.aspx?Item=N82E16814202058

Thoughts?


#6  Edited By AlexGlass

@flacracker said:

@mb This is the worst thread I have ever seen on Giant Bomb. It is literally physically hurting me to read this. Please lock.

@alexglass You can't post pictures of some fucking CGI that is at 720p and say it will look better than stuff running at 1080p. Not if it looks like a blurry piece of shit because it needs to be upscaled to fit on any sort of TV or monitor.

The hyperbole and exaggerations are amazing. A blurry piece of shit... really? Ok.

And yeah, that looks better than anything you could possibly run right now at 1080p or above. Which simply points out that overall graphics aren't as dependent on resolution as people are currently making them out to be, and that the largest improvements still come from other graphical areas. And CG isn't necessary to see this; I used it only to accentuate what should already be a pretty obvious point. The most evident increases in overall graphics come from an increase in geometry and lighting, not resolution.

AlexGlass

@andorski said:

Higher resolution or better shadowing/polygonal count/physics/etc?

Why not both?! PC master race reporting in!

Well it's funny you mention that because I think the PC gamer, probably the most concerned of all when it comes to frame rate and resolution, is actually the one getting ripped off the most.

When you consider the fact that a Titan can nearly run a path tracer in real time, and no doubt future cards will from here on out, yet the only benefit you get on PC is increased resolution and frame rate, it's actually quite the rip-off. The reason is that games are constantly being held back by consoles, and resolution and frame rate are the two areas that require almost no additional work on the part of developers. It's just an automatic increase.

But that's a total rip-off compared to what graphics could look like if, instead of just bumping low-quality graphics, textures and lighting engines to the highest resolution possible, devs actually targeted some of these cards and began developing geometry, lighting engines and animations that are up to the standard of what the hardware can perform. I have no doubt games would look more impressive on PC at 1080p, if devs actually did that, than they do now even at Ultra HD.

Which is why I believe that, thanks to devs like the Brigade devs, PC gaming is going to make a big comeback 2-3 years from now, and it might actually be worth the price of admission for more than just enthusiasts. If those guys get the support they deserve, and they have been busting their asses for 5 years now, PC gamers will no longer give as much of a shit about Ultra HD; they'll be far more interested in getting something like Brigade to run noise-free at lower resolutions, yes, even 720p. Because the overall final graphics quality, and the effects and geometry that engine will push, will be on a completely different level from current rasterized games even in Ultra HD. That is, of course, if cloud gaming doesn't get it first, but even then, not everyone will have access to it.
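(A quick back-of-the-envelope on the "automatic increase" point above; the numbers are just pixel counts for standard resolutions, not figures from any particular engine:)

```python
# Resolution bumps are "automatic": the per-frame pixel workload simply multiplies,
# with no new art, geometry or lighting work required from developers.
resolutions = {"720p": (1280, 720), "1080p": (1920, 1080), "4K / Ultra HD": (3840, 2160)}
base = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name:>13}: {pixels:>9,} pixels ({pixels / base:.2f}x the 1080p workload)")
```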

AlexGlass

@darji said:

@alexglass said:

@darji said:

@alexglass: This is again most likely CG and even more FFXV runs in 1080P not 720P

The screen I posted is 720p. And again, what makes CG, CG? My question is clear. Where does the improvement come from? Which graphical areas?

The actual scene was in 1080p, so that is not a comparison at all... And CG is CG if it's prerendered, for a cutscene for example. And the improvement comes from not needing to be rendered in real time like actual gameplay.

Lol. Ok. Way to answer.

Do you honestly not know, or is that your way of just conceding the point?

AlexGlass

@darji said:

@alexglass: This is again most likely CG and even more FFXV runs in 1080P not 720P

The screen I posted is 720p. And again, what makes CG, CG? My question is clear. Where does the improvement come from? Which graphical areas?

AlexGlass

I am not sure about everyone else's TV, but mine has a native resolution of 1080p. Anything less than that and some piece of hardware has to scale the image up to 1080p. This scaling ends up making everything look blurry, and I find that to be incredibly distracting.
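(To make the scaling point concrete, here's a toy sketch, not how any particular TV's scaler actually works: 720 to 1080 is a non-integer 1.5x factor, so most output pixels land between source pixels and have to be blended, and that blending is the softness you see:)

```python
# Upscale one 720-pixel scanline of hard black/white edges to 1080 pixels.
src = [0, 255] * 360  # hypothetical source row: alternating pure black and white

def upscale_row(row, out_width):
    scale = len(row) / out_width
    out = []
    for x in range(out_width):
        pos = x * scale                      # where this output pixel falls in the source
        i = int(pos)
        j = min(i + 1, len(row) - 1)
        frac = pos - i
        out.append(round(row[i] * (1 - frac) + row[j] * frac))  # linear blend of neighbours
    return out

print(sorted(set(src)))                      # [0, 255] -- the source is perfectly sharp
print(sorted(set(upscale_row(src, 1080))))   # intermediate greys appear where edges were blended
```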

The OP mentions engines like Brigade 3, and how that will be better. I agree that engines like that will be better in the long run, but we do not have anywhere near the hardware capacity to run a game like that right now. At the moment a GeForce Titan (a $1,000 video card) runs Brigade at sub-30 fps, at 720p, and with massive amounts of grain that make any visual enhancements gained completely worthless. Oh yeah, and those are real-time rendered videos, not interactive gameplay, so the performance is not an example of how it would really look.
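(Some rough throughput arithmetic on that grain, using illustrative numbers rather than an actual Brigade benchmark: a mostly clean, unfiltered path-traced image typically wants hundreds of samples per pixel, and even 720p at 30 fps turns that into an enormous number of paths per second:)

```python
# Paths per second demanded by different quality targets at 720p / 30 fps.
width, height, fps = 1280, 720, 30
for spp in (1, 4, 16, 256):  # samples (paths) per pixel
    paths_per_sec = width * height * fps * spp
    print(f"{spp:>3} spp -> {paths_per_sec / 1e9:5.2f} billion paths per second")
# A single card can only afford a handful of samples per pixel at that rate,
# which is why the image stays grainy.
```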


Sure, things will get better, and who knows, maybe next generation rasterized engines will start to go away. Right now, though, consoles do not have the power to run an engine like Brigade properly, so all we have to go by is current rendering pipelines, and with current engines, 1080p looks better than 720p. There is a reason so much money, and so many in-game hardware cycles, are spent on anti-aliasing: it gets rid of jaggies. And even though this is only anecdotal, 1080p with current engines looks nicer than 720p with anti-aliasing enabled. The image is clearer, sharper, and easier on the eyes.


Can you tell which one is 720p and which is 1080p? Which image looks cleaner? Which image would you rather stare at for 50 hours? I choose the image on the left every time.

It seems to me that no matter how many times I say it, people want to twist this argument into whatever they want. I will give it one last shot. I don't disagree with what you say; however, I still don't think you and many others understood the point.

I'm not sure why people insist on comparing the exact same assets at different resolutions, as if that has anything to do with anything I have been writing in this thread. Why the hell would I be making such an obvious point? Just as it is obvious, a no-brainer, absolutely non-debatable, that there are other, more important aspects than resolution that go into making your graphics look good, especially when talking about 3D graphics. In fact, they go into making your graphics exist in the first place. A higher-resolution game with a low polygon budget for its assets and poor lighting will generally look less impressive than even a lower-resolution game with a more robust polygon budget and better lighting.

Resolution is never going to help a game that has poor texturing, a sub-par polygon budget, or poor lighting look better, no matter how much you crank it up. In fact, an argument can be made that it will make it look worse.

A more generous polygon budget, a better lighting engine, even great AA will do far more for your final image quality than resolution will at this point in computer graphics. Devs don't do CG because of resolution. They do CG for these reasons: polygon budget, ray tracing, animation, physics, and AA. That's what separates real-time graphics from CG. Those are the areas that have the biggest impact on overall image quality, and those are the areas that actually need to improve in order for graphics to catch up. That's why we have LOD even in real-time games. That's why we have cutscenes. That's why we have Autovistas in games like Forza: because we can bump up the polygon count and improve the lighting, animations, physics and AA.

Where is the largest visual improvement coming from in the scene below, even at 720p, compared to current real time games?


It comes primarily from an insane polygon count, and then from the lighting and physics needed to make the water look and move realistically. When we can actually create and display this level of detail and lighting in real time, resolution will play a bigger role. But right now it's putting the cart before the horse. You cannot get there with 4K, not even remotely close. You can get there with improved lighting engines and an increased polygon budget.
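(A back-of-the-envelope on why the polygon budget dominates; the triangle counts below are made-up round numbers for illustration, not figures from any real game or film:)

```python
# Average screen coverage per triangle at different resolutions and polygon budgets.
for label, pixels in (("1080p", 1920 * 1080), ("4K", 3840 * 2160)):
    for tris in (200_000, 2_000_000, 20_000_000):
        print(f"{label:>5}, {tris:>10,} visible triangles -> ~{pixels / tris:6.1f} pixels per triangle")
# Raising the resolution only samples the same flat facets more finely;
# raising the triangle count is what adds actual shape detail (offline CG
# pushes toward roughly one triangle per pixel or smaller).
```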

So let me make my point clear: the difference in a game's graphics, or in computer graphics in general, is far greater and far more noticeable from a polygon increase and actual 3D detail in assets and characters, as well as from lighting and animation, than it is from another level of resolution. This is a proven fact. The reason CG graphics, or Pixar graphics, exist and look so much better is primarily polygon count and lighting, NOT resolution. And this is visible even on a sub-HD TV. Yet the only thing people talk about anymore is resolution, despite the fact that one can easily argue it's not even in the top three most important areas of overall graphics at this stage in the game.

As far as Brigade goes, it's not as far away as you think. Brigade is designed with cloud streaming in mind and is owned by the same company that owns the Octane cloud renderer. In fact, Brigade 3 just introduced the Octane material system. You'll probably see it on something like OnLive or similar sooner than you will on a dedicated GPU. And Brigade games, whenever the hell we get them, will run at lower resolutions compared to rasterized graphics. And even at those lower resolutions, the overall quality of the graphics will surpass that of high-res rasterized graphics, just like CG does right now.

PS: That's a great comparison for showing the difference in the number of pixels, but everyone here who has ever owned even a regular tube TV knows you never notice the type of AA artifacts present in your comparison. The scaling today isn't quite that poor.